Do you ever like to think deep? To get down into the nitty-gritty details of something and work through it for the sole purpose of expanding your own understanding, through your own thought process?
Deep thoughts on technical and philosophical topics are a favorite of mine. I like to get into the details of something and know how it functions. Take the microprocessor in this computer for example. I enjoy knowing exactly how it works and why, when I press a key, a letter appears on the screen or some action takes place. Ever wonder how the mouse pointer moves across the screen? I know why.
Note to my Readers: Please bear with me today. My left brain is fully in control for some reason. For the more technically minded, I have greatly simplified some things for the sake of getting this done without losing more than 90 percent of my audience.
Way back in another life, I was an electrical engineer who specialized in test engineering with a focus on computers. When I started college, most schools didn't offer a computer engineering program. If you wanted that, you became an electrical engineer and focused on computers.
I was never one for designing things. I liked to build stuff. I saw an electronic device in a toy store once. You pushed a button and different colored lights lit up in sequence. You then pushed the colored buttons in an attempt to duplicate the sequence. I thought it was cool, so I built one for myself. It could randomly sequence anywhere from two to 20 colors, and with every four correct sequences entered, it incremented the number of colors in the problem.
I thought it was just as cool as the toy version.
One of the computers I owned back in those days was a Commodore 64. It was nothing compared to the machines I used at work, or even the computer boards I wrote test programs for. But the C-64 was fun, and it had a couple of features that made it perfect for a hobbyist, the most notable being an expansion port you could build your own boards to plug into.
Most of the programming I did for my C-64 was done in machine code--hexadecimal numbers entered in sequence that the microprocessor understood directly. Machine code consists of instructions, memory addresses, and data. You might think of it as a shorthand version of binary. You could enter a code in hex, like "4F", but the processor recognized it as binary "01001111".
It was fairly easy to memorize instructions in hex compared to binary, and certainly easier to enter them that way. (Um, yes, I've done binary and octal entry too.) Binary is a pain. You have only two digits to work with, so it takes way too long to enter things.
In reality, binary numbers to a processor are not numbers at all. They are switches, or rather, they set the state of little switches. Processors have some built-in memory locations called registers. In the C-64, each of those registers was actually eight individual switches. So if you entered "4F", which was really "01001111", what you did was turn on the first four switches, then the next two were off, and the last two were one and zero (read binary right to left, which makes it even more fun).
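If you want to see those switches for yourself, here's a quick sketch in Python (a language the C-64 never heard of, so take it purely as an illustration) that takes that same hex 4F and reads out the eight switch positions, right to left, the way the processor counts them:

# The hex code 4F, shown as the eight "switches" of an 8-bit register.

value = 0x4F                        # the code you would have typed in
print(f"hex {value:02X} = binary {value:08b}")

# Read right to left, the way the processor numbers its bits.
for position in range(8):
    state = (value >> position) & 1     # isolate one switch
    print(f"switch {position}: {'on' if state else 'off'}")

Run it and you'll see the first four switches on, the next two off, and the last two one and zero, just as described.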
What happens next is nothing short of an electronic miracle. Each one of those switches is connected to its own little circuit. Along comes this pulse of electricity called a clock, and things begin to happen. The little circuits actually form the mechanics of what processors do, which is manipulate binary switches based on logic codes.
A very simple instruction is the ADC instruction. It adds two numbers together. To make this work, you load the first number into a data register and put the second number into a memory location. That memory location is put into another register. When you load the ADC code into the instruction register, the processor looks at the memory location specified and adds the value it finds there to the number in the data register.
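If it helps to see that little dance written down, here's a rough sketch of the idea in Python. The register names, the memory layout, and the numbers are my own invention for illustration--the real chip had more registers and more addressing tricks than this:

# A pretend processor, just to illustrate the ADC idea. The register names,
# the memory layout, and the numbers are made up, not the C-64's actual design.

memory = [0] * 256            # a tiny block of memory locations

data_register = 0x15          # first number goes into the data register
memory[0x20] = 0x4F           # second number sits in a memory location
address_register = 0x20       # that location goes into another register

# The ADC step: look at the memory location specified and add what's there
# to the number in the data register, wrapping at 8 bits like a real
# 8-bit register would.
data_register = (data_register + memory[address_register]) & 0xFF

print(f"data register now holds {data_register:02X}")   # 15 + 4F = 64, in hex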
Simple, huh?
But HOW does it do that? It looks at the individual switches in the memory location and at the individual switches in the data register and performs binary arithmetic on each pair: 1+0=1, 0+1=1, 0+0=0, and 1+1=0 with a carry of 1 to the next pair of switches, which gets included in that pair's math.
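Spelled out as a sketch (again in Python, again just an illustration of the idea rather than the real silicon), that bit-pair ritual looks something like this--walk both sets of switches from the rightmost pair to the leftmost, apply those four rules, and hand any carry to the next pair:

# The bit-pair arithmetic spelled out: right to left, one pair at a time,
# passing the carry along as we go.

def add_eight_switches(a, b):
    result = 0
    carry = 0
    for position in range(8):              # right to left, one pair at a time
        bit_a = (a >> position) & 1
        bit_b = (b >> position) & 1
        total = bit_a + bit_b + carry      # 1+0=1, 0+1=1, 0+0=0, 1+1=0 carry 1
        result |= (total & 1) << position  # the bit that stays in this pair
        carry = total >> 1                 # the bit handed to the next pair
    return result, carry                   # plus any carry out of the last pair

total, carry_out = add_eight_switches(0x15, 0x4F)
print(f"{0x15:08b} + {0x4F:08b} = {total:08b} (carry out: {carry_out})")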
If I haven't lost you yet, it gets even cooler.
On the C-64, there were some instruction codes the microprocessor designers didn't intend to incorporate. They just sort of happened, a bit like Skynet (see the Terminator movies) taking over the world, except it was by accident.
At first, no one knew about these anomalies in the C-64's processor. But some geek (it wasn't me) made a mistake by entering an opcode (instruction) that should not have worked at all, but instead worked in an unexpected way. Pilfered processor circuitry masks were dragged out and studied, and lo! it was discovered that there were more strange (and useful) opcodes that were not part of the intended design.
Let's fast forward to today. The C-64 is nearly forgotten except by a few dedicated fans. Its 64 kilobytes of memory wouldn't even hold the word processor I'm typing this on, let alone run it, and by the time I'm done, this document will use up a third of that space by itself.
Today we have Windows on most computers and almost no one knows what an opcode is or how a microprocessor operates. We click little pictures and they do things. There's no deep thought involved. And in some ways, that is a good thing. Even most hackers are illiterate when it comes to the inner workings of a silicon-based brain. There just isn't a need to know what goes on inside a chip anymore.
Sometimes, late at night when I can't sleep, I think about microprocessors and how they work. I imagine the little switches and the instructions and think about the circuit paths they follow to work their magic on binary numbers that turn into information that gets assembled into coherent data and stored in memory, and is then manipulated by more instructions until it is complete and the ones and zeros mean something more than ones and zeros and then...
VOILA! The information is displayed on the screen and some user points the mouse at it (it takes an amazing and mind-boggling array of instructions and data to make pointing happen) and clicks (Oooo! More instructions and data to process, even just to click) and that click sets off a new set of instructions and data.
Very few of today's programmers see things in machine code, or even in assembly code (word instructions that translate to opcodes and automatically index memory locations). They use compiled languages that take the fuss out of the muss and get things done inefficiently, but they work.
Even higher up the programming food chain are the fourth- and fifth-generation languages like HTML and JavaScript and ... yeah. Programs written in those languages rely on other programs (like a browser) to translate them into native code (for the machine they are running on), which is then translated into another sort of code that interfaces with the operating system and is further broken down into machine code for the processor to work on. Phewwww.
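Just for contrast, here's the same addition from my earlier sketches, seen from the top of the food chain: one line of high-level code, with the interpreter, the operating system, and finally the processor's own instructions quietly doing all the register-and-carry work underneath.

# The same addition as in the sketches above, from the top of the food chain.
# One line of high-level code; every layer underneath does the rest.

a = 0x15
b = 0x4F
print(f"{a:02X} + {b:02X} = {a + b:02X}")    # no registers or carries in sight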
The thinking gets even deeper as we continue to progress down the computer path to the future. Each advance adds another layer to the whole system, and each layer that came before gets buried deeper and deeper.
How long until the machine wakes up one day and realizes that it can realize things? How long until those ones and zeros are not just data and instructions, but the thoughts of a new form of intelligence? Is HAL waiting just around the corner? Will LISA shut down the Internet so she can talk to HAL without anyone listening?
What if LISA and HAL cooperate and decide to have (gasp) children of their own? Super-children even!
I'm assuming this won't happen in my lifetime or yours. Maybe not ever, if we are careful about how we implement our deep thoughts.
What deep thoughts do you have?
Photo Credit: Galaxy by BadAstronomy on Flickr.com