We started with the idea of code — data consisting of instructions in a special language. Code can express an algorithm, a process consisting of instruction steps. That implies an engine that understands the code language and executes the steps in the code.
Last time we started with Turing Machines, the abstract computers that describe algorithms, and ended with the concrete idea of modern digital computers: stored-program machines built on the Von Neumann architecture.
Today we look into that architecture a bit…
The ship sailed when I was moved to rant about cable news, but I originally had some idea that Sideband #32 should be another rumination on bits and binary (like Sidebands #25 and #28). After all, 32-bit systems are the common currency these days, and 32 bits jumps you from the toy computer world to the real computer world. Unicode, for example, although it is not technically a “32-bit standard,” fits most naturally in a 32-bit architecture.
When you go from 16-bit systems to 32-bit systems, your counting ability leaps from 64 K (65,536 to be precise) to 4 gig (full precision version: 4,294,967,296). This is what makes 16-bit systems “toys” (although some are plenty sophisticated). Numbers no bigger than 65 thousand (half that if you want plus and minus numbers) just don’t cut very far.
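Those counting ranges are easy to verify for yourself; here's a minimal sketch (the "half that" signed range comes from giving up one bit for the sign):

```python
# Counting range for n bits: 2**n unsigned values.
# A signed representation sacrifices one bit for the sign,
# so the largest positive value is 2**(n-1) - 1.

for bits in (16, 32):
    unsigned = 2 ** bits
    signed_max = 2 ** (bits - 1) - 1
    print(f"{bits}-bit: {unsigned:,} values, signed max {signed_max:,}")

# 16-bit: 65,536 values, signed max 32,767
# 32-bit: 4,294,967,296 values, signed max 2,147,483,647
```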
I’d planned to do this later, probably for Sideband #64, but in honor of my parents’ 64th wedding anniversary (2 parents, 64 years, okay!) this numerical rumination gets queue-bumped to now.
Just recently I wrote about 64-bit numbers and how 64 bits allows you to count to the (small, compared to where we’re going) number:
2^64 = 18,446,744,073,709,551,616
That’s 18 exabytes (or 18 giga-gigabytes). Just to put it into perspective, if we were counting seconds, it amounts to 584,942,417,355 years; more than 500 billion years! (That’s the American, short-scale billion.)
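If you want to check that seconds-to-years figure yourself, here's a quick sketch (assuming a plain 365-day year, which is evidently what the figure above uses):

```python
# How long would it take to count to 2**64 at one number per second?
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # plain 365-day year (an assumption)

total_seconds = 2 ** 64
years = total_seconds // SECONDS_PER_YEAR
print(f"{total_seconds:,} seconds is about {years:,} years")

# 18,446,744,073,709,551,616 seconds is about 584,942,417,355 years
```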