Last post I wrote about a simple substitution cipher Robert J. Sawyer used in his 2012 science fiction political thriller, Triggers. This post I’m writing about a completely different cool thing from a different book by Sawyer, The Terminal Experiment. Published in 1995, it’s one of his earlier novels. It won both a Nebula and a Hugo.
I described the story when I posted about Sawyer, and I’ll let that suffice. As with the previous post, this post isn’t about the plot or theme of the novel. It’s about a single thing mentioned in the book — something that made me think, “Oh! That would be fun to try!”
It’s about a very simple simulation of evolution using random mutations and a “most fit” filter to select a desired final result.
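The book only describes the idea, not an implementation, but the classic form of this kind of simulation is Richard Dawkins's "weasel" program. Here's a minimal sketch in Python; the target phrase, mutation rate, and population size are my own illustrative choices, not anything from the novel.

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"  # arbitrary target phrase
ALPHABET = string.ascii_uppercase + " "

def fitness(candidate: str) -> int:
    """Count how many characters already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    """Copy the parent, randomly replacing each character with probability `rate`."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in parent
    )

def evolve(population_size: int = 100, seed: int = 0) -> int:
    """Evolve a random string toward TARGET; return the generation count."""
    random.seed(seed)
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        generation += 1
        # Breed mutated copies, then apply the "most fit" filter.
        # Including the parent guarantees fitness never decreases.
        children = [mutate(parent) for _ in range(population_size)]
        parent = max(children + [parent], key=fitness)
    return generation

if __name__ == "__main__":
    print(f"Reached target in {evolve()} generations")
```

Random mutation plus cumulative selection converges on the target in a few dozen to a few hundred generations, whereas drawing whole random strings and hoping for an exact match would take longer than the age of the universe. That contrast is the whole point of the exercise.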
Back in 2014 I decided that a blog that almost no one reads wasn’t good enough, so I created a blog that no one reads, my computer programming blog, The Hard-Core Coder. (I was afraid the term “hard-core” would attract all sorts of the wrong attention, but apparently those fears were for naught. No one has ever even noticed, let alone commented. Yay?)
In the seven years since, I’ve published only 83 posts, so the lack of traffic or followers isn’t too surprising. (Lately I’ve been trying to devote more time to it.) There’s also the fact that the subject matter is usually fairly arcane.
But not always. For instance, today’s post about Unicode.
Analog computer: AKAT-1 (1959)
Last September I posted the Pancomputation trilogy (parts: I, II & III), which was a follow-up to last spring’s Digital Dualism trilogy (parts: 1, 2 & 3). The first trilogy was a continuation of an exploration of computer modeling I started in 2019. Suffice it to say, over the course of writing these posts, my views on what “computing” means evolved and crystallized.
As discussed in the Pancomputation posts, the notion of computation is difficult to pin down (many general concepts are, because we don’t have even more general concepts to define them with). A pancomputation view sees everything as computing. A computer science view restrictively equates it with a Turing Machine.
I’ve realized my view depends heavily on computational dualism.
Oh, look! Dancing Pixies!
In the last two posts I’ve explored some ideas about what a computer is. More properly, what a computation is, since a computer is just something that does a computation. I’ve differentiated computation from calculation and, more importantly, evaluation. (This post assumes you’ve read Part I and Part II.)
I’ve also looked at pancomputationalism (the idea that everything computes). The post hoc approach of mapping random physical states to a computation seems especially empty. The idea of treating the physical dynamics of a system as a computation has more interesting and viable features.
That’s where I’ll pick things up.
Last time I began exploring what we mean by the terms “computer” or “computation.” Upon examination, these turn out to be not entirely obvious, so some resort to the edge cases: Computers are Turing Machines; or Everything is a computer.
Even then the situation remains stubbornly not obvious. Turing Machines are abstractions quite different from what we typically call computers. Saying everything computes creates such a broad umbrella that it renders the notion of computation nearly useless.
This series explores the territory between those edge cases.
Earlier this year I wrote a trilogy of posts exploring digital dualism — the notion that a (conventional) computer has a physical layer that implements a causally distinct abstract layer. In writing those posts I found my definition of computation shifting slightly to embrace the notion of that dualism.
The phrase “a (conventional) computer” needs unpacking. What is a computer, and what makes one conventional? Computer science offers a mathematical view. Philosophy, as it often does, spirals in on the topic and offers a variety of pancomputation views.
In this series I’ll explore some of those views.
This is the third post of a series exploring the duality I perceive in digital computation systems. In the first post I introduced the “mind stacks” — two parallel hierarchies of levels, one leading up to the human brain and mind, the other leading up to a digital computer and a putative computation of mind.
In the second post I began to explore in detail the level of the second stack, labeled Computer, in terms of the causal gap between the physical hardware and the abstract software. This gap, or dualism, is in sharp contrast to other physical systems that can, under a broad definition of “computation,” be said to compute something.
In this post I’ll continue, and hopefully finish, that exploration.
In the previous post I introduced the “mind stacks” — two essentially parallel hierarchies of organization (or maybe “zoom level” is a more apt term) — and the premise of a causal disconnect in the block labeled Computer. In this post I’ll pick up where I left off and discuss that disconnect in detail.
A key point involves what we mean by digital computation — as opposed to more informal, or even speculative, notions sometimes used to expand the meaning of computation. The question is whether digital computing is significantly different from these.
The goal of these posts is to demonstrate that it is.
The Age of Fire is a key milestone for a would-be technological civilization. Fire is a dividing line, a technology that vastly amplified our effectiveness. Fire provides heat, light, cooking, defense, fire-hardened wood and clay, and eventually metallurgy.
The Age of the Electron is another key technological milestone. Electricity provides heat and light without fire’s dangers and difficulties, drives motors, and enables long-distance communication. It leads to an incredible array of technologies.
The Age of the Algorithm is just as much of a game-changer.
Resistance is Futile!
You will be assimilated!
Because why not? At some point one gets exhausted avoiding the Kool-Aid. (Which, for some probably neurologically depressing reason, I always type as “Kook-Aid” — or maybe it’s just a Freudian negligee. I mean slip. Underwear of some kind anyway.)
It’s a matter of not fighting an unwinnable battle. I used to use screen captures to recreate my various exquisitely customized toolbars after app updates. Exhausting. Finally, I just gave up and used the defaults.
The Kook-Aid in this case is the Microsoft Edge browser.