Bad ROM Call

[Photo of Bentley: “Did someone say walkies?”]

I’m spending the weekend dog-sitting my pal, Bentley (who seems to have fully recovered from eating a cotton towel!), while her mom follows strict Minnesota tradition by “going up north for the weekend.” So I have a nice furry end to the two-week posting marathon. Time for lots of walkies!

As a posted footnote to that marathon, this post contains various odds and ends left over from the assembly. Extra bits of this and that. And I finally found a place to tell you about a metaphor I stumbled over long ago and which I’ve found quite illustrative and fun. (It’s in my metaphor toolkit along with “Doing a Boston” and “Star Trekking It”.)

It involves the idea of making a bad ROM call…

Which, like most metaphors (at least mine), takes some explaining.[1]

But it shouldn’t be too bad; we’re all computer geeks these days, anyway, right?

BIOS ROM

So, firstly, ROM stands for Read-Only Memory. It’s a general class of computer memory (those good old ones and noughts) that, as the name says, can only be read.[2]

Secondly, the reason for read-only memory is to store things you never (or very rarely) change. Permanent things. Generally that means code, since data tends to change frequently. In particular, low-level system code.

Because, thirdly, the standalone term, “ROM,” implies a certain kind of read-only memory: the kind that’s embedded in a chip that has the right bit patterns “burned” into it. Other kinds of read-only memory have other names (e.g. CD-ROM).

Fourthly, in many smaller systems, the entire O/S, or major parts of it, are burned into ROM and installed permanently on the motherboard. The ROM contains all the useful low-level things the system can do.

So, fifthly, a ROM call is when software (any software) wants to use an O/S service. It “calls” the function stored in ROM.
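If you want that in code, here’s a minimal C sketch (every name in it is invented; no real BIOS looks quite like this): the O/S routines live in a read-only table, and software calls through it.

    #include <stdio.h>

    /* A stand-in for ROM: a read-only table of O/S service routines,
     * fixed when the chip is "burned" (here, at compile time). */
    typedef void (*rom_routine)(void);

    static void rom_print_star(void) { putchar('*'); }
    static void rom_ring_bell(void)  { putchar('\a'); }

    static const rom_routine rom_table[] = { rom_print_star, rom_ring_bell };

    int main(void) {
        rom_table[0]();  /* a ROM call: invoke the service burned into the chip */
        return 0;
    }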

Finally, a “bad ROM call” is when you mean to do something not typical for you, but it has so much in common with some unrelated, very typical task that you end up running the typical one instead.

My canonical example involves driving to work.

§

You do that every work day, and that pattern is “burned” into your brain — metaphorically, into your ROM.

Now imagine you’re going somewhere, a ballgame for instance, and much of the drive to the ballpark follows the same route you take to work. As you’re driving with your friends, all talking about how amazingly well the Twins are doing right now (so amazing), you suddenly realize you unconsciously drove right past the exit to the ballpark because your “inner driver” was using the ROM call for driving to work.

Of course it was. In fact, your thinking about the route to the ballpark probably involved the realization that it was the same as your drive to work. You may have visualized it as, “Go as if to work, but take this earlier exit instead.”

And in your distraction, you forgot the mental flag you’d set to interrupt the “drive to work” task for the early exit.

That’s a bad ROM call.
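Or, continuing the invented C sketch from above, the bug is calling the habitual routine when you meant to call the unusual one:

    /* A hypothetical driving "ROM" in the same style as before. */
    typedef void (*rom_routine)(void);

    static void drive_to_work(void)     { /* the burned-in daily commute */ }
    static void drive_to_ballpark(void) { /* same route, earlier exit */ }

    static const rom_routine drive_rom[] = { drive_to_work, drive_to_ballpark };

    int main(void) {
        /* You meant drive_rom[1], the ballpark... */
        drive_rom[0]();  /* ...but habit supplied the usual index: a bad ROM call */
        return 0;
    }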

§ §

Something similar annoys the crap out of me when I write.

Many years as a programmer mean I’ve typed the word “date” a lot. Most programming languages have some sort of date type, and working with dates is extremely common in programming.[3][4]

I’ve also used the word “data” a lot, in documentation, of course, but I use it rather frequently as a local variable name for a general clump of data.

Well, you can see the problem, right? D… A… T… [AE]

And it’s like plugging in a USB plug. Somehow it’s always the wrong way around.[5]

I get similar yips with “out” and “our” as well as “serve” and “server” — that last one drives me crazy because I rarely need the “serve” call so I always get it wrong. I type “server” every time. I’ve also noticed I often type “and” when I meant “an”.

Getting “its” and “it’s” wrong is also a bad ROM call. Likewise “they’re” and “their” and “there” — assuming you do know the difference.[6]

§ §

The irony of the whole metaphor, especially in the context of computationalism, is that computers don’t make bad ROM calls.

Computers never accidentally do anything. They don’t call a familiar ROM routine instead of a less familiar one because they’re used to the first one.

They are never absent-minded or distracted or accidentally forgetful. They don’t have the capacity to do one thing when they meant to do another.[7]

Programmers, on the other hand, do make bad ROM calls.

Programmers, because of those bad ROM calls, also make mistakes, which we fondly call “bugs” and which we believe will always haunt us. Which raises a whole other set of questions for AI and computationalism.

(There is an almost Gödelian absoluteness to the idea that software errors can never be fully eliminated. Even text resists proofreading!)

As I’ve said before, perhaps our high general intelligence (and consciousness) is in a sweet spot. Maybe being messy and forgetful and slow is necessary.

It’s been said that a genuine photographic memory is a nightmare. And I think there was a House, M.D. episode about a woman with (disease-caused) perfect recall. It was devastating to her life!

§ § §

Speaking of the stuff we’re speaking of, I noticed something striking a few years ago. I’d be interested if anyone has experienced anything similar.

It requires that I’ve been asleep, or at least had my eyes closed, for a while. Long enough for my visual system to completely shut down.

It also requires that I regain sufficient awareness before I open my eyes. For instance, if I wake up in the morning and lie there thinking before opening my eyes to face the day.[8]

And finally it seems to require a fair amount of brightness in the room. I can’t say I’ve noticed this in dimly lit situations.

What I notice is that — for the briefest of brief instants — my visual field is divided into quite prominent little squares, as if in a grid. The illusion is gone the moment I notice it, but I have noticed it time and again.

Basically, I notice it every time the conditions are as described above.

I have no idea what it means! I mention it here (1) to document it and (2) because it fits in with “weird things about how our brains work.”

Little squares. A complete grid. It’s really weird. (I like to think I’ve detected the machinery behind the virtual reality we live in. I’ve seen the Matrix, my friends!)

§ § §

I really wanted to call one of these posts Intentional States, but I couldn’t find a good hook for it. It’s kind of a trite title, anyway.

And yet, according to TMSE (The Mighty Search Engine), there is apparently no movie named Intentional States. How is that possible?

How has no one ever named their movie Intentional States?

I hereby copyright the name for my upcoming movie! Intentional States, by Wyrd Smythe! Appearing in the fantasy theatre of my mind this summer 2019!!

§ § §

And lastly, this: Physicalism doesn’t require computationalism.

One can believe the mind arises naturally from the operation of the brain without believing it’s a computation. (As laser light arises from a lasing material.)

Even a Tegmarkian could believe that consciousness, although a purely mathematical object, is something akin to a complex waveform rather than a computation. (Because computation is a remarkable and extraordinary property, invented by intelligent entities, which doesn’t obviously appear in nature.)

Perhaps our minds are just incredibly complex standing waves formed inside our skulls by the electrical operation of our brain. As with the complex waves that occur in a lake.

Stay complex and standing, my friends!


[1] Pro Tip: Don’t go to metaphor until you reach metathree.

[2] Well, obviously it was written once to be there at all. But that’s considered special and private, like birth. It’s sometimes done in dimly lit rooms.

[3] Dates are very probably the biggest pain-in-the-ass of any basic computer topic. All those formats and time zones… Just getting the USA right is tricky. Handling dates internationally is a nightmare.
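A tiny C illustration of the ambiguity (the formats are just examples): the very same moment reads differently depending on the convention.

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        char buf[64];

        /* Same instant, three renderings. Is 03/04/2019 March 4th
         * (USA style) or April 3rd (most everywhere else)? */
        strftime(buf, sizeof buf, "%m/%d/%Y", localtime(&now));
        puts(buf);  /* USA convention */
        strftime(buf, sizeof buf, "%d/%m/%Y", localtime(&now));
        puts(buf);  /* much of the rest of the world */
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&now));
        puts(buf);  /* ISO-ish, with the time zone made explicit */
        return 0;
    }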

[4] The whole Y2K thing was about dates, of course. (Well, it was really about short-sighted programmers.)

[5] No, of course not. You just remember it frustrating you more than you remember getting it right.

[6] Grammar rules: They’re there for their reasons. Knowing your shit from you’re shit. It’s the principle of its principal.

[7] They never ever walk into a room and then forget why they went there. On the other hand, they’re incredibly stupid in their own way.

[8] I keep my mattress on the floor so I metaphorically and literally have to pick myself up off the floor every morning. It’s how I know I’m still alive and kicking.


7 responses to “Bad ROM Call”

  • SelfAwarePatterns

    Can’t say I’ve ever experienced that grid effect. Weird.

    It does remind me of an accident I had as a boy, where I slid into a concrete wall, forehead first. For a second or two after the hit, my vision seemed distorted. It looked like the out of sync rolling effect you used to see on old CRT TVs. It only lasted a second or two, and I’ve never seen it again. (I ended up with a mild concussion. Knowing what I know now, I do wonder what brain lesions that might have left.)

    I definitely have driven to the wrong place before, either because I was in a conversation or just deep in thought. We appear to have introspective access to our planning cognition, but not necessarily to instinctive or habitual movement decisions (unless we’re currently overriding habit and instinct by planning every move as we make it, which often causes us to choke).

    It seems like the higher you go in the brain, the more plastic it is. The brainstem is relatively ROM-like, although even there classical conditioning can take place. The basal ganglia seem to control learned habitual decisions. So they’re more modifiable, but only with repetition. And the cortex seems far more pliable. (Which might be due to the hippocampus rehearsing one-off memories until they’re embedded. Henry Molaison, after his hippocampi were removed, retained much of his biographical memory, but lost the ability to form new ones.)

    • Wyrd Smythe

      “It looked like the out of sync rolling effect you used to see on old CRT TVs.”

      I’ve never experienced that. Wow, and ouch!

      “We appear to have introspective access to our planning cognition,”

      That’s clearly a high-level function, but then we have abilities (like driving to work) that we seem to turn over to lower level functions. (It’s what gave me the bad ROM call idea — somehow our “execution thread” ends up in a service routine we didn’t intend.)

      I’m right now trying to reprogram such a trained low-level routine. The desk chair I bought for my computer table has nice padded arms, but they don’t let me comfortably reach down to the sliding table intended for the keyboard (so your arms are more towards your lap, which is more comfortable usually). I finally realized I had to put the keyboard and mouse on the desktop (the actual physical one 🙂 ), so now I’m trying to unlearn the reflex action of sliding the keyboard table out into position as I sit down. I’ve almost got it trained out.

      “It seems like the higher you go in the brain, the more plastic it is.”

      Very much so. One reason, especially in my old age, I like new things is that it keeps my brain engaged in being plastic! (Hence wandering off to explore rotation and tesseracts and such.)

  • Philosopher Eric

    Alright Wyrd, though I’ve been focusing on Mike for a while I’m certainly quite interested in picking things up with you again. I’m going to bounce some ideas off you as time permits. I dearly hope that we can reach some effective common agreements on these matters. Surely your readers would appreciate an involved example of effective discourse if we can manage.

    I’d like to begin with a procedural matter however. Obviously the brain, through a system of input, processing, and output mechanisms, helps control the body in various ways. I currently tend to say that there is “computation” associated with this, since neurons are known to function in ways that are commonly interpreted as “and”, “or” and “not” gates. But this doesn’t mean that I consider it possible to computationally simulate brain function. It seems to me that there is far more to brain function than “just computation”. For example earlier with Mike I concluded that all input and output function of any computer must inherently occur non-computationally. Thus surely the brain doesn’t always function by means of computation alone — it should perpetually receive input and provide output during its functional life. Conversely simulations of computation do not cause lasers to light up or pain to be felt. Clearly even the idiot computers that we build do some inputting and outputting and so aren’t entirely computational. Thus their function cannot be simulation in these regards.

    What term do you currently consider most useful to reference brain function — “machine”, “computer”, or some other term? To me all that matters is that you grasp my semantics. Unfortunately syntax shall be my medium, but let’s try to work that out as well.

    • Wyrd Smythe

      “What term do you currently consider most useful to reference brain function — ‘machine’, ‘computer’, or some other term?”

      The term “brain function” itself. Everything else is a metaphor.

      Brains are, IMO, the most unique and interesting thing the universe has produced. There really is nothing to compare to them. For one thing, again IMO, brains invented computation — something I consider entirely and only intelligence-created.

      Therefore, in my view, no aspect of nature “computes.”

      Machines, likewise, are the inventions of intelligence. Calling biological systems “machines” is a fine metaphor and commonly used. I’m fine with it so long as it’s clear it’s a metaphor.

      And, really, likewise “computation” if that’s important. (But I think the conflation with actual CS-style computation is misleading.)

      Metaphors are a fine way to express an idea. At some point the discussion has to get behind the metaphors and be about specifics.

      One place to start, given what you’ve written so far, is to provide your definition of “computing” (such that the brain does it but IO doesn’t). With some possible caveats, I basically agree, but I’d like to know your precise definition.

      What constitutes a “computation”?

      [You might want to back up a post. Mike and I have an interesting discussion in Laser Light Shining Bright. This post was meant more for some fun. The discussion you’re starting isn’t really on point here. (Unless it ties to that last ¶ about Tegmark.)]

      • Philosopher Eric

        Sounds good Wyrd. Fun as it might be, no I wasn’t planning to rip into Tegmark. I actually chose this one since I planned to answer why it is that we make bad ROM calls. But then I plan to address a hell of a lot more than that as well. The laser post works fine so I’ll make a fresh start there.

        I’m good with “brain function”. Of course I’ll be beginning with very primitive creatures so I suppose that “proto brain function” is more appropriate. Actually “central organism processor” probably works even better, since here there’s no mention of “brain” at all. Neither quite roll off the tongue, but that’s fine.

        Since I thus won’t be addressing “computation” in our next discussion, there shouldn’t be much practical reason for me to define the term. But since you’ve asked, why not? You might find this relevant, or perhaps see some room for improvement.

        From my own non-CS view of things, sometimes there is input information that’s treated by means of an algorithmic processing, and this produces output information. Thus “computation”. Furthermore as I’m defining the term something like a key press input which adds information to such a system, or maybe a resulting output that throws a switch, will not be examples of “computation”. Exterior inputs and outputs will be crucial for effective function, like cellular networks, headphones, computer screens, and so on, though conceptually they’re not “computation” as I’m defining the term. Why? Because they exist beyond just “information”. Of course my iPhone is not a pure computer in this regard, since it harbors all sorts of non-computational components. To me this currently seems like the cleanest way to define the term however.

      • Wyrd Smythe

        “The laser post works fine so I’ll make a fresh start there.”

        Excellent, thank you. I’d like to reserve this one for more lighthearted fare. Weekend stuff.

        That said, your definition of computation qualifies as an aside.

        “From my own non-CS view of things, sometimes there is input information that’s treated by means of an algorithmic processing, and this produces output information.”

        Can I ask about a precise definition of “algorithm”? In particular, do you mean in the sense of a concrete apparent algorithm in the mechanism, or the more abstract sense that ‘there is some algorithm’ that produces the outputs given the inputs?

        For example, when we found the Antikythera mechanism, its algorithmic function was apparent, obvious, and very concrete.

        OTOH, if we consider how DNA works, it’s a lot less obvious what or where the algorithm is.

        (FWIW, in my view, nature doesn’t do algorithms; they, like mathematics, are the product of intelligence. So if your view has algorithms in nature, I need to understand exactly how you see that.)

        “Furthermore as I’m defining the term something like a key press input which adds information to such a system, or maybe a resulting output that throws a switch, will not be examples of ‘computation’.”

        I’m on board with that. There are some interesting caveats when you get deep into the weeds, but I’ll leave that for the main discussion.

        I do absolutely agree inputs and outputs are not computation, of themselves. They are exactly what their name says they are. This is because computation is directly related to mathematics, which also has inputs (variables) and outputs (answers). Those are very similar.

        Computation without I/O isn’t very useful. Very much related to an infinite loop that does nothing.

        “Because [inputs and outputs] exist beyond just ‘information’.”

        I need clarification on that, because — as far as computation itself is concerned — I/O is just information. But certainly, as far as we’re concerned, I/O without physicality of some kind is just as useless as computation with no I/O in the first place.

        If you mean the physicality of I/O gives computation meaning, I agree.

        “Of course my iPhone is not a pure computer in this regard,”

        Nothing (well, very little, anyway) would be a “pure computer” in this regard! 🙂
