Bye-Bye BOOL

Somewhere around 1990 I started designing a programming language I named BOOL (Beginner’s Object-Oriented Language). It was always a personal “ship in a bottle” project — something more for aesthetic expression than utility. Like that guy constantly working on an old car in his garage, I’ve dabbled with it ever since.

I’ve decided to, at long last, take BOOL off life support and let it die (another dead dream). But enough of dreams. I’m tired of the weight of dreams; time to shed a pointless burden. I’ve carried it for 30 years, and I think it’s time to chalk this one up to experience.

So this is a eulogy and a memorial.

[The wake lasts all week. Attendance is not required (or even particularly suggested, but fellow mourners are welcome (bring beer)). I almost posted these to my programming blog, but decided I wanted to commemorate such a notable piece of my life here.]

BOOL was never meant to be fast, elegant, graceful, expressive, powerful, or even particularly useful. It was meant to be strange. It was meant to make people wonder “Who wrote this? And why?”

It owes a debt both to crazy “because I can” languages like Brainfuck and Malbolge and to really cool languages like Forth, Smalltalk, and Lisp. It’s basically a compost heap of favorite bits and pieces from other languages, along with some strange contraptions from my own fevered imagination.

Back in 2013 I created a blog, Book of BOOL, as a better way to create documentation. Over the years I’d gone from Windows Write and Paint to MS Word and MS Visio, but never liked being trapped on vendor-specific (and expensive) platforms. (I did generate PDF files of the docs.) The blog seemed an ideal way to have a living document.

But WordPress upped the ad bloat on blogs without the premium plan, and my posting rate had dwindled (three posts in 2018, three in 2016). My mind seems to have finally tired of BOOL, so I think I’ll take down the blog soon (probably when the domain name comes up for renewal).

If I’m honest, it’s in large part because my design goals turned out to be just too much of a pain in the ass to implement. It was doable, but… complex and messy. Since one of the design goals was a simple code base, the conflict between what I wanted and what was required to accomplish it was just too great.


It’s certainly not the first time a project turned out to be untenable (or just too much of a pain to be worth the bother (having learned what there was to learn during design)).

Long ago, when I was enamored with relay-based telephone systems, I designed and began building a relay-based eight-line PBX. I had to buy boxes of 12-volt relays (still got a bunch). The individual subsystems all worked great, but my power supply wasn’t up to driving the whole system.

Ultimately it got thrown in a box, and I abandoned the project. I was already way over budget (my projects tend to expand) and didn’t really want to get into making a super beefy power supply.

Packrat found part of his old PBX attempt in a box in the garage.

I’d learned a lot and demonstrated the pieces all worked, so the design was good. Actually building it was really just work at that point. (Besides, what was I gonna do with an eight-line PBX?)

The flip side is walking away from such a large investment of self. I spent uncounted hours fiddling with BOOL over the decades. Not completely getting across the finish line is a little disappointing.

(Sadly, one of my deeper personal flaws is being a dreamer and thinker rather than a doer and go-getter. I’m one of those idea types who needed a motivated partner. Unfortunately, I never found anyone who could put up with me for any length of time.)

§ §

Anyway, about BOOL…

The main thing, as its name implies, is that it was an object-oriented language, and a primary design goal was that everything — and I mean everything (to absurd lengths) — is an object. Even program statements are objects in BOOL.

For non-programmers, object-oriented is an approach to programming that is mainly in contrast to the procedural approach (although there are many ways to approach a programming language). The programmer always writes essentially the same code to solve a given task, but how that code is structured varies depending on the approach.

The way I think of it is that object-oriented code places nouns at the highest level whereas procedural code places the verbs at the highest level. (For those with more experience, the former elevates data, the latter elevates subroutines.)
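To make the noun/verb distinction concrete, here’s a toy sketch in Python (not BOOL; the bank-account example and all its names are invented for illustration):

```python
# Procedural: the verb (deposit) sits at the top level; data is handed to it.
def deposit(balances, name, amount):
    balances[name] = balances.get(name, 0) + amount

balances = {}
deposit(balances, "alice", 100)

# Object-oriented: the noun (Account) sits at the top level; the verb
# lives inside it as a method acting on the instance's own data.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

acct = Account()
acct.deposit(100)
```

Same work either way; what differs is which thing organizes the code.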

In BOOL, that noun view extends to the code itself. An IF statement (“if X do Y”) in BOOL is an object, which is taking object-orientation to the extreme. This mainly is a borrowing from Lisp.


A major goal was that statements such as IF or WHILE are not only objects, but constructable objects within the language. Out of the box, the core of BOOL offers mostly just potential and some very basic language constructs.

The idea is that around this core sits a required set of objects, each of which could be constructed using the language core. Most implementations would actually implement these objects natively for speed, but they must act as if they were built using BOOL itself.

These required objects would be the language constructs such as IF and WHILE statements.

The underlying goal was that BOOL should be as minimal as possible at its core, with a wrapper of native functionality that could be expressed in BOOL. (One interesting property of a language is the extent to which it can implement itself.)

Core BOOL has only one logic construct (called a gated list) that syntactically looks like this:

=? expression
.   statement1
.   statement2

The equals sign signifies a list. The question mark and expression indicate a gated list. (Without them, it’s just a list.) If expression evaluates to true (or non-zero), then BOOL performs the “dotted” statements.
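A rough model of those semantics, sketched in Python (this is just an illustration of the behavior, not how BOOL actually represented anything):

```python
# A gated list pairs a gate expression with a list of "dotted" statements.
# If the gate evaluates to true (or non-zero), the statements run in order;
# otherwise the whole list is skipped.
def run_gated_list(gate, statements):
    if gate():
        for statement in statements:
            statement()

log = []
run_gated_list(lambda: 1, [lambda: log.append("statement1"),
                           lambda: log.append("statement2")])
run_gated_list(lambda: 0, [lambda: log.append("never runs")])
# log == ["statement1", "statement2"]
```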

This is the only truly native construct. The reference implementation builds on this with various required constructs that, as mentioned, can either be natively implemented or not (the latter being inefficient and slow).

One of the required constructs is the @if action, which implements a more reasonable IF construct. It includes not only an @else clause, but also a repeatable @elseif clause, which allows compound statements.

Designing a language capable of doing that turned out to be one of the bigger hurdles. Providing the ability to implement the loop construct, such as @while, was trivial. But @if-@elseif-@else, with multiple @elseif clauses, was a real challenge.
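One way such a chain can be expressed in terms of simpler pieces (my illustrative sketch in Python, not BOOL’s actual mechanism): try each gate in order, run the body of the first one that fires, and fall through to the else body if none do.

```python
# An @if-@elseif-@else chain modeled as a list of (gate, body) pairs plus
# an optional else body. The first true gate wins and suppresses the rest.
def run_if_chain(clauses, else_body=None):
    for gate, body in clauses:
        if gate():
            body()
            return
    if else_body is not None:
        else_body()

result = []
run_if_chain([(lambda: False, lambda: result.append("if")),
              (lambda: True,  lambda: result.append("elseif"))],
             else_body=lambda: result.append("else"))
# result == ["elseif"]
```

The tricky part, as the text says, is getting a language to let you build that chain out of its own primitives rather than writing it in the implementation language.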


Returning to the programming approaches, there are many terms used for the nouns (data) and verbs (code) a programmer creates.

A common term for data in object-oriented programming (OOP) is class — as in all data of a given type are instances of the same class. (In non-OOP languages data is just called data.)

In BOOL, a class is called a Model. All data are instances of some Model. The Model handles all operations on data of its type. Client code can never directly access data. (A key feature of OOP is data hiding and encapsulation.)

There is even more variation in what languages call code objects: sub-routines, procedures, functions, GOSUBs, or calls, to name a handful. In object-oriented programming, verbs are secondary and the common term is method — a class has methods that act on the instance data.

In BOOL, code objects are called Actions.

BOOL supports both generic standalone Actions (functions) and Actions that are part of a Model (methods).
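In mainstream terms, the split looks something like this Python sketch (all names invented for illustration):

```python
# A Model analog: a class that owns its data and every operation on it.
# Client code never touches the fields directly; it goes through methods.
class Point:
    def __init__(self, x, y):
        self._x, self._y = x, y

    def moved(self, dx, dy):          # an Action belonging to the Model
        return Point(self._x + dx, self._y + dy)

    def coords(self):
        return (self._x, self._y)

# A standalone Action analog: a generic function tied to no Model.
def describe(point):
    x, y = point.coords()
    return f"({x}, {y})"

p = Point(1, 2).moved(3, 4)
# describe(p) == "(4, 6)"
```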


The last thing I’ll mention for now is that BOOL is addicted to punctuation symbols (borrowed from Perl and APL).

For example, the @ (at-sign) means an Action, and an * (asterisk) means a Model. There are a number of others. A doubled punctuation symbol means a definition (or declaration), whereas a single one means a reference to the defined thing.

For example, an *int object is an instance of a Model object named **int, and an @if object (an Actor object) invokes an Action named @@if. (Note that, in writing about BOOL, it’s rare to use the double version unless required to disambiguate meaning. For instance, normally the integer data Model is referred to as just *int.)
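The convention is mechanical enough that a few lines of Python can classify a token by its sigil (a toy illustration; the tokens are the ones from the examples above):

```python
# Doubled sigil = definition; single sigil = reference.
# @ marks Actions, * marks Models. Order matters: check the
# doubled forms before the single ones.
def classify(token):
    sigils = (("@@", "Action definition"), ("**", "Model definition"),
              ("@", "Action reference"), ("*", "Model reference"))
    for sigil, kind in sigils:
        if token.startswith(sigil):
            return kind
    return "unknown"

# classify("**int") == "Model definition"
# classify("*int")  == "Model reference"
# classify("@@if")  == "Action definition"
# classify("@if")   == "Action reference"
```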

Which I think brings me to the point where I can show you the canonical “Hello, World!” program in BOOL. In its simplest form it looks like this:

@@main
.   print: "Hello, World!";

The first line defines an Action, named @main (which takes no parameters and returns nothing). This action has a single statement that consists of a print: message (the colon means message) and its target, the string “Hello, World!”

This statement sends the print: message (which is an object) to the string object, which sees it as a request to print itself.
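That send-a-message-to-an-object flow can be sketched like so (Python; a model of the idea, not BOOL’s machinery — the class and method names are invented):

```python
# Messages are objects too. Sending one to a target asks the target to
# handle it however it sees fit; a string handles "print" by printing itself.
class Message:
    def __init__(self, name, parameter=None):
        self.name = name
        self.parameter = parameter

class StringObject:
    def __init__(self, value):
        self.value = value

    def receive(self, message):
        if message.name == "print":
            print(self.value)
            return self.value
        raise ValueError(f"unhandled message: {message.name}")

hello = StringObject("Hello, World!")
hello.receive(Message("print"))     # prints: Hello, World!
```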

This is one of the smallest possible viable BOOL programs. There must be an Action named @main for BOOL to run. It’s what starts things off. (Blame C for that, I think. Maybe it was one of its predecessors.)

§ §

As I said, this is a wake, and I have a week-long eulogy planned (and written, so there’s no avoiding it).

I realize none of you knew the deceased and have no reason for the slightest interest. (No need to send flowers.) I just don’t care to let it go without marking its passing, so it’s kind of a big deal to me.

Stay boolean, my friends!

About Wyrd Smythe

The canonical fool on the hill watching the sunset and the rotation of the planet and thinking what he imagines are large thoughts.

14 responses to “Bye-Bye BOOL”

  • SelfAwarePatterns

    Good way to describe OOP and procedural languages: nouns vs verbs.

    Can’t say I ever had the urge to create a programming language, although I wouldn’t have minded giving feedback to the designers of some of them.

    BOOL looks like it would have been an interesting language. Would it have been compiled or interpreted? Or do you plan to cover that later?

    • Wyrd Smythe

      Thanks! (Another metaphor! 🙂 ) It turned out to be contentious to some. I’ve had people push back surprisingly hard at the breakdown. As my dad (and many of his generation) used to say, “It just goes ta show ya.”

      There’s a funny parallel: When C++ and String Theory came out, I was a huge fan of both. Very much a convert to OOP, and ST sounded like it answered so many questions. The difference is I’m still sold on OOP. 😀

      Even so, I confess to considerable initial derision that Java required a class for the main method. It’s not like Java can claim to be 100% object-oriented. (I didn’t like Java very much when it first came out, but I grew to love it as a mature language.)

      I will get into the BOOL run-time environment, but for now: BOOL compiles to a run-time binary object — a large tree structure where every node of the tree is a program object. There’s a Run-Time Engine that executes the tree objects.

      (Pascal and Java do something similar, but they compile to PCODE and bytecode objects that are much more linear than a BOOL object tree.)

      For example, the statement set:x 42 compiles to a three-node tree fragment with the Message object (“set”) as the root. Message objects have a target child (in this case the “x” object) and a parameter child (which can be null; here it’s the “42” object).

      Things like Actions and Models compile to much larger tree fragments.

      As I’ll get into later, the RTE was intended to be very simple. It knows the first slot of every object identifies the object’s meta-type and links to its handler. It calls the handler and passes it the tree node, the handler does all the work. The RTE mainly knows about traversing the tree structure, because it also knows the meta-structure of the objects and can pick out their links to child objects. (Later in the series I describe this in a bit more detail.)
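A toy version of that dispatch in Python (an invented representation; the real design kept the meta-type in the object’s first slot, which here is just the tuple’s first element):

```python
# Every node's first slot names its meta-type, which selects a handler.
# The engine only knows how to look up handlers and hand them nodes;
# the handlers do the work and recurse into child nodes via run().
env = {}

def handle_literal(node):
    return node[1]

def handle_message(node):
    name, target, parameter = node[1], node[2], node[3]
    if name == "set":                 # e.g. the statement  set:x 42
        env[run(target)] = run(parameter)

HANDLERS = {"literal": handle_literal, "message": handle_message}

def run(node):
    return HANDLERS[node[0]](node)    # dispatch on the meta-type slot

# The three-node fragment for  set:x 42 :
run(("message", "set", ("literal", "x"), ("literal", 42)))
# env == {"x": 42}
```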

      • SelfAwarePatterns

It took me a while to warm to OOP. It probably didn’t help that my first sustained exposure to it was C++ and MFC. I understood the Win32 API okay, despite its many flaws, but MFC’s deep inheritance hierarchies made me wonder what had been gained. Although I know writing a large Windows app was much easier with MFC.

        Everything about Java repelled me at first. It took several versions and the switch to server side before I warmed to it. Even then, I often preferred its competitors such as C#.

        But then, I remember not being wild about structured programming when I had to learn it. I can be a stick in the mud sometimes. Although I’m not nearly as bad as some. A programmer who used to work for me said he still coded the way he had in 2004 (this was in 2018) and had no interest in learning to use low code tools.

      • Wyrd Smythe

        MFC… yeah, that was a mess. I didn’t use them; I just wrote against the Windows API. (It was still a step up from all the C and 8086 coding I was doing back then. My background includes TCP drivers and interrupt handlers, so I’m used to pretty low-level coding.)

        I came at C++ through the back door. The technical journals had articles about how C++ worked under the hood (virtual methods and such), which I found fascinating. The way the cfront preprocessor converted C++ source text to C source text kinda blew me away. It meant you could do the same things in C. You could even do them in 8086. I did both. I developed an object-oriented 8086 library. 😀

        After years of 8086, C, and then C++, I (eventually) saw Java as a “friendly” language. It was nice not having to deal with memory allocation or cleanup. (As you’ll see, needing to implement GC in BOOL was one of the smaller reasons I’ve decided not to bother. It requires a reference-counting scheme that complicates the run-time.)

        “But then, I remember not being wild about structured programming when I had to learn it.”

        Ha! That’s, perhaps, one advantage of coming at programming through a CS program. Structured approaches are taught from the beginning. Two of the first languages I was introduced to in those classes were Algol and PL/I — languages designed (in part) for a structured approach.

        But, yeah, not everyone came to it that way. Or took to it. There was a Star Trek text game, written in BASIC, that was popular back then (mid-1970s). I printed out the source code on a roll of paper (it was about five feet) and used colored markers to draw lines for all the GOTO statements. A very graphic illustration of “spaghetti code”! 😀

        (I might even still have it. I should look.)

There are also programmers who deliberately obfuscate their code as job security. It usually ends up more that someone else is left to figure out what the hell they meant.

        Engineers are even worse. Most of them are capable of writing enough code to make a project work, but there’s no structure or real sense. Almost stream of consciousness coding. I even knew one engineer — brilliant guy — who deliberately wrote code no one could read. Meaningless variable and function names. Kind of like those filters you can put your JavaScript through to try to make it unreadable as a form of copy protection.

      • SelfAwarePatterns

        I actually started out on an Atari 400 with the supplied BASIC cartridge, eventually graduating to 6502 source code. But all the earliest examples I had access to were spaghetti code. And the limited memory on those systems (16-48k) didn’t really reward it.

But then I went to college and had a programming course, which was easy, but it was in Pascal, so it pretty much had to be structured. It worked out though, because my first programming job was maintaining a dBase system. (Where I learned that structured programming and a 4GL was no guarantee of maintainability, although it certainly helped.)

But while I did read about 8088 assembly and did a little bit of poking around with it, I never got into serious low-level programming on the PC platform. Most of my stuff since then has been business level, or technical work in support of business functions.

        On coders who like to obfuscate, yeah, that’s why we do code reviews. Just knowing they’ll have to explain it to someone else seems to make the coding cleaner, although newbies still make the classic mistakes.

      • Wyrd Smythe

        You know, I think the Atari might be one of the very few computers I never crossed paths with. Early Apples (my cousin was into them), Commodores (I had both the 64 and 128), and lots of IBM (or other Intel) machines running MS-DOS or Windows (many of which I built from parts). The very first machine I owned was a Timex Sinclair Z80 (I still have it).

        But never an Atari. They had a great rep, though. I recall their users felt they were pretty superior machines. (I did do a lot of 6502 coding on the Commodores, though. That was fun; the 6502 is a nice simple CPU. For me that sort of thing got too complex for fun around the 386. It became true that humans couldn’t hand-write good assembly anymore without fully understanding all the caching and performance enhancing tricks of the chip. Writing a compiler became something of a Black Art.)

        It does kinda require formal training to write structured assembly. A sophisticated macro assembler makes it a bit easier (Microsoft’s MASM was pretty good that way), but one has to really understand the concepts of structured code to apply them at such a low level. Most assembly ends up being a bit spaghettified.

        “…because my first programming job was maintaining a dBase system.”

I just watched a new Tom Scott video, “The Worst Typo I Ever Made,” about when he was young and accidentally wiped out 5000 pages of a website database he was in charge of. (The video taught me a new term: onosecond. I’d never heard that one before.)

Talking about 808x assembly, my transition at TC from hardware guy to software guy owes a great deal to a TSR I wrote that our people in the call center could use to keep track of their time. (This was back when the call center system itself didn’t; it just did the phone call distribution.) They realized I had a whole other skill set that, at least for the next three decades, they considered valuable. (Once TC started buying off-the-rack, my perceived value plummeted with the eventual result that I retired early.)

        The work I did for TC was all over the map because it’s such a diverse company. I’ve done work for scientists and sales people, engineers and administrators. Given how much I crave variety, that worked out really well for me. Made up for the whole “corporate America” thing.

        As you say, even the best tools aren’t always enough. What’s that old saying about nothing being fool-proof when fools are so ingenious? (Kind of a cute human-focused version of Murphy.)

      • SelfAwarePatterns

        Strangely enough we got into the Atari 400/800 camp because of its video games. When my dad bought it, we were just interested in a much better version of the old Atari 2600 game console. But then as we were taking it out of he box, this BASIC cartridge and a small intro to programming book fell out…

        Of course, back in those days, nothing was compatible with anything else, and you effectively belonged to the cult of whichever brand of computer you owned. The Ataris were superior machines, at least so the cult of Atari held, but only for a couple years, until the Commodore 64 came out.

        But even before that, once we really got into it, I was painfully aware of how limited the software selection was compared to the Apple II. One of my many lessons about how the technically superior system often isn’t the winning one.

        The later Atari ST series had a really good reputation. One of my college buddies owned one, and so was in that cult. But I had moved on by then and lived in PC space, although with a much more agnostic attitude by then towards technology. I’ve always looked on with amusement with the various cults that have existed since (Mac, Linux, Java vs .Net vs PHP, etc).

      • Wyrd Smythe

        “But then as we were taking it out of he box, this BASIC cartridge and a small intro to programming book fell out…”

        Funny, those little moments that change our course forever. In some regards, your career began at that moment. For me it was deciding, for fun, to take a CS course in college.

        It’s nice you shared that with your dad. You’ve mentioned how you also shared Star Trek. In my family, I was alone in my love of science fiction, technology, and science. No one was opposed to them; they just didn’t get it and weren’t interested. My dad tried, but I think I was a bit of an alien to him. 🙂

        “One of my many lessons about how the technically superior system often isn’t the winning one.”

        So true. The world does not (necessarily) beat a path to your door.

        Have you ever read Neal Stephenson’s In the Beginning… Was the Command Line? His car dealership analogy cracks me up to this day. It never really has been about quality.

        As an aside: Apple fascinates me. So much of what they sell is just image. But the fascinating thing to me is how they (generally) back it up with quality and innovation. The relationship isn’t entirely rosy, but I have a great deal of regard and respect for Apple. (Microsoft is okay in my book, too. My unalloyed loathing is reserved for Google, Facebook, and Twitter, companies I sometimes view as being downright evil. (Oh, the irony of Google’s original motto.))

        I spent quite a few years working and programming under Unix, and that pretty much cured any allegiance I had to either Microsoft/Intel or Apple/Motorola. (Not that I’ve done much programming for Apple or Motorola chips, but we were all so aware of that battle, and lines got drawn.) After working in Unix a few years, at that time, both Windows and Macintosh seemed like toys.

        As Windows/Intel took over the business world (so much to IBM’s dismay; totally missed the boat on that one; they just wanted a smart terminal), even Unix took a back seat. Just around the time I left that Unix-based group they were switching from Sun boxes to Windows NT or XP.

        “I’ve always looked on with amusement with the various cults that have existed”

        Humans. We get so attached to ideas. It’s as hard for us to edit our beliefs as it is for writers to edit their own words.

      • SelfAwarePatterns

        I was lucky that my dad was into both science fiction and computers, and that he was far more open minded than most people from his background. He actually stayed interested in computers during a long period when my interest lapsed. (High school, girls, etc.) I miss him a lot.

On Stephenson, is that the one where tanks are being built for free, but the person, when presented with one, is repelled? If so, then I think I have read it.

        My attitude toward tech giants isn’t reverent or loathing. As far as I’m concerned, they all have their good points and bad. I don’t really trust any of them, especially Facebook. But I’ll use their products or services to the extent they remain useful, but won’t feel any loyalty when it fades.

        Lots of people hate Twitter, but my experience is that who you follow matters a lot. I don’t follow Trump or other clowns, but I do follow scientists, philosophers, and authors, and see some pretty interesting conversations.

      • Wyrd Smythe

Yep, that’s the one. Windows is a station wagon; Apple is a fancy European car; Linux is a free tank; and BeOS was a Batmobile. It’s a bit out of date now, but it’s always made me laugh for being pretty right on the money. (I wonder if I could get away with posting just that bit to my programming blog. The entire piece is sort of kind of in the public domain — it was originally released that way, I believe — and the entire text is available online. OTOH, it’s also a book one can (and did 🙂 ) buy. Yeah,… maybe I’ll take a chance. I can always take it down if he objects.)

        The tech tools, as we’ve talked about before, are value neutral, and obviously do have productive uses. But (as we’ve also touched on) I see the negatives as very problematic. (My focus, of course, wouldn’t be on eliminating, but fixing. That said, I’m not sure I see much value in Facebook at all.)

      • SelfAwarePatterns

        Yeah, Facebook’s only value to me is that’s where a lot of the people are. Although honestly I so rarely check my personal Facebook that it’s pointless. I only keep it around because my Facebook site, which the blog posts to, is attached to it, and it’s some people’s preferred way to follow.

      • Wyrd Smythe

        While I’ve never had a Twitter account, I used to have a FB account way back when (I deleted it many years ago). Just wasn’t for me.

  • Whither 2020 | Logos con carne

    […] I finally said good-bye to BOOL, an intentionally weird programming language I’d been designing (but never completing) for thirty years. […]

  • Brian Kernighan: Successful Language Design | The Hard-Core Coder

    […] He’s absolutely right about small languages. Doing a big one is hard to get right. […]
