Just last March I asked, Am I Over NCIS? The question seems even more pressing given the NCIS season 16 finale. (Spoiler warning on the season, not to mention any and all previous seasons.) I’ve never been this mixed in my feelings regarding the characters, and the off-screen personal stuff is especially disturbing given other ugly entertainment-related realities that have been uncovered recently.
There is additional pressure from time in the saddle as well as from how viewing habits have changed (both mine and the world’s). Weekly episodes of commercial-filled broadcast TV seem increasingly quaint somehow. And sixteen seasons — most of them 24 episodes — is a lot of NCIS (378 episodes; over 260 hours).
All in all, for me the sun may well be setting on NCIS.
After the 2016 election I posted this picture:
Trapped in the past!
I should have known better. From where I sit, Verizon has always had something of a stench I couldn’t quite identify. There was just something about that company that rubbed me the wrong way.
Now I realize it’s because they’re a bunch of fucking assholes who don’t give two shits about their customers. And, based on my horrible, terrible, very bad experience with them (never again, never again), don’t give two shits about new customers. And I’m beginning to think all technology companies, perhaps all companies, no longer even pretend to care about their customers.
This seems just one more way we’ve seriously lost our way culturally.
I had thoughts about a second May Mandelbrot post that got a bit deeper into the weeds, but a couple of attempts today went nowhere (except the trashcan). But I’ve been having some fun exploring the Mandelbrot with Ultra Fractal, and I thought some pictures might be worth a few words.
Click on any to see bigger versions.
I realized that, if I’m going to do the Mandelbrot in May, I’d better get a move on it. This ties to the main theme of Mind in May only in being about computation — but not about computationalism or consciousness. (Other than in the subjective appreciation of its sheer beauty.)
[click for big]
I’ve heard it called “the most complex” mathematical object, but that’s a hard title to earn, let alone hold. Its complexity does have attractive and fascinating aspects. For most, its visceral visual beauty puts it miles ahead of the cool intellectual poetry of Euler’s Identity (both beauties live on the same block, though).
For me, the cool thing about the Mandelbrot is that it’s a computation that can never be fully computed.
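The iteration behind that never-finished computation is remarkably simple. Here’s a minimal Python sketch (the function name and iteration cap are my own choices, not anything canonical): for a point c, we iterate z → z² + c and watch whether z escapes. Points that never escape belong to the Mandelbrot set — but any program can only test “never” up to some finite cutoff, which is exactly the sense in which the full computation can never be completed.

```python
def mandelbrot_escape(c, max_iter=100):
    """Iterate z -> z*z + c starting from z = 0.

    Returns the iteration count at which |z| exceeds 2 (it has
    then provably escaped to infinity), or max_iter if it never
    escapes within the cap. A return of max_iter only *suggests*
    membership in the set -- we can never check infinitely many
    iterations, so the answer is always provisional.
    """
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# c = 0 sits at the heart of the set (0 -> 0 -> 0 ...),
# while c = 1 blows up quickly (0 -> 1 -> 2 -> 5 -> ...).
print(mandelbrot_escape(0))  # 100 (hit the cap; presumed in the set)
print(mandelbrot_escape(1))  # 2 (escaped almost immediately)
```

Rendering an image just means running this test for a grid of complex points and coloring each pixel by its escape count — raising `max_iter` sharpens the boundary but never finishes the job.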
Did someone say walkies?
I’m spending the weekend dog-sitting my pal, Bentley (who seems to have fully recovered from eating a cotton towel!), while her mom follows strict Minnesota tradition by “going up north for the weekend.” So I have a nice furry end to the two-week posting marathon. Time for lots of walkies!
As a posted footnote to that marathon, this post contains various odds and ends left over from the assembly. Extra bits of this and that. And I finally found a place to tell you about a metaphor I stumbled over long ago and which I’ve found quite illustrative and fun. (It’s in my metaphor toolkit along with “Doing a Boston” and “Star Trekking It”.)
It involves the idea of making a bad ROM call…
Last Friday I ended the week with some ruminations about what (higher) consciousness looks like from the outside. I end this week — and this posting mini-marathon — with some rambling ruminations about how I think consciousness seems to work on the inside.
When I say “seems to work” I don’t have any functional explanation to offer. I mean that in a far more general sense (and, of course, it’s a complete wild-ass guess on my part). Mostly I want to expand on why a precise simulation of a physical system may not produce everything the physical system does.
For me, the obvious example is laser light.
I’ve been on a post-a-day marathon for two weeks now, and I’m seeing this as the penultimate post (for now). Over the course of these, I’ve written a lot about various low-level aspects of computing, truth tables and system state, for instance. And I’ve weighed in on what I think consciousness amounts to.
How we view, interpret, or define consciousness aside, a major point of debate involves whether machines can have the same “consciousness” properties we do. In particular, what is the role of subjective experience when it comes to us and to machines?
For me it boils down to a couple of key points.
Philosophical Zombies (of several kinds) are a favorite of consciousness philosophers. (Because who doesn’t like zombies? (Well, I don’t, but that’s another story.)) The basic idea involves beings who, by definition, [A] have higher consciousness (whatever that is) and [B] have no subjective experience.
They lie squarely at the heart of the “acts like a duck, is a duck” question about conscious behavior. And zombies of various types also pose questions about the role subjective experience plays in consciousness and why it should exist at all (the infamous “hard problem”).
So the Zombie Issue does seem central to ideas about consciousness.
In one of the more horrific examples of virtual personal enslavement in the service of philosophy, another classic conundrum of consciousness involves a woman confined for her entire life to a deep dungeon with no color and no windows to the outside. Everything is black, or white, or a shade of gray.
The enslaved, unfortunate Mary has a single ray of monochromatic (artificial) light in her dreary existence: She has an electronic reader — with a black and white screen — that gives her access to all the world’s knowledge. In particular, she has studied and understands everything there is to know about color and how humans perceive it.
Then one day someone sends Mary a red rose.