Ah, the “your brain is literally a computer” crowd.
Sparked by a viral tweet claiming as much, the midwittery went off the charts yet again on that cursed platform. Take this gem, posted in response to a denial of the claim:

To which I say:
Depends what you mean by wet noodle. If you mean a specific pasta-type dish that humans most commonly use for food, no, the brain is not a wet noodle. But if you mean a “thing in water that noodles,” then yes, your brain literally is a wet noodle.
But, you ask, why do I refuse to see that the brain is just a computer, nay, literally a computer? After all:
I think this person doesn’t go far enough. Let me reframe it for a wider perspective:
It uses electricity, it runs a program that produces experience, there's wiring, there's temperature control.
Your brain literally is a… film projector.
That’s right. You can use any metaphor you want to talk about human brains, or human consciousness. And it has been done, depending on the technology du jour: the brain is like a clock, a machine, a telegraph network, whatever. Some midwits back in the day probably made the case that it is literally such a thing, too.
But nah, we ain’t clocks, that’s just stupid. We are computers!
Except that, of course, analogies are meant to draw attention to certain similarities, not to proclaim identity. And the human body, including the brain, certainly has machine-like properties. The brain-computer analogy has some (limited) usefulness. But as has been pointed out, proper reasoning by analogy crucially involves our ability to see not just the similarities, but the differences, too. In fact, the differences should be so obvious that one doesn’t even have to verbalize them: when we say “this library is like a candy shop,” we don’t need to explain that these are two very different things. We just get it. It’s not difficult, folks, once you engage your right brain hemisphere, I suppose.

Back to our timeline from hell. Another classic piece of midwittery occurred in the replies, responding to the point that meta-cognition is a problem for functionalist accounts of consciousness:

And Dawkins accused philosophers and theists of producing gobbledygook! But yeah, observing my own thoughts as they float down a river like petals, or emerge from primal lakes as bubbles, feeling, seeing, deeply understanding them, embodying them, even while I’m aware of my surroundings and not thinking them at all, must all be due to 1st-order thought outputs being plugged into 2nd-order thought inputs. Hurray, enlightenment!
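If that “account” sounds suspiciously thin, it’s because it is: stripped of the jargon, the whole picture being gestured at fits in a few lines of code. A toy sketch, with purely hypothetical names - a caricature of the position, not anyone’s actual model:

```python
# A toy caricature of the functionalist story: "meta-cognition" is
# just a 2nd-order function whose input is a 1st-order function's
# output. All names here are hypothetical illustrations.

def first_order_thought(stimulus: str) -> str:
    """A 1st-order process: produces a 'thought' from a stimulus."""
    return f"thought about {stimulus}"

def second_order_thought(thought: str) -> str:
    """A 2nd-order process: a 'thought' about a thought."""
    return f"I notice that I am having a '{thought}'"

# Plug the 1st-order output into the 2nd-order input - and on this
# account, that wiring diagram simply *is* introspection.
print(second_order_thought(first_order_thought("petals on a river")))
```

That is the whole mechanism on offer. Whether it touches anything like the lived experience described above is, of course, the entire question.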
In fairness, you can make such arguments. That’s what makes these debates so tedious: if you are committed to seeing humans as machines, and brains as computers, you can reframe it all to fit your narrative. The same is true for theist (or atheist), Darwinist (or creationist), materialist (or idealist) arguments.
We run here into the same issue brought to our attention by the postmodernists, the issue we are dealing with when studying history: yes, there are some facts that can settle certain debates, especially when taken together. But mostly, we can construct almost any narrative we want given the available sources. What we see, and how we frame it, therefore depends on the quality of our minds. Someone who, for example, simply can’t see a certain historical event as anything other than a cartoonish propaganda story downloaded into his NPC brain (remember, if everything’s computer, NPCs are just more efficient computer), also won’t see it differently. And if he is well-informed about the sources, he will use them splendidly to justify his position. Of course, he will also look stupid to those with a bigger vision, but it will be hard to argue with him precisely because he is stupid only from the perspective of that bigger vision.
Let those who cannot fathom framing human experience in any way other than autistic computer-talk do their thing. Midwits gonna midwit. But we can learn some valuable lessons from the spectacle: such as that midwittery seems to be largely a left brain hemisphere phenomenon; that the NPC mind, being functionally most like computer, only sees computer (see my piece about philosophical zombies).
Above all, the silliness of all those “brain is a computer” takes, obvious to those with the soul to see it, demonstrates how off-base the thought grooves - the patterns of framing reality - of our current age are. Just like at the beginning of the postmodern age, we must once again attempt to find a new language, new ideas - which often means creatively reframing old ideas - to get us out of this funk. We must start shamelessly talking again about mind from the perspective of mind, about wholes from the perspective of a whole, about the higher realms from the perspective of a being connected to those realms, about our interaction with the totality of the cosmos from the perspective of a node - not in the computer-network sense, but in the sense of an embedded being that has access to the whole because it is both part of and identical with the whole. Since we can’t really escape the thought patterns of our age, this is a difficult task, and we can (and should) only go so far. But it must be done if we are to gain a bigger perspective in times that demand it. Otherwise we will drown in old, obsolete battles, in new attempts at thought control and brainwashing, and in a new reality that demands new modes of thought to direct our attention and focus.
Materialism was dead by the 1920s/1930s. But then the computer was invented, which gave midwits and NPCs their dream metaphor to confuse with reality. They put materialism on life support and resurrected its corpse, creating a zombie that haunts us to this day. (It also didn’t help that, in a sort of grand midwit conspiracy, the neo-Darwinists came up with their nonsense around the same time; thus was created the materialist-neo-Darwinist garbage still being poured into every schoolbook.)
The good news is that this lunacy has forced a whole array of thinkers with a bigger vision, with superior minds, to come out against it all. The garbage is on its way out. At this point, it’s merely amusing that it still exists and that people still embarrass themselves by writing tweets such as the one that started this recent blowout.
Don’t forget to subscribe if you haven’t already. To help me keep the lights on, you can become a paid subscriber (you’ll also receive paid-only posts). A million thanks to my supporters.
What's so striking about computer and information technologies is that the "materialism" they inspire is founded on an immaterial concept: "information." This struck certain serious thinkers, like John von Neumann and more obscure figures associated with the early days of cybernetics, as a deep mystery.
Today's midwits don't have that kind of depth to their thinking. Many of them are actively hostile to it because "I can build machine".
Comparing AI with any living intelligence, what is striking is not merely that it is a different kind of cognition, but that whole aspects of life are completely missing. Even the most advanced AI systems display no shred of agency, volition, intentionality, desire, self-reflection, autonomy, or goal-directedness. It is even contentious whether generative AI has, in and of itself, any creative or original impulse beyond what a human agent prompts out of it and what it has soaked up from the collective's data. If you never feed a chatbot an input, it remains forever a completely passive black box, doing nothing. Yet there are some people who deny this for some reason....
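To make the passivity point concrete, here is a minimal sketch (assuming nothing beyond the standard library; `generate` is a hypothetical stand-in for any model's inference call, not a real API):

```python
# Minimal sketch: a chatbot is reactive by construction. The model
# only runs inside the input handler, so with no prompt there is no
# activity at all. `generate` is a hypothetical stand-in for an LLM
# inference call.

def generate(prompt: str) -> str:
    """Stand-in for a model's inference step: prompt in, text out."""
    return f"(model output conditioned on: {prompt})"

def chatbot_loop() -> None:
    while True:
        prompt = input("> ")      # execution blocks here indefinitely
        if not prompt:
            continue              # no input, no inference, no output
        print(generate(prompt))   # the model computes only at this point

if __name__ == "__main__":
    chatbot_loop()
```

Until `input()` returns, nothing downstream ever executes: the whole system is a passive function waiting to be applied.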