----------> https://archive.ph/5FUvT
No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.
Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.
To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.
A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.
Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.
We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.
Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
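As a rough illustration, here is a minimal Python sketch of that encoding; the byte values are standard UTF-8/ASCII, and "henry.jpg" is just a hypothetical file name standing in for any image:

```python
from pathlib import Path

# The word "dog" as a computer stores it: three bytes, one bit pattern per letter.
word = "dog"
encoded = word.encode("utf-8")
print(list(encoded))                          # [100, 111, 103]
print([format(b, "08b") for b in encoded])    # ['01100100', '01101111', '01100111']

# An image is just a much longer byte pattern, prefixed by header bytes
# that tell the computer to expect an image rather than text.
photo = Path("henry.jpg")                     # hypothetical photo file
if photo.exists():
    data = photo.read_bytes()
    print(len(data), "bytes")                 # on the order of a megabyte
    print(data[:2])                           # JPEGs typically begin with b'\xff\xd8'
```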
Computers, quite literally, move these patterns from place to place in different physical storage areas etched into electronic components. Sometimes they also copy the patterns, and sometimes they transform them in various ways – say, when we are correcting errors in a manuscript or when we are touching up a photograph. The rules computers follow for moving, copying and operating on these arrays of data are also stored inside the computer. Together, a set of rules is called a ‘program’ or an ‘algorithm’. A group of algorithms that work together to help us do something (like buy stocks or find a date online) is called an ‘application’ – what most people now call an ‘app’.
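And a toy sketch of stored rules operating on those patterns; the "algorithm" here is deliberately trivial (fix a typo, brighten some pixel values), just to show that the copying and transforming is itself driven by instructions held in storage:

```python
# A toy "algorithm": stored rules that copy and transform byte patterns.
manuscript = bytearray(b"teh dog sat")            # bytes sitting in one storage area

def fix_typo(data: bytearray) -> bytearray:
    """Copy the pattern, then transform the copy by a fixed rule."""
    copy = bytearray(data)                        # copied to another location
    return copy.replace(b"teh", b"the")

def brighten(pixels: bytearray, amount: int = 20) -> bytearray:
    """Touch up a 'photo' by nudging every byte upward, capped at 255."""
    return bytearray(min(p + amount, 255) for p in pixels)

print(fix_typo(manuscript))                       # bytearray(b'the dog sat')
print(list(brighten(bytearray([10, 200, 250]))))  # [30, 220, 255]
```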
Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.
Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?

By the standards the article applies to brains, no, computers don't do any of that either. They're just funny little rocks and bits of metal with silly little subatomic particles doing wacky stuff. You can't take a computer apart and pull out a video or a picture or anything of the sort. All the "doing useful stuff" and "having information" is a high-level abstraction.
The "gotcha" of the article's author trying to get people to describe a highly abstract system that interacts with sources of information and does stuff that relates to that without using any language related to describing abstract systems that do stuff with information is just linguistic trolling. Like no shit people rely on language to talk about things, and if prevented from using any language related to a given set of abstract concepts they struggle to talk about those concepts. You'd get the same result asking a computer engineer to talk in terms of electrons, chemical engineering, and metallurgy to describe how the internet works.
He then spends the rest of it talking about holographic information storage and a process of synthesizing that back into something usable when it's recalled, smugly acting like the fact that there aren't literal JPEG files stored away in a consistent format contradicts the abstract language used to describe observable behavior. He does nothing but jump between rhetorical sleight-of-hand and wild non-falsifiable assertions, misrepresenting what he's arguing against while building a case that fails even to contradict his own strawman.
There was a study years ago (I'd love to see current LLMs used to tackle this idea again) where dream researchers matched brain scans of sleeping patients against snippets of YouTube videos. Eventually, they built up enough correlations to produce blurry but accurate "dream videos." Let me search for it...
Well, I can't find the study I meant, but newer studies are in fact using LLMs to do exactly this.
From https://greekcitytimes.com/2025/06/20/neuroscientists-can-now-turn-your-dreams-into-videos/
My point is: no, the brain is not a computer. But it certainly contains some embedding of one's experiences, and that "format" can be decoded, extracted, and represented/re-encoded digitally.
Exactly. The information present is extremely lossy, it's very idiosyncratic, and at a structural level it's radically different from database lookups or deterministic, reliable file storage, but when we talk about the system in the abstract there is still information being shuffled around and transformed. This fuzzy, holographic, synthetic way of doing things is very different in practical terms and effects, but we can still say "well, it is doing [inscrutable squishy meat process] to transform nerve signals into actionable muscle orders for balance purposes" even if we can't convert that exact process into C++, because it's not a linear logic process.
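To make that concrete, here's a toy Python sketch (emphatically not a brain model) of the kind of thing being described: a lossy, idiosyncratic embedding that gets decoded by reconstruction rather than exact lookup. All the names and numbers are made up for illustration:

```python
# Toy illustration only: a lossy, idiosyncratic "embedding" of experiences
# that is recalled by re-synthesis, not by reading a stored file back out.
import random

KNOWN_EXPERIENCES = ["dog barking", "cat on the desk", "Beethoven's 5th", "mother's face"]

def embed(text: str, dim: int = 8) -> list[float]:
    """Deterministic but lossy projection: you can't read the text back out of it."""
    seed = sum(ord(c) for c in text)          # idiosyncratic, not a meaningful format
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

def recall(trace: list[float]) -> str:
    """'Remembering' = finding the closest known experience and reconstructing it."""
    def dist(a: list[float], b: list[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(KNOWN_EXPERIENCES, key=lambda e: dist(embed(e), trace))

stored = embed("cat on the desk")
degraded = [x + random.gauss(0.0, 0.1) for x in stored]   # the trace is fuzzy and noisy
print(recall(degraded))                                    # almost certainly 'cat on the desk'
```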
We often say "the purpose of a system is what it does" in the context of social systems, meaning that the real-world results and actions of a system are its purpose. While that's not a perfect 1:1 fit here, I think the same sort of standard is appropriate in this context too: we can talk about wildly intricate and largely inscrutable systems in terms of what we see them do, in terms of the results they give. Getting lost in the weeds of neurotransmitters and scale and memory fallibility doesn't mean there's no information, or that it isn't being acted on by systems that do specific sorts of fuzzy things, whether innately or by learning to do them.
This is exactly what you're talking about. We do do this, for liberal definitions of the terms "buffer", "transfer", and "device" in the brain. Of course there's no "device" in there in the typical sense of the word!