this post was submitted on 30 Apr 2026
61 points (94.2% liked)

https://archive.ph/5FUvT

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.

To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.

A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
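
To make that concrete, here is a minimal Python sketch of the encoding just described; the byte values are standard ASCII, not anything specific to my machine:

```python
# Each letter of "dog" becomes one 8-bit byte: a particular
# pattern of ones and zeroes under the standard ASCII encoding.
word = "dog"
for letter, byte in zip(word, word.encode("ascii")):
    print(f"{letter} -> {byte:08b}")

# d -> 01100100
# o -> 01101111
# g -> 01100111
```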

Computers, quite literally, move these patterns from place to place in different physical storage areas etched into electronic components. Sometimes they also copy the patterns, and sometimes they transform them in various ways – say, when we are correcting errors in a manuscript or when we are touching up a photograph. The rules computers follow for moving, copying and operating on these arrays of data are also stored inside the computer. Together, a set of rules is called a ‘program’ or an ‘algorithm’. A group of algorithms that work together to help us do something (like buy stocks or find a date online) is called an ‘application’ – what most people now call an ‘app’.
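
And a similarly minimal sketch of moving, copying and operating on such a pattern; the uppercasing rule here is just an illustrative stand-in for any stored transformation:

```python
# Move/copy/transform in miniature: a stored rule ("algorithm")
# applied to a byte pattern, as when touching up data.
original = b"dog"               # a pattern held in one location
duplicate = bytes(original)     # the pattern copied elsewhere
transformed = original.upper()  # a rule applied to the pattern

print(original, duplicate, transformed)  # b'dog' b'dog' b'DOG'
```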

Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?

top 31 comments
[–] mrosswind@hexbear.net 18 points 1 day ago (2 children)

This article is arguing against people who overextend a metaphor comparing brains and computers, but it's doing it by taking the opposite position to a completely absurd extreme. In the framework it describes, not only is a brain not a computer, but there is no such thing as a computer, and it is impossible for one to ever be created.

It could be justified in making a prescriptive case that people should change how we think about and talk about human brains to shift away from computer analogies, but it's written descriptively with a nonsensical portrayal of what people mean when they say "information" or "memory". Many of the things this article attacks are not computer metaphors at all, they're the way language has been used long before modern computers existed.

For a simple enough computer (four function calculator or similar) a human can perform in exactly the same way as the computer for any possible input or interaction. This means that there is at least some conceptual overlap between the behaviors of a human and a computer. Not a metaphor, literal commonality. When someone says that a calculator processed numbers, it's not a statement about the mechanics or philosophical implications of what a calculator is, it's about the role the calculator is playing in transforming inputs to outputs. If a human can take on the exact same role, there's no reason they wouldn't also be processing numbers, unless we choose to categorically exclude them.
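
As a minimal sketch of that role (illustrative Python; the function is mine, not anything from the article):

```python
# A four-function calculator reduced to its role: a mapping from
# inputs to outputs. A human with pencil and paper can occupy
# exactly the same role for any of these inputs.
def calculate(a: float, op: str, b: float) -> float:
    if op == "+":
        return a + b
    if op == "-":
        return a - b
    if op == "*":
        return a * b
    if op == "/":
        return a / b
    raise ValueError(f"unknown operation: {op!r}")

print(calculate(2, "+", 2))  # 4, whether a chip or a person computes it
```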

information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers

To apply the intent of the article to this list of words, either their definitions need to be changed to explicitly say they can't be applied to humans, or they all need to be removed from the English language completely. From the examples given about brains, unless there's an arbitrary distinction, none of them would apply to any situation ever. The author seems to still want to keep using the words to describe computers, but the only justification for why the same logic used to say humans don't have memories wouldn't also apply to computers is to just state that "Computers do all of these things, but organisms do not." Sure, I guess that answers that.

I would have been interested to hear more about how specifically the computer metaphor limits our understanding, and the benefits we can gain by using the alternative of experiences and changes. The article touches on this, but spends most of its length doing a poor job of trying to prove that its framework is an objective truth about reality, rather than a lens that can be adopted or rejected.

[–] SchillMenaker@hexbear.net 7 points 1 day ago

And yet almost all the responses are people saying "YES IT IS IT'S EXACTLY LIKE A COMPUTER YOU'RE STUPID." The analogy is toxic to our prospects of understanding consciousness on a mechanistic level, and I'll gladly take a wildly imperfect essay if all it does is take a swing at it.

[–] Arahnya@hexbear.net 5 points 1 day ago

I would have been interested to hear more about how specifically the computer metaphor limits our understanding, and the benefits we can gain by using the alternative of experiences and changes

me too. I suppose we'll just have to do it ourselves.

One thing that I was thinking of was the belief that "trauma is stored in the hips." Like... specifically just in the hips, and that exercises targeting the hips will help "release the trauma." That is not to say that physical exercises can't help you & that the body holds tension, but it's more about the claim that trauma is physically and specifically in the hips, despite no evidence for this. Maybe this is related to the computer-mind metaphor.

[–] Philosoraptor@hexbear.net 32 points 1 day ago* (last edited 1 day ago)

While there are zealots who genuinely do think that the brain is literally a digital computer, I think most people would, when pressed, admit that it's an analogy. The prevailing technology of the day has a long and distinguished history of being used as a metaphor for describing thought--early modern philosophers loved to talk about mental events as a kind of clockwork mechanism, for instance--but those analogies are not generally to be taken literally. The mechanists of the 18th century didn't think that there were literally gears inside your skull; they just thought that the idea of an unimaginably intricate clockwork mechanism was a useful way to think about the functioning and organization of cognition (which it is, at least in some ways).

There's maybe more literalism about it these days than is standard, but even most of the people who take this analogy very seriously aren't saying something so trivially false as "your brain has a literal CPU and works exactly the way your laptop does." That's pretty obviously not true. But the analogy is a useful one in many ways, and can help us understand what the hell is going on in there that lets a big chunk of meat give rise to such an extraordinary phenomenon as consciousness. The observation that when I do something like add two and two in my head there must be something going on that is, in some relevant sense, functionally identical to what goes on in a desk calculator when I enter 2+2 isn't totally vapid. It's possible to take all this too seriously, and moving from "there's some amount of functional parallelism here" to "these two systems are functionally identical in general" is (I think) unwarranted. But, again, I don't really think that's the interesting thesis here.

Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog.

They really don't, at least no more literally than brains do. Computers don't "know" anything about numbers, words, formulas, images, or algorithms. They don't even know anything about 1s and 0s or bits and bytes: every single one of those things is an abstraction that helps us track the real pattern in how an extraordinarily complicated system changes over time. Computers are cleverly designed arrangements of metal, plastic, and other physical material that, when subjected to certain boundary conditions, will evolve over time in predictable ways that we can then use to model various patterns. They're physical models in about the same sense that a model airplane in a wind tunnel is a physical model of a full-sized airplane in the open air--they're just more complicated by far. There's nothing magical about this kind of arrangement that makes it "real" information processing while the brain (or anything else) is ersatz; that is, information processing just is that kind of stable, predictable physical change. Computers process information in exactly the same sense that brains do and in exactly the same sense that any other physical system does. Computers (and brains) have the virtue of combining complexity and stability in a way that lets them process a lot of information across a wide variety of contexts, given appropriate inputs and boundary conditions.

Consider the difference between a modern digital computer and something like Babbage's analytical engine. There are enormous physical differences between those two systems. The analytical engine was purely mechanical: it took its input via punch cards and stored its internal state via wooden or metal pegs inserted into rotating barrels. Is something like that "quite literally processing information" on this view? Is it encoding data as bits and bytes? Or is that something that only electronic digital computers can do? This strikes me as an obviously silly question: there are ways in which the analytical engine and my laptop are functionally similar, and ways in which they are different. Whether the similarities or differences are more salient depends on what you care about, or what kinds of things you think are important to track. Either they're both doing information processing, though, or neither of them is--there's nothing special about electronic digital computers that makes them "real" instances of computation and everything else just a simulacrum. But if the analytical engine can process information despite huge differences in material constitution and operation from an electronic computer, then surely the brain can as well. That doesn't mean a brain is a digital computer, just that (again) there are elements of similarity between the two, and that the formalism of information theory can be a useful lens for understanding the operation of both.

[–] SchillMenaker@hexbear.net 11 points 1 day ago

The comments here are right in line with the comments on the essay and they're all impotent rage against a fundamental truth brought up by the author. We have no fucking idea how our brains produce consciousness and our model of it is woefully ill-conceived. There was one comment that really nails this idea; it reads:

Understanding is not possible without metaphors. But we run into trouble when we conflate the metaphor with what it’s attempting to describe. Hence our ‘insistence’ that our immune systems function like ‘search and destroy’ vessels in the military, complete with radar and intelligence functions. Or: our inability to see the economic world as anything other than a battleground rife with ‘biological’ competition (for that matter it seems to be increasingly difficult to see biology and evolution in terms other than those derived from economics). Whatever else Epstein’s argument does or doesn’t achieve, the invitation to move beyond a metaphor that has so inordinate a stranglehold on our thinking ought, surely, to be welcomed?

First of all, kind of a bummer that the author's name is Epstein, but I'll move past it. The fundamental point is that the brain does not function like a computer and attempting to build upon this false model is counterproductive. We can use computers to mimic some functions the brain does, yes, but that's not because they're mechanistically similar at all. I've talked to neurobiologists, psychologists, and neurologists about the mechanisms of consciousness whenever I meet one and every single (competent) one says the same thing: we have no fucking clue. Sure, we've gathered evidence that we didn't have before and probably know more than we used to know, but on a fundamental level it's basically magic.

It is uncomfortable to not understand something so people naturally create a mental model and then stick to it in order to alleviate that discomfort. That's not how you figure new things out, that's how you get stuck.

Clearly, the brain has a mechanism for storing and processing information because we do it. We are also born with information pre-loaded. Not only does it contain, store, and add information, but it is incredibly plastic. When we remove some part of the brain that information previously flowed through, the brain has an incredible ability to use alternate pathways to generate that equivalent information. If the brain operated through set algorithms and patterns, that would be impossible. Imagine if you could unplug your hard drive, but since the computer needs the information on it, you could osmose certain things through the silicon and onto the motherboard anyway. That doesn't make any sense with computers, but your brain does it. Somehow information is communicated not only by the pattern of action potentials, but also by the frequency and strength of those action potentials. If that's like a computer, it's a kind of computer that is nothing like the computers that exist and is so much more complicated as to be a fundamentally different thing.

[–] TiredDinoByte@lemmy.today 5 points 1 day ago

As a neuroscience major turned software engineer, I think the main problem with the metaphor is that most people understand neither how a computer works nor how a brain does.

[–] KobaCumTribute@hexbear.net 28 points 1 day ago (2 children)

Computers, quite literally, process information – numbers, letters, words, formulas, images.

By the standards this is applying to brains, no they don't. They're just funny little rocks and bits of metal with silly little subatomic particles doing wacky stuff. You can't take apart a computer and pull out a video or picture or anything of the sort. All the "doing useful stuff" and "having information" is a high level abstraction.

The "gotcha" of the article's author trying to get people to describe a highly abstract system that interacts with sources of information and does stuff that relates to that without using any language related to describing abstract systems that do stuff with information is just linguistic trolling. Like no shit people rely on language to talk about things, and if prevented from using any language related to a given set of abstract concepts they struggle to talk about those concepts. You'd get the same result asking a computer engineer to talk in terms of electrons, chemical engineering, and metallurgy to describe how the internet works.

He then spends the rest of it talking about holographic information storage and a process of synthesizing that back into something usable when it's recalled, smugly acting like the fact that there aren't literal jpeg files stored away in a consistent format contradicts abstract language that's used to describe observable behavior. Like he does nothing but jump between rhetorical sleight-of-hand acts and wild non-falsifiable assertions, misrepresenting what he's arguing against while building a case that fails to even contradict his own strawman.

[–] codexarcanum@lemmy.dbzer0.com 19 points 1 day ago (1 children)

There was a study years ago (I'd love to see current LLMs used to tackle this idea again) where dream researchers took brain scans of sleeping patients and correlated them with snippets of YouTube videos. Eventually, they built up enough correlations to get blurry but accurate "dream videos." Let me search for it...

Well, I can't find the study I meant, but newer studies are in fact using LLMs to do exactly this.

The research, led by cognitive neuroscientist Professor Yukiyasu Kamitani, involved participants sleeping in MRI machines while their brain activity was continuously monitored. During rapid eye movement (REM) sleep—the stage where most vivid dreaming occurs—participants were intermittently woken and asked to describe what they had just dreamed.

Their verbal accounts were then used to train an AI model to associate patterns of brain activity with specific imagery. Eventually, the AI was able to predict what participants were dreaming about with remarkable consistency—generating rough, yet recognisable visuals of faces, animals, landscapes and even text.

From https://greekcitytimes.com/2025/06/20/neuroscientists-can-now-turn-your-dreams-into-videos/

My point is, no, the brain is not a computer. But it certainly contains some embedding of one's experiences, and that "format" can be decoded, extracted, and represented/re-encoded digitally.

[–] KobaCumTribute@hexbear.net 15 points 1 day ago

Exactly, like the information present is extremely lossy, it's very idiosyncratic, and at a structural level it's radically different from database lookups or deterministic and reliable file storage methods, but when we talk about the system in the abstract there is still information being shuffled around and transformed. This fuzzy, holographic, synthetic way of doing stuff is very different in practical terms and effects, but we can still say "well it is doing [inscrutable squishy meat process] to transform nerve information into actionable muscle orders for balance purposes" even if we can't convert that exact process into C++ because it's not a linear logic process.

We often say "the purpose of a system is what it does" in the context of social systems, to say that the real-world results and actions of a system are its purpose, that they're what it does. While that's not a perfect 1:1 fit here, I think that same sort of standard is appropriate in this context too: that we can talk about wildly intricate and largely inscrutable systems in terms of what we see them do, in terms of the results they give. Getting lost in the weeds of neurotransmitters and scale and memory fallibility doesn't mean there's no information or that it's not being acted upon by systems that do specific sorts of fuzzy things, whether innately or by learning to do them.

[–] Abracadaniel@hexbear.net 11 points 1 day ago

We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device.

This is exactly what you're talking about. We do do this, for liberal definitions of the terms "buffer", "transfer", and "device" in the brain. Of course there's no "device" in there in the typical sense of the word!

[–] Camden28@hexbear.net 9 points 1 day ago

We don’t store words or the rules that tell us how to manipulate them.

  • "i before e except after c"
  • "believe contains a LIE"
  • hear/here they're their there
  • spectral colours = ROY G. BiV (if you include indigo)
  • Great Lakes = HOMES
  • "negative b, plus or minus the square root of: b squared minus 4ac, all over 2a"
  • "dy/du du/dx"

I'm not saying we know how that information is stored, but without seeking the information, those rules, spellings, and how to use them automatically pop into my head without delay. I understand subject/verb agreement, and while I often fail to stick to proper use of tenses, I can usually catch my own failures when re-reading what I've written.

I can't draw a dollar bill. I can't because human brains are great at pruning. We purposefully don't keep a record of most of what we experience. We are highly lossy from the outset, and delete details over time. To put a computer in the same position: if I hook up a camera and microphone to my computer but don't actually record anything, the computer is not going to replicate a dollar bill, either. If I take a jpeg or make a short, highly compressed mp4 of a dollar bill as seen through fumbling fingers at the checkout, that image also won't have much detail for a computer to copy, and a generic computer would be unlikely to even know the bill has a rectangular shape. Sure, you could write a program specifically to figure out the planar dimensions of such an object, but that is a lot of work for partial recognition.

Also, the definition of computer has changed over time. It used to be a profession for humans good at math. Now we only think of those boxes of equipment that require an operating system, supporting software, peripherals, and electric current. More than that, I don't know who thinks brains are at all like computers given computers are completely lacking emotional responses. Yes, I know you can set up 'points' to get them to favor one result or another, but they don't get frustrated and rage quit if they can't score. Yeah, they seg fault and blue screen, but not out of frustration, or love, or boredom or from distractions. Computers and people have different designs for different functions. Can we leave it at that?

[–] PaulSmackage@hexbear.net 10 points 1 day ago

My brain is actually powered by two 6L6 vacuum tubes and has a built in tape deck

[–] postscarce@lemmy.dbzer0.com 8 points 1 day ago (1 children)

A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

These are all examples of algorithms. There are very obvious differences between a brain and a computer but they both also have huge similarities, which is why the analogy is so easy and widespread. When we make an analogy we are not saying ‘X is Y’, we are saying that in some ways X seems like Y, and that is often enough for us to build a better understanding of one or both of those things.
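
As a hedged sketch of what "algorithm" means here (illustrative Python; the mapping is mine, drawn from the reflexes the article lists, and says nothing about how neurons implement it):

```python
# Newborn reflexes written as a stimulus -> response rule. This
# captures the input/output shape of the behaviour, not the
# mechanism that realises it.
def reflex_response(stimulus: str) -> str:
    responses = {
        "cheek_brushed": "turn_head_and_suck",
        "submerged_in_water": "hold_breath",
        "object_in_palm": "grasp",
    }
    return responses.get(stimulus, "no_reflex")

print(reflex_response("cheek_brushed"))  # turn_head_and_suck
```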

A plane is like a bird; both fly because their wings are shaped such that there is higher air pressure on the underside, but planes don’t flap their wings and they aren’t made of flesh. The analogy can be taken too far but that doesn’t mean that it isn’t useful within specific constraints.

So you are right that a brain is not a computer, but that doesn't mean that:

  1. the analogy can’t be useful to help us to understand either system
  2. a brain could not be fully simulated in a computer (I’m not saying it could, there may be fundamental reasons why it can’t, but the fact that they are made of different materials and process information in different ways is not sufficient)
  3. we could never read a memory from a brain (similar to 2, I’m not saying we definitely could, but it may be possible, and in fact there has been interesting work in reconstructing visible images and abstract ideas from brain scans of volunteers already)

[–] Dessa@hexbear.net 9 points 1 day ago

Yeah, and the article says we don't store memories, but the concept of memory predates digital computing by probably hundreds of thousands of years. The word predates it by several thousand at least. Maybe it's that we describe computers using human terms, not the other way around

[–] MLRL_Commie@hexbear.net 11 points 1 day ago

Honestly this was a really good analysis of how the basic processes can't be mapped except by analogy. This doesn't undermine the arguments perfectly (two basic processes can form a system with identical inputs and outputs to a different system), but it definitely clarifies that the way many people think about brains can't be correct

[–] MnemonicBump@lemmy.dbzer0.com 5 points 1 day ago (1 children)

Did you miss the definition of an analogy?

[–] Arahnya@hexbear.net 3 points 1 day ago (2 children)

at what cost have we used the analogy? great? little? of no consequence? 🧐

[–] SchillMenaker@hexbear.net 3 points 1 day ago (1 children)

An analogy is like a thing that's like a different thing but it's not actually that thing. So when I stole your wallet I think the best way to describe it is that it's like when your job direct deposits money into your account on pay day. Does that help you understand?

[–] Arahnya@hexbear.net 1 points 1 day ago (1 children)
[–] SchillMenaker@hexbear.net 1 points 1 day ago (1 children)

I'm using sarcasm to create a rhetorical analogy in which the things are not similar but since I'm calling it an analogy I am automatically correct, much in the way that the analogy of a brain to a processor and microcode must be correct because we're calling it an analogy.

So to answer your question I'm not talking to anybody I'm shouting into an endless void.

[–] MnemonicBump@lemmy.dbzer0.com 2 points 1 day ago* (last edited 1 day ago) (1 children)

??? Analogies cost nothing. They're free

Edit: I'm dumb, I get it.

I would say to very little consequence. The average person doesn't need to know the detailed workings of the human brain, they just need to know that the brain is the thinking part.

[–] Arahnya@hexbear.net 1 points 1 day ago

that's okay! I am more interested in hearing your reasoning anyways. 😸

[–] insurgentrat@hexbear.net 10 points 1 day ago* (last edited 1 day ago)

Really well-put article. I had seen that sketch exercise incidentally before but never really connected it to a demonstration that memory is not storing records of a thing.

Given the experience of how memory is cued, I wonder if it's better thought of as a way of transforming sense data (ahh i'm doing it!) into experience-shaped things. Like when you try and recall, idk, a beachball, are you using beachball-recognising networks to try and make your current experience more beachball-like? I'm off the deep end with this speculation though, cool stuff.

[–] mr_sunburn@hexbear.net 7 points 1 day ago (1 children)

The "brain is a computer" model is a useful metaphor for understanding some parts of how the brain works. There are those who think there's a stronger computational theory of mind, but mostly people who study this professionally would concede the instinctual and language points you've identified as matters of fact.

What do you see as the dangers of over-reliance on this metaphor? Dehumanization? Some sort of spiritual dimension?

[–] InexplicableLunchFiend@hexbear.net 9 points 1 day ago* (last edited 1 day ago) (1 children)

Techno-utopianism and obsession with "digitization" of oneself for eternal life. A fruitless pursuit of hellish torture technology that will not duplicate actual humans, but risks creating something dangerous or unethical instead. Musk-esque brain chip obsession, the pursuit of medical and surveillance technologies that attempt to monitor or merge the human mind with computational devices.

Tons of wasted resources on a dead-end "fountain of youth" that will just result in mass torture of monkeys and human test subjects

[–] mr_sunburn@hexbear.net 1 points 1 day ago

Ah, in that case, I have a solution: simply end the existence of Technocrats.

[–] TheSovietOnion@hexbear.net 2 points 1 day ago

What do you guys think of Deleuze & Guattari's views on how the brain works in the conclusion of "What is Philosophy?"

[–] tamagotchicowboy@hexbear.net 5 points 1 day ago (1 children)

Because if you don't hype, then no funding for you; there's also the whole oversimplification of ideas and references needed to get them across. Yea, computing isn't a great metaphor, but there's also the longstanding issue of the dichotomy (your brain is a computer or it isn't), as shown here; that's not very good material analysis, do better bear website, you don't need to know anything of neurosci to pick up on that. Before there was computing to mess with our metacognition as we study or attempt to understand our own brains, there was (and is) religion, spirituality, the occult, etc. Where do you think the name for the cortical homunculus came from?

This piece forgets that the brain and nervous system are embodied in meat and made of it, and you can manipulate meat. Yes, you can't store things like a USB drive, and the brain's best conjecture is messy and lossy, but you can certainly recreate responses from one sea slug to the next with some careful adjustment. And TBI or diseases that cause prefrontal or hippocampal lesions would certainly put to the test that whole concept of memory not being stored anywhere for even 5s. What's memory consolidation, then? It's not happening in the ether, I hate to say.

Then let's not forget basic reflexes. I'll die on the pattern-recognition hill (no one dies on that hill, it's fortified). Things like retinal ganglia totally do "process" edges in vision, and, to continue with vision, seemingly specialist areas of the brain like the fusiform face area do something to modulate our interpretation of individual types of things.

[–] infuziSporg@hexbear.net 2 points 1 day ago

you can certainly recreate responses in say one sea slug to the next with some careful adjustment

Was that the organism where they blended it up and then reinjected the cellular material, and the new organism responded to the previously trained stimuli?

JP from Grandma's Boy in absolute shambles.