Comment by observationist

5 days ago

This is silly. It's the sum of electrical and chemical network activity in the brain. There's nothing else it can be. We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.

Your mind is the state of your brain as it processes information. It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.

There's no scientific evidence that anything more is needed to explain everything the mind and brain does. Electrical and chemical signaling activity is sufficient. We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain. The scale of our experiments has been coarse - we can only read from and write to large populations of neurons at a time - but all the evidence is consistent.

There's not a single rigorously documented phenomenon, experiment, or any data in existence that suggests anything more than electrical and chemical signaling is needed to explain the full and wonderful and awe-inspiring phenomenon of the human mind.

It's the brain. We are self-constructing software running on 2 lb chunks of fancy electric meat stored in a bone vat with a sophisticated network of sensors and actuators in a wonderful biomechanical mobility platform that empowers us to interact with the world.

It explains consciousness, intelligence, qualia, and every other facet and nuance of the phenomena of mind - there's no need to tack on other explanations. It'd be like insisting that gasoline also requires the rage of fire spirits in order to ignite and power combustion engines - once you get to the point of understanding chemical combustion and expansion of gases and transfer of force, you don't need the fire spirits. They don't bring anything to the table. The scientific explanation is sufficient.

Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle. We don't need fire spirits animating cortical stacks, or phlogiston or ether or spirit.

Could spirit exist as a distinct, separate phenomenon? Sure. It's not intrinsic to subjective experience, consciousness, and biological intelligence, though, and we should use the tools of rational thinking when approaching these subjects, because a whole lot of pseudo-scientific BS gets passed off as legitimate scientific and philosophical discourse without having any firm grounding in reality.

We are brains in bone vats - nothing says otherwise. Unless or until there's evidence to the contrary, let that be enough.

I think you misunderstood the person you're responding to. They did not say there was some higher force beyond the physical pieces.

What they're saying is that the brain is really really complicated and our understanding of biology is far too rudimentary right now to be saying "yes, absolutely, 100% sure that we know the nature of consciousness from this one measurement of one type of signal".

* Neurons are very complex and all have unique mutations from one another

* Hundreds of other types of cells in the brain interact with them and each other in ways we don't understand

* The various other parts of the body chemically interact with the brain in ways we don't understand yet, like the gut microbiome

Trying to flatten all of consciousness to one measurement is just not sufficient. It's like trying to simulate the entire planet as a perfect sphere of uniform density. That works OK for some things but falls apart for more complex questions.

  • I get that, but there's no need to complicate things unnecessarily.

    I'll make an even stronger claim: that biological brains are not only computers, but that they operate in binary, as well. Active and inactive - the mechanisms that trigger activation are incredibly nuanced and sophisticated, but the transfer of information through the network of biological neurons is a matter of zeroes and ones. A signal happens, or doesn't. Intensity, from a qualia perspective, ends up being a matter of frequency and spread, as opposed to level of stimulation. That, in conjunction with all sorts of models of brain function, is allowing neuroscience to make steady, plodding progress in determining the function and behavior of different neurons, networks, clusters, and types in the context in which they are found.
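    The all-or-nothing/frequency idea can be sketched with a toy leaky integrate-and-fire neuron - a standard textbook abstraction, not a model of any real neuron, and every parameter below is an arbitrary illustrative value:

```python
# Minimal leaky integrate-and-fire neuron. The spike itself is binary
# (it fires or it doesn't); stimulus intensity shows up as firing
# frequency, not as "bigger" spikes.

def simulate_lif(input_current, steps=1000, dt=1.0,
                 tau=20.0, threshold=1.0, v_reset=0.0):
    """Return the number of spikes fired over the simulation."""
    v = v_reset
    spikes = 0
    for _ in range(steps):
        # leaky integration toward the input: dv/dt = (-v + I) / tau
        v += dt * (-v + input_current) / tau
        if v >= threshold:
            spikes += 1   # all-or-nothing event: a 1, with no amplitude
            v = v_reset   # membrane resets after firing
    return spikes

# Stronger input does not change what a spike is; it only makes
# spikes more frequent:
weak = simulate_lif(input_current=1.2)
strong = simulate_lif(input_current=3.0)
```

    Running this, a sub-threshold input (say 0.5) never fires at all, while inputs above threshold fire at a rate that grows with intensity - the "frequency, not level of stimulation" point in miniature.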

    All else being equal, at the rate neuroscience is proceeding, we should be able to precisely simulate a human brain, in functionally real time, using real brain networks as models, by around 2040. We should have a handle on every facet of brain chemistry, networking, electrical signaling, and individual neuronal behavior based on a comprehensive and total taxonomy of feature types down to the molecular level.

    Figure out the underlying algorithms and you can migrate those functional structures into pure code. If you can run a mind on code, then it doesn't matter whether you're executing a sequence of computations in a meat brain, in a silicon chip, or using a billion genetically engineered notebook monkeys to painstakingly and tediously do the computations and information transfer manually, passing sheets of paper between them. (The monkeys, of course, could not operate in real time.)
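    The substrate-independence point can be sketched with a deliberately trivial computation. XOR here is purely a hypothetical stand-in for "the underlying algorithm"; the same function is realized by two very different implementations:

```python
# Toy illustration of substrate independence: one computation (XOR),
# two unrelated "implementations". If the input/output behavior is
# identical, the computation doesn't care what realizes it.

def xor_logic(a: int, b: int) -> int:
    """XOR as plain boolean logic."""
    return int((a or b) and not (a and b))

def xor_threshold_net(a: int, b: int) -> int:
    """XOR as a tiny network of all-or-nothing threshold units,
    loosely analogous to binary-spiking neurons."""
    step = lambda x: 1 if x >= 0 else 0
    h_or = step(a + b - 1)    # hidden unit: fires if either input is on
    h_and = step(a + b - 2)   # hidden unit: fires only if both are on
    return step(h_or - h_and - 1)  # output: OR and not AND
```

    Both return identical outputs for every input, which is the (contested) claim about minds writ small: that what matters is the computation performed, not the material performing it.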

    There won't be another significant phase change, like the historical shift from hydraulic metaphors of mind to the computational equivalence we recognize now. Computation is what the brain actually, physically is, at the level of electrical signals and molecular behaviors. It's just extremely complex and sophisticated and elegantly interwoven with the rest of the human organism.

    Brain-gut interactions aren't necessary for human subjective experience or cognition. You could remove your brain entirely from your skull, while maintaining an equivalent level of electrical and chemical signaling from an entirely artificial platform of some sort, and as long as the interface between the biological and synthetic maintains the same signaling frequency, chemistry, and connectivity, then it doesn't matter what's on the synthetic end.

    There are independently intelligent aspects to things like the gut biome, and other complex biological systems. Those aren't necessary for brains to do what brains do, except in a supportive role. Decouple the nutrition and evolutionary drives from the mind, and you're left with a fairly small chunk of brain - something like 5B neocortical neurons is the bare minimum of what you'd need to get human level intelligence. Everything on top of that is nice to have, but not strictly necessary from a proof of concept perspective.

    • Isn't it strange that the most elusive thing in the universe - consciousness - so neatly fits the scientific framework of today? It doesn't mean that a pretty good imitation of mind can't be created in the machine world, unfortunately.

There’s nothing in known physics that explains consciousness. I agree about the rest, but consciousness not only defies explanation by known physics, it’s so far beyond what’s known that there isn’t even any concept of what it could be. We barely have the ability to describe it, let alone explain it.

  • What's special about consciousness? I assume it is just a sequence of thoughts/images inside the brain and that's all? Like when a cat sees cheese and an image of its taste appears instantly and motivates it to come closer. Humans are the same, I think.

    • What’s special is qualia. Subjective experience. Thoughts and images could occur within a brain without that, and there’s no explanation for how there’s a subjective experience of those things, or even a serious notion of what is doing the experiencing, beyond vague handwaving about “consciousness.” But something is.

  • Consciousness is very interesting because if you postulate that you can't possibly create it by running a Turing machine, then anything that is simulatable can't be the mechanism behind it. Which would raise the followup question, what is? My money is on some quantum effect.

    • Or rather that the whole premise is wrong? I say, take the Wittgensteinian stance on the matter, and who cares that it's been 50 years.

    • You can produce some rich audio and visual effects in the form of music and movies that can be played on the Turing machine that is a laptop. I think it's possible that consciousness is along those lines.

  • But is consciousness even a thing? Shouldn't the burden of proof be on the one making a claim that there exists something called consciousness? If they cannot show evidence that such a thing exists or may exist, then for all purposes it does not exist.

> Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle.

Where did you get that? That's not an established scientific result, it's a philosophical stance (strong physicalist functionalism) expressed as if it were empirical fact. We cannot simulate a full human brain at the correct level of detail, we cannot record every spike and synaptic change in a living human brain, and we do not have a theory that predicts which neural organizations are conscious just from first principles of physics and network topology.

> We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain

That shows dependence of experience on brain activity, but dependence is not the same thing as reduction or explanation. We know certain neural patterns correlate with pain, color vision, memories, etc., and we can causally influence experience by interacting with the brain.

But why is any of this electrical/chemical activity accompanied by subjective experience, instead of just running as a complex zombie machine? The ability to toggle experiences by toggling neurons shows connection, and that's it; it doesn't explain anything.

> We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.

We do have a good handle on how non-conscious physical systems behave (engines, circuits, planets, whatever). But we don't have any widely accepted physical theory that derives subjective experience from physical laws. We don't know which physical/computational structures (if any) are necessary and sufficient for consciousness.

You are assuming, without any evidence, that current physics plus "it's all computation" already gives a complete ontology of mind. So what is consciousness? Define it with physics, show me the equations - you can't.

> It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.

We design transformer architectures, we set the training objectives, we can inspect every weight and activation of an LLM. Yet even with all that access, tens of thousands of ML PhDs, and years of work, we still don't fully understand why these models generalize the way they do, why they develop certain internal representations, and how exactly particular concepts are encoded and combined.

If we struggle to interpret a ~10^11-parameter transformer whose every bit we can log and replay, it's real hubris to act like we've got a 10^14-10^15-synapse, constantly rewiring, developmentally shaped biological network figured out to the point of confidently saying "we know there's nothing more to mind than this, case closed lol".

Our ability to observe and manipulate the brain is currently far weaker than our ability to inspect artificial nets, and even those are not truly understood in a deep, mechanistic, explanatory sense.

> Your mind is the state of your brain as it processes information.

Ok, but then you have a problem: if anything that processes information is a computer, and mind is "just computation", then which computations are conscious?

Is my laptop conscious when it runs a big simulation? Is a weather model conscious? Are all supercomputers conscious by default just because they flip bits at scale?

If you say yes, you've gone to an extreme pancomputationalism that most people (including most physicalists) find extremely implausible.

If you say no, then you owe a non-hand-wavy criterion: what's the principled difference, in purely physical/computational terms, between a conscious system (human brain) and a non-conscious but still massively computational system (weather simulation, supercomputer cluster)? That criterion is exactly the kind of thing we don't have yet.

So saying "it’s just computation" without specifying which computations and why they give rise to a first person point of view leaves the fundamental question unanswered.

And one more thing: your gasoline analogy is misleading. Combustion never presented a "hard problem of combustion" in the sense of a first-person, irreducible qualitative aspect. People had wrong physical theories, but once chemistry was in place, everything was observable from the outside.

Consciousness is different, you can know all the physical facts about a brain state and still not obviously see why it should feel like anything at all from the inside.

That's why even hardcore physicalist philosophers talk about the "explanatory gap". Whether or not you think it's ultimately bridgeable, it's not honest to say the gap is already closed and the scientific explanation is "sufficient".

  • Well said.

    We can stimulate a nerve and create the experience of pain, but stimulating the nerve does not create the memory of pain.

    Nerves triggering sensations I can understand. But stimulating the same nerves, or creating the same electrical activations, does not create the memory of the pain.

    • I don't know why you single out memories.

      Certainly, if some mad scientist were to stimulate some parts of your brain via an electrode to make you experience pain, you would remember it. Also, it's not unreasonable to assume that it would be equally feasible to create fake memories by stimulating other parts.

      If there is a hard problem, memory is not it.