Comment by pedalpete
5 days ago
I believe that training a system to understand the electrical signals that define a movement is significantly different from a system that understands thought.
I work in neurotech, I don't believe that the electrical signals of the brain define thought or memory.
When humans understood hydrodynamics, we applied that understanding to the body and thought we had it all figured out. The heart pumped blood, which brought nutrients to the organs, etc.
When humans discovered electricity, we slapped ourselves on the forehead and exclaimed "of course!! it's electric" and we have now applied that understanding on top of our previous understanding.
But we still don't know what consciousness or thought is, and the idea that it is a bunch of electrical impulses is not quite proven.
There is electrical firing of neurons, absolutely, but does it directly define thought?
I'm happy to say we don't know, and that "mind-reading" devices are as yet unproven.
A few start-ups are doing things like showing people images while reading brain activity, then trying to understand which areas of the brain "light up" for certain images, but I think this path will prove fruitless for understanding thought and how the mind works.
Agree completely. The brain is so incredibly complex that we've barely scratched the surface. It's not just neurons, which are themselves very complex and vary wildly in genetics from one to the next - it's hundreds of other helper cell types all interacting with each other in sometimes bizarre ways.
To try to boil it all down to any simple signal is just never going to work. If we want to map consciousness, it's going to be as complex as simulating it ourselves, creating something as dense and detailed as a real brain.
I don't think it's anything other than electrical activity, but it's clearly not "some electrical signal". It's the totality of them. They are many, and complicated. And they seem to be required for consciousness. I doubt there's any proven case of a conscious state in a human lacking electrical activity in the brain.
Re: just electrical activity - I think you can add empirical evidence that it's chemical as well, since beer and other substances can affect your perception of reality.
We know that the brain is a structure that works through electrochemical reactions. Synapses transmit signals sent by axons to neurons. We can test this. We can measure it. There's nothing else going on that we can describe using known science.
Ah, we might say, maybe there is an unknown science - we did not know about so much before, like electricity, like X-rays, like quantum physics, and then we did, and the world changed.
The difference is that we observed something that science could not explain, and then we found the new science that explained it, and a new science was born.
It's pretty clear to me - but you may know more - that we can explain all brain activity through known science. It might be hard to think of us as nothing more than a bunch of electrochemical reactions in a real-world reinforcement learning system, but that's what we are: there's no gap that needs new science, is there?
Scalp-recorded EEG does not measure action potentials, it can only measure the graded potentials of basically one type of neuron (pyramidal cells) in the cortex, which is a really tiny percentage of both neurons and electrical activity in the brain. Additionally, there is also the various roles neurotransmitters play in the brain, etc., and glial cells seem to also play an important role. So, it’s definitely not the case that there aren’t any gaps that need new science, and even if there weren’t, it’s a pretty big stretch from there to decoding all brain activity solely through the electrical component.
You're probably right, but that doesn't mean GP is wrong, just that they need to state their thesis more carefully. There's more science to be done there, but there's no reason to believe that new fundamental laws of physics are required to explain the brain.
It seems neatly organized to say "that we can explain all brain activity" while not actually bounding what counts as "brain activity." I think prior to recent research [1], people would have concluded that memory was solely the domain of the brain. But the fact that sense/setting/environment allowed Clive Wearing to circumvent his amnesia and access skills otherwise unavailable to his conscious mind [2] should raise questions about that understanding.
[1] https://www.nyu.edu/about/news-publications/news/2024/novemb...
[2] https://en.wikipedia.org/wiki/Clive_Wearing
Clive Wearing is some SCP-level nightmare fuel but at least he apparently isn't in constant distress each time.
No, none of this is settled. We cannot adequately explain brain function with current science.
There have been studies this year implying that some brain functions rely on quantum interactions.
> There have been studies this year implying that some brain functions rely on quantum interactions.
I'm not in the camp of "the brain is solved science", but come on. All of chemistry depends on quantum interactions and that didn't stop us from understanding a lot of it. It just means we're solving a different set of equations.
It's a big leap to go from "brain electrochemistry is described by quantum mechanics" to "the essence of consciousness / human soul is hiding in the quantum realm and therefore can't be measured or replicated".
> We know that the brain is a structure that works through electrochemical reactions. Synapses transmit signals sent by axons to neurons. We can test this. We can measure it. There's nothing else going on that we can describe using known science
But what we can describe using known science doesn't describe the system. That doesn't mean the vacuum is voodoo. It's just a strong hint something more is going on. (Like the photoelectric effect.)
We know more about dark energy and matter than the dark essence that separates our leading electrochemical models from consciousness.
Can we? We can only see whatever we can measure with the tools we currently have, which are based on the knowledge we currently have. Who's to say there isn't something out there we haven't discovered yet? There's more than enough we still don't understand in many domains of science
> Who's to say there isn't something out there we haven't discovered yet
Occam's razor? We should work with as few assumptions as possible to get a model with the largest scope. Otherwise we get stuck with a hard to falsify mess.
considering our current physics can explain the vast majority of physical interactions in the universe, it just seems unlikely that there is some new fundamental force in the universe that helps explain how the brain works yet has no perceivable effect on the rest of the physical material of the universe.
I think there is new science we need first. The brain very likely uses quantum processes. We don't understand quantum mechanics yet.
> There is electrical firing of neurons, absolutely, but does it directly define thought?
Well, surgeons and researchers have shown that electrical stimulation of certain brain regions can induce "perception" during procedures. They can make a patient have the conscious experience of certain smells, for instance.
It's not conclusive proof of anything, but I wouldn't bet against us getting closer to the mark than we were when we only considered hydrodynamics as the model.
> surgeons and researchers have shown that electrical stimulation of certain brain regions can induce "perception" during procedures
I can carefully drop liquid reactants on a storage medium and induce nontrivial and reproducible changes in any computer reading it. That doesn't tell me how digital storage works, it just says I'm proximate to the process.
Indeed, that's why I said it wasn't conclusive proof of anything. But it also seems like a bit of wishful thinking, that it isn't more than proximate. There seems to be a strong psychological need for people to feel like more than their biology. We have centuries of faith and myth that put us as central characters, and heaven-bound spirits.
It goes far beyond smells, in ways I find deeply unsettling
We can induce religious experience, see "The God Helmet"
https://en.wikipedia.org/wiki/God_helmet
or deep depression & suicidal thoughts
https://www.nejm.org/doi/full/10.1056/NEJM199905133401905
The Wikipedia page seems to show that it's just a placebo/scam? The helmet has the equivalent magnetic effect of a fridge magnet or hair dryer, and researchers in Sweden replicating the research in a double-blind study found no effect at all. Looking at the pictures on the official website https://www.god-helmet.com/wp/god-helmet/index.htm , it's just magnets on a snowmobile helmet this guy bought.
That god helmet article seems to thoroughly debunk it as snake oil quackery, which is indeed deeply disturbing
Thhe "God helmet" was likely a placebo device. From the very same Wikipedia article you linked:
> Other groups have subsequently found that individual differences such as strong belief in the paranormal and magical ideation predict some alterations in consciousness and reported "exceptional experiences" when Persinger et al's experimental set-up and procedure are reproduced, but with a sham "God helmet" that is completely inert or a helmet that is turned off.
> I don't believe that the electrical signals of the brain define thought or memory.
Yes and no. It'll be something like a JPEG file. You can have a JPEG file that contains an image of a cat. But give that file to someone who has no clue about JPEG encoding and the file looks like random noise. They'll take 100 years to figure out it's an image of a cat.
Actually, it's more like taking an electron beam prober to one of the Nvidia AI GPU chips while it's figuring out whether it likes Wordsworth's poetry.
You say you don’t believe something is true and then say you don’t know, but I’ll disagree with the claim that electrical signals don’t “define” (encode) thoughts.
To be clear, of course it’s true that our thoughts are more than just electrical activity. The brain is a system. However, it seems clear that thoughts are at least partially encoded in electrical activity.
As for what you said those startups will find fruitless: that’s already been done for years in a research setting. It may not be a successful business model, but it’s already been demonstrated.
There are fMRI studies and electrical measurement studies. You could argue that fMRI decoding of images is not electrical activity, which is true, but a bunch of work shows they are strongly correlated.
For electrical activity alone we’re already decoding information like words, so it’s hard to claim electrical activity doesn’t define thoughts.
Maybe you mean to say it doesn’t define all the content of our thoughts, which is a much different claim.
Well, if you are making the assertion, which you implicitly seem to be, you must first define thought. Is a word == a thought? And as for correlations, we all know the adage about correlation and causation. Not that I would make the counterargument that thought is not encoded by electrical signals, but I would bet you aren’t totally correct. Do you think there will be no future paradigm shifts?
I think we may not be disagreeing much…
Agree fMRI doesn’t tell us for sure what information is in electrical activity. I only mean to imply it’s suggestive.
But we still can get a lot of information from electrodes. Do words count as thoughts? I’d say so.
Definitely don’t think electrical signals are all there is to thought.
They’re definitely part of it, but I think time will show that, unlike with computer parts, it’s hard to make a clean separation of responsibilities in the brain.
For me it's like attaching wires to a CPU and trying to decipher what YouTube video is playing right now.
Absolutely not possible.
That is such a great analogy!
Another I heard is that measuring EEG is like standing outside a stadium during a match and listening to the roar of the crowd.
Reading thoughts through EEG is like standing outside the stadium, listening for the roar of the crowd, and based on what you hear, knowing what the umpire's mother-in-law had for breakfast.
One thing is probably true: you have to train on the individual person, and it’s not transferable to a different person. Similar to how, if you take an LLM and train a probe on the fluctuations of its neural network to “read its thoughts”, the results won’t transfer to interpreting the semantic contents of a different LLM’s network activity.
So you probably can’t build a universal mind-reading device.
You can't build a universal mind-reading device that doesn't require calibration.
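To see why calibration is unavoidable, here's a toy numpy sketch (entirely synthetic data; the "subjects" and mixing matrices are made up for illustration): two simulated subjects share the same latent features, but each mixes them into sensor channels through a different unknown matrix, so a linear decoder fit on one subject works on new trials from that subject and falls apart on the other.

```python
import numpy as np

rng = np.random.default_rng(0)
n_latent, n_channels, n_trials = 5, 32, 400

# Shared latent "thought" features (the same for both subjects).
Z_train = rng.normal(size=(n_trials, n_latent))
Z_test = rng.normal(size=(n_trials, n_latent))

# Each subject mixes the latents into channels through a different, unknown matrix.
mix_a = rng.normal(size=(n_latent, n_channels))
mix_b = rng.normal(size=(n_latent, n_channels))
noise = 0.1

X_a_train = Z_train @ mix_a + noise * rng.normal(size=(n_trials, n_channels))
X_a_test = Z_test @ mix_a + noise * rng.normal(size=(n_trials, n_channels))
X_b_test = Z_test @ mix_b + noise * rng.normal(size=(n_trials, n_channels))

# Least-squares linear decoder fit on subject A only.
W, *_ = np.linalg.lstsq(X_a_train, Z_train, rcond=None)

def r2(X, Z):
    # Fraction of latent variance the decoder recovers.
    resid = Z - X @ W
    return 1 - resid.var() / Z.var()

print(f"same subject, new trials: R^2 = {r2(X_a_test, Z_test):.2f}")  # close to 1
print(f"different subject:        R^2 = {r2(X_b_test, Z_test):.2f}")  # near zero or worse
```

Recalibrating just means refitting W on a little data from the new subject: the decoder architecture transfers, the weights don't.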
And when you can build one, you will also get telepathy, telekinesis, clairaudience and clairvoyance. I can't wait until I can send a directed thought to someone else.
This sounds logical and convincing.
At the same time, it should also be easy to falsify.
Has there been an experimental setup like this tested? If I’m not mistaken it should falsify your claim.
Train a decoder on rich neural recordings, then test it on entirely new thoughts chosen under blinded conditions.
If it can still recover the precise unseen content from signals alone, the claim that electrical activity is insufficient is overturned.
> Train a decoder on rich neural recordings, then test it on entirely new thoughts chosen under blinded conditions.
There have been enough studies about this, and the result is mostly the same: it's difficult to nearly impossible to reliably decode neural recordings that differ from the distribution of neural recordings the decoder was trained on. There are a lot of reasons why this happens; electrical activity being insufficient is not one of them.
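A toy illustration of that out-of-distribution failure mode (synthetic features and made-up concept labels, not real recordings): a decoder trained on a fixed set of concepts can only ever answer with one of them, so a genuinely novel "thought" just gets forced into the nearest trained label.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 16

# Pretend each trained concept has a characteristic "neural" feature vector.
centroids = {name: rng.normal(size=dim) for name in ("cat", "dog", "car")}

def record(concept, n=50):
    # Noisy recordings of a known concept.
    return centroids[concept] + 0.3 * rng.normal(size=(n, dim))

def decode(x):
    # Nearest-centroid decoder: it can only ever answer with a trained label.
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# In-distribution decoding works well.
hits = sum(decode(x) == "dog" for x in record("dog"))
print(f"in-distribution accuracy: {hits}/50")

# A concept that was never in the training set still gets *some* trained label.
novel = rng.normal(size=dim) + 0.3 * rng.normal(size=dim)
print(f"novel concept decoded as: {decode(novel)!r}")
```

The decoder's answers look confident either way; nothing in its output distinguishes "I recognized this" from "this was nothing like my training data".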
It seems like trying to take a single-pixel signal (so to speak) and interpolate the entire image out of it.
I thought it was pretty well established by now that other parts of the body likely participate in both memory and thought - a fully distributed system?
Does it make sense to think of thoughts, consciousness etc. as an emergent property of the neuronal activity in our brains?
This is silly. It's the sum of electrical and chemical network activity in the brain. There's nothing else it can be. We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.
Your mind is the state of your brain as it processes information. It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.
There's no scientific evidence that anything more is needed to explain everything the mind and brain does. Electrical and chemical signaling activity is sufficient. We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain. The scale of our experiments has been gross, only able to read and write from large numbers of neurons, but all the evidence is consistent.
There's not a single rigorously documented phenomenon, experiment, or any data in existence that suggests anything more than electrical and chemical signaling is needed to explain the full and wonderful and awe-inspiring phenomenon of the human mind.
It's the brain. We are self-constructing software running on 2 lb chunks of fancy electric meat stored in a bone vat, with a sophisticated network of sensors and actuators in a wonderful biomechanical mobility platform that empowers us to interact with the world.
It explains consciousness, intelligence, qualia, and every other facet and nuance of the phenomena of mind - there's no need to tack on other explanations. It'd be like insisting that gasoline also requires the rage of fire spirits in order to ignite and power combustion engines - once you get to the point of understanding chemical combustion and expansion of gases and transfer of force, you don't need the fire spirits. They don't bring anything to the table. The scientific explanation is sufficient.
Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle. We don't need fire spirits animating cortical stacks, or phlogiston or ether or spirit.
Could spirit exist as a distinct, separate phenomenon? Sure. It's not intrinsic to subjective experience, consciousness, and biological intelligence, though, and we should use tools of rational thinking when approaching these subjects, because a whole lot of pseudo-scientific BS gets passed as legitimate scientific and philosophical discourse without having any firm grounding in reality.
We are brains in bone vats - nothing says otherwise. Unless or until there's evidence to the contrary, let that be enough.
I think you misunderstood the person you're responding to. They did not say there was some higher force beyond the physical pieces.
What they're saying is that the brain is really really complicated and our understanding of biology is far too rudimentary right now to be saying "yes, absolutely, 100% sure that we know the nature of consciousness from this one measurement of one type of signal".
* Neurons are very complex and all have unique mutations from one another
* Hundreds of other types of cells in the brain interact with them and each other in ways we don't understand
* The various other parts of the body chemically interact with the brain in ways we don't understand yet, like the gut microbiome
Trying to flatten all of consciousness to one measurement is just not sufficient. It's like trying to simulate the entire planet as a perfect sphere of uniform density. That works OK for some things but falls apart for more complex questions.
I get that, but there's no need to complicate things unnecessarily.
I'll make an even stronger claim, that biological brains are not only computers, but that they operate in binary, as well. Active and inactive - the mechanisms that trigger activation are incredibly nuanced and sophisticated, but the transfer of information through the network of biological neurons is a matter of zeroes and ones. A signal happens, or doesn't. Intensity, from a qualia perspective, ends up being a matter of frequency and spread, as opposed to level of stimulation. That, in conjunction with all sorts of models of brain function, is allowing neuroscience to make steady, plodding progress in determining the function and behavior of different neurons, networks, clusters, and types in the context in which they are found.
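The "binary spikes, intensity as frequency" idea can be cartooned in a few lines (a toy rate-code sketch with made-up numbers, nothing biophysically realistic): each millisecond bin either fires or it doesn't, and a downstream reader recovers intensity only by counting spikes over a window, never from spike amplitude.

```python
import numpy as np

rng = np.random.default_rng(2)

def spike_train(intensity, duration_ms=1000, max_rate_hz=100):
    # Encode an intensity in [0, 1] as a binary train: each 1 ms bin fires or doesn't.
    p_spike = intensity * max_rate_hz / 1000.0  # per-bin firing probability
    return (rng.random(duration_ms) < p_spike).astype(int)

def decode_intensity(spikes, max_rate_hz=100):
    # Decode by counting spikes in the window (a rate code), not by amplitude.
    rate_hz = spikes.sum() / (len(spikes) / 1000.0)
    return rate_hz / max_rate_hz

weak, strong = spike_train(0.2), spike_train(0.9)
print(f"decoded weak:   {decode_intensity(weak):.2f}")
print(f"decoded strong: {decode_intensity(strong):.2f}")
```

Every value in the trains is 0 or 1; "stronger" stimuli only differ in how often the 1s occur.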
All else being equal, at the rate neuroscience is proceeding, we should be able to precisely simulate a human brain, in functionally real-time, using real brain networks as models, by around 2040. We should have a handle on every facet of brain chemistry, networking, electrical signaling, and individual neuronal behavior based on a comprehensive and total taxonomy of feature types down to the molecular level.
Figure out the underlying algorithms and you can migrate those functional structures to pure code. If you can run a mind on code, then it doesn't matter whether you're executing a sequence of computations in a meat brain, in a silicon chip, or using a billion genetically engineered notebook monkeys to painstakingly and tediously do the computations and information transfer manually, passing sheets of paper between them. (The monkeys, of course, could not operate in real time.)
There won't be another significant phase change, like we saw from hydraulics to computation equivalence. Computation is what it actually, physically is, at the level of electrical signals and molecular behaviors. It's just extremely complex and sophisticated and elegantly interwoven with the rest of the human organism.
Brain-gut interactions aren't necessary for human subjective experience or cognition. You could remove your brain entirely from your skull, while maintaining an equivalent level of electrical and chemical signaling from an entirely artificial platform of some sort, and as long as the interface between the biological and synthetic maintains the same signaling frequency, chemistry, and connectivity, then it doesn't matter what's on the synthetic end.
There are independently intelligent aspects to things like the gut biome, and other complex biological systems. Those aren't necessary for brains to do what brains do, except in a supportive role. Decouple the nutrition and evolutionary drives from the mind, and you're left with a fairly small chunk of brain - something like 5B neocortical neurons is the bare minimum of what you'd need to get human level intelligence. Everything on top of that is nice to have, but not strictly necessary from a proof of concept perspective.
There’s nothing in known physics that explains consciousness. I agree about the rest, but consciousness not only defies explanation by known physics, it’s so far beyond what’s known that there isn’t even any concept of what it could be. We barely have the ability to describe it, let alone explain it.
What's special about consciousness? I assume it is just a sequence of thoughts/images inside the brain and that's all? Like when a cat sees cheese and an image of its taste appears instantly and motivates it to come closer. Humans are the same, I think.
Consciousness is very interesting because if you postulate that you can't possibly create it by running a Turing machine, then anything that is simulatable can't be the mechanism behind it. Which would raise the followup question, what is? My money is on some quantum effect.
But is consciousness even a thing? Shouldn't the burden of proof be on the one making a claim that there exists something called consciousness? If they cannot show evidence that such a thing exists or may exist, then for all purposes it does not exist.
> Neocortical networks, with thalamic and hippocampal system integrations, are sufficient to explain the entirety of human experience, in principle.
Where did you get that? That's not an established scientific result; it's a philosophical stance (strong physicalist functionalism) expressed as if it were empirical fact. We cannot simulate a full human brain at the correct level of detail, we cannot record every spike and synaptic change in a living human brain, and we do not have a theory that predicts which neural organizations are conscious just from first principles of physics and network topology.
> We can induce emotions, sights, sounds, smells, memories, moods, pleasure, pain, and anything you can experience through targeted stimulation of neurons in the brain
That shows dependence of experience on brain activity, but dependence is not the same thing as reduction or explanation. We know certain neural patterns correlate with pain, color vision, memories, etc., and we can causally influence experience by interacting with the brain.
But why is any of this electrical/chemical stuff accompanied by subjective experience, instead of it all just being a complex zombie machine? The ability to toggle experiences by toggling neurons shows connection, and that's it; it doesn't explain anything.
> We've got a good enough handle on physics to know that it's not some weird quantum thing, it's not picking up radio signals from some other dimension, and it's not some sort of spirit or mystical phlogiston.
We do have a good handle on how non-conscious physical systems behave (engines, circuits, planets, whatever). But we don't have any widely accepted physical theory that derives subjective experience from physical laws. We don't know which physical/computational structures (if any) are sufficient and necessary for consciousness.
You are assuming, without any evidence, that current physics plus "it's all computation" already gives a complete ontology of mind. So what is consciousness? Define it with physics, show me the equations; you can't.
> It's a computer, in the sense that anything that processes information is a computer. It's not much like silicon chips or the synthetic computers we build, as far as specific implementation details go.
We design transformer architectures, we set the training objectives, and we can inspect every weight and activation of an LLM. Yet even with all that access, tens of thousands of ML PhDs, and years of work, we still don't fully understand why these models generalize the way they do, why they develop certain internal representations, and how exactly particular concepts are encoded and combined.
If we struggle to interpret a ~10^11-parameter transformer whose every bit we can log and replay, it's real hubris to act like we've got a 10^14-10^15-synapse, constantly rewiring, developmentally shaped biological network figured out, to the point of confidently saying "we know there's nothing more to mind than this, case closed lol".
Our ability to observe and manipulate the brain is currently far weaker than our ability to inspect artificial nets, and even those are not truly understood in a deep, mechanistic, concept-level explanatory sense.
> Your mind is the state of your brain as it processes information.
OK, but then you have a problem: if anything that processes information is a computer, and mind is "just computation", then which computations are conscious?
Is my laptop conscious when it runs a big simulation? Is a weather model conscious? Are all supercomputers conscious by default just because they flip bits at scale?
If you say yes, you've gone to an extreme pancomputationalism that most people (including most physicalists) find extremely implausible.
If you say no, then you owe a non-hand-wavy criterion: what's the principled difference, in purely physical/computational terms, between a conscious system (a human brain) and a non-conscious but still massively computational system (a weather simulation, a supercomputer cluster)? That criterion is exactly the kind of thing we don't have yet.
So saying "it’s just computation" without specifying which computations and why they give rise to a first person point of view leaves the fundamental question unanswered.
And one more thing: your gasoline analogy is misleading. Combustion never presented a "hard problem of combustion" in the sense of a first-person, irreducible qualitative aspect. People had wrong physical theories, but once chemistry was in place, everything was observable from the outside.
Consciousness is different, you can know all the physical facts about a brain state and still not obviously see why it should feel like anything at all from the inside.
That's why even hardcore physicalist philosophers talk about the "explanatory gap". Whether or not you think it's ultimately bridgeable, it's not honest to say the gap is already closed and the scientific explanation is "sufficient".
Well said.
We can stimulate a nerve and create the experience of pain, but stimulating the nerve does not create the memory of pain.
Nerves triggering sensations I can understand. But stimulating the same nerves, or creating the same electrical activations, does not recreate the memory of the pain.