Comment by shinycode
7 hours ago
The same kind of proof we accept for any scientific claim: converging, reproducible evidence that rules out competing explanations.
Concretely, we already have indirect evidence: conscious states vary predictably with brain states. Damage specific regions, lose specific functions. Alter chemistry, alter experience. This is not proof, but it is systematic dependence, which is exactly what emergence predicts. Stronger evidence would look like precise, bidirectional mappings between neural activity and reported experience, to the point where you could reliably read subjective states from brain data, or induce specific experiences through targeted stimulation. We’re already moving in that direction.
The hardest bar would be building a system from physical components, having it report coherent subjective experience, and being able to explain why that configuration produces experience while others don’t. That’s the hard problem, and no, we’re not there yet. And it’s worth being honest: we’ve been assuming physicalism will eventually solve it, but there’s no guarantee that assumption is true rather than merely hopeful. The fact that brain states correlate with conscious states doesn’t explain why there is something it is like to have those states. Correlation is not mechanism.
But here’s the key point: you’re implicitly holding emergence to a standard of certainty that no scientific theory meets. We don’t have that standard of proof for evolution, gravity, or quantum mechanics either. We have overwhelming evidence that makes alternatives implausible.
So the question isn’t “can you prove it beyond all doubt?” It’s “does the evidence favor it over alternatives?” Right now, it does. But that’s a pragmatic verdict, not a metaphysical one. Idealist frameworks like Kastrup’s or Faggin’s remain serious contenders, and the debate is more open than mainstream science often admits.
> The hardest bar would be building a system from physical components, having it report coherent subjective experience
So like if I fine-tune an LLM in a loop to tell you that it is feeling a coherent subjective experience, would you accept that?
Does that mean that no dog has ever been conscious, because they cannot report a coherent subjective experience? (Because they can’t report anything at all. Being non-verbal.)
> you’re implicitly holding emergence to a standard of certainty that no scientific theory meets.
Wtf? I asked what kind of proof would you accept. How is that holding anyone to any kind of standard? Let alone one which is too high.
Yeah, you’re raising three good points and they all land. On the fine-tuned LLM: you’re right, that criterion was flawed. A system trained to report experience proves nothing about whether experience is present, which is actually the core of the hard problem. No behavioral output alone can confirm inner experience. That applies to LLMs, and technically to other humans too. On dogs, also a fair correction. We don’t actually require verbal report to attribute consciousness to animals; we use behavioral and physiological evidence. So "coherent verbal report" was too narrow.
Better criterion: a system whose overall architecture and behavior is consistent with experience, not just one that says the right words.
On the standard of proof: that was a rhetorical deflection and you’re right to call it out. You asked a genuine question and got it turned back on you. And you’re pointing at something real: in science, strong correlation is not accepted as proof when stricter evidence is achievable. The reason we settle for correlation here isn’t that it’s sufficient; it’s that subjective experience may make stronger proof structurally inaccessible. But it’s also worth noting that scientific consensus has a poor track record of admitting this honestly. Dominant paradigms tend to defend themselves long past the point where the cracks are visible, and physicalism on consciousness is no exception. The confidence with which emergence is presented often reflects institutional momentum as much as evidence.