Comment by sailingparrot

6 months ago

I don’t see how your description “clearly fails to capture the fact that we're conscious”, though. There are many examples in nature of emergent phenomena that would be very hard to predict just by looking at their components.

This is the crux of the disagreement between those who believe AGI is possible and those who don’t. Some are convinced that we are “obviously” more than the sum of our parts, and thus that an LLM can’t achieve consciousness because it’s missing this magic ingredient; others believe consciousness is just an emergent behaviour of a complex device (the brain), and thus that we might be able to recreate it simply by scaling up the complexity of another system.

Where exactly in my description do I invoke consciousness?

Where does the description given imply that consciousness is required in any way?

The fact that there's a non-obvious emergent phenomenon which is apparently responsible for your subjective experience, and that it's possible to provide a superficially accurate description of you as a system without referencing that phenomenon in any way, is my entire point. The fact that we can provide such a reductive description of LLMs without referencing consciousness has literally no bearing on whether or not they're conscious.

To be clear, I'm not making a claim as to whether they are or aren't, I'm simply pointing out that the argument in the article is fallacious.

  • My bad, we are saying the same thing. I misinterpreted your last sentence as saying that the simplistic view of the brain you described does not account for consciousness.

    • Ultimately my bad for letting my original comment turn into a word salad. Glad we've ended up on the same page though.