Comment by dpark

9 hours ago

> What I am trying to say is that we can only agree something is conscious, and only if it's working on the same principles a human brain does, closely. It's an agreement, not proof, not definitions. We collectively start accepting it, without KNOWING. And the safest way to do that is on something which is working exactly like a human brain. Anything else we can only lose certainty.

This means that "consciousness" is simply a synonym for "human".

By that "agreement", sure, a machine cannot be conscious. But I don't think that's what most people mean when they ask whether an LLM could be conscious, because of course it's not human. So they must be asking something more interesting.