
Comment by quonn

6 months ago

> LLMs are not conscious because unlike human brains they don't learn or adapt (yet).

That's neither a necessary nor sufficient condition.

Learning may not be needed for consciousness, but a perception of the passing of time may be, and that in turn may require some short-term memory. People with severe dementia often cannot remember even the start of a sentence they are reading; they cannot learn, but they are certainly conscious, because they retain just enough short-term memory.

And learning is not sufficient either. Consciousness is about being a subject, about having a subjective experience of "being there", and learning by itself does not create that experience. Plenty of software can do some form of real-time learning, yet it has no subjective experience.

Note that "what is consciousness?" is still very much an unsettled debate.

  • But nobody would dispute my basic definition: consciousness is the subjective feeling or perception of being in the world.

    There are unsettled questions, but that definition holds regardless.