
Comment by belter

14 days ago

> whether or not an LLM is conscious in the same way as a human being

The problem is... that there is a whole range of "smart" activities humans perform without being conscious of them.

- Walking, riding a bike, or typing on a keyboard happen fluidly without conscious planning of each muscle movement.

- You can finish someone's sentence or detect that a sentence is grammatically wrong, often without being able to explain the rule.

- When you enter a room, your brain rapidly identifies faces, furniture, and objects without you consciously thinking, “That is a table,” or “That is John.”

Indeed, the "rider and elephant" issue.

During Covid I gave a lecture on Python over Zoom in a non-English language. It was a beginner's topic about dictionary methods. I was attempting to multitask and had other unrelated tasks open on a second computer.

Midway through the lecture I noticed to my horror that I had switched to English without the audience noticing.

Going back through the recording, I noticed the switch was fluid and my delivery was reasonable. What I talked about was just as good as something presented by an LLM these days.

So this raises the question: why aren't we p-zombies all the time, instead of only 99% of the time?

Are there any tasks that absolutely demand human consciousness as we know it?

Presumably long-term planning is something for which active human consciousness is needed.

Perhaps there is also some need for consciousness when one is in the "conscious mastery" phase of acquiring a skill.

This goes for any skill: riding a bicycle, playing chess, or programming at a high level.

Once one reaches the "unconscious mastery" stage, the rider can concentrate on the higher-level metagame.