Comment by Cu3PO42

1 day ago

Clearly an LLM is not conscious; after all, it's just glorified matrix multiplication, right?

Now let me play devil's advocate for just a second. Let's say humanity figures out how to do whole-brain simulation. If we could run copies of people's consciousness on a cluster, I would have a hard time arguing that those 'programs' wouldn't process emotion the same way we do.

Now I'm not saying LLMs are there, but I am saying there may be a line somewhere, and it seems impossible to tell where it lies.

Processing them the same way is of course different from feeling them. You'd need a whole-body simulation for that. Your feelings aren't all neurological.

And likewise, a single neuron is clearly not conscious.

I'm increasingly convinced that intelligence (and maybe some form of consciousness?) is an emergent property of sufficiently large systems. But that's a can of worms. Is an ant colony (as a system) conscious? Does the colony as a whole deserve more rights than the individual ants?