Comment by search_facility
15 hours ago
I cannot simulate my brain; it's a huge stretch to imply that I can.
But with LLMs, anyone can simulate one. An LLM can be simulated without any uncertainty with pen and paper, given a lot of time. Does that mean 100 tons of paper plus 100 years of time (numbers are just examples) spent calculating long formulae makes this pile of paper conscious? IMHO, the answer is a definitive no.
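The "without any uncertainty" part holds in a narrow technical sense: a forward pass is just fixed arithmetic, which could in principle be carried out by hand. A minimal sketch with a toy one-layer "model" (the weights are made up for illustration, not from any real LLM):

```python
import math

# Hypothetical fixed weights for a toy 3-input, 2-output "model".
W = [[0.2, -0.1, 0.4],
     [0.0,  0.3, -0.2]]

def forward(x):
    # Matrix-vector product: logits[i] = sum_j W[i][j] * x[j]
    logits = [sum(w * xi for w, xi in zip(row, x)) for row in W]
    # Softmax turns logits into "next-token" probabilities.
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Same input, same output, every time -- pure deterministic arithmetic.
print(forward([1.0, 0.5, -1.0]) == forward([1.0, 0.5, -1.0]))  # True
```

Every operation here is ordinary multiplication, addition, and exponentiation; a real LLM differs only in scale, which is exactly the pen-and-paper point above.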
I don’t think anyone is arguing the silicon is conscious.
The same goes for the paper.
What about the agent doing the calculations?
They may be conscious. Or at any rate, we can't rule it out.
In both cases, the agent doing the work is a human, and human beings are indeed conscious. Outside of human needs, LLMs are useless.
Math, as a tool, is just a proxy for the people using LLMs, just as the GPUs spending cycles on calculating that math are.