Comment by MostlyStable

15 days ago

I'm not the one making claims; I'm specifically advising against making them. The claim I'm advising against is that LLMs are definitely, absolutely, in no way conscious. Seeing something that, from the outside, appears a lot like a conscious mind (to the extent that LLMs pass the Turing test easily), and then confidently declaring that it is not what it appears to be, is itself a claim, and in my opinion it requires extraordinary evidence.

I'm advising agnosticism. We don't understand consciousness, so we shouldn't feel confident pronouncing anything absolutely not conscious.