Comment by emp17344

14 days ago

>once the person making them is able to explain, in detail and with mechanisms, what it is the human brain does that allows it to do these things, and in what ways those detailed mechanisms are different from what LLMs do.

>Extraordinary claims require extraordinary evidence. The burden of proof is on you.

I'm not the one making claims. I'm specifically advising against making claims. The claim I'm advising against is that LLMs are definitely, absolutely, in no way conscious. When you see something that, from the outside, looks a lot like a conscious mind (to the extent that it easily passes the Turing test), and then confidently declare that it is not what it appears to be, that declaration is itself a claim, and one that, in my opinion, requires extraordinary evidence.

I'm advising agnosticism. We don't understand consciousness, so we shouldn't feel confident pronouncing anything absolutely not conscious.