Comment by ninetyninenine
14 days ago
> Go on. Which deep academic experts are saying LLMs are conscious?
None. But they all claim they don’t understand it.
Look at yourself. When did I claim the LLM is conscious? Never did. I said it’s a possibility but mostly we don’t know wth is going on. Your statement here was not an intentional lie, but it was incorrect and made up. A.k.a. a hallucination, which makes you not much different from an LLM.
> I will, first, tend to presume it's a David Blaine style illusion, or that I took the wrong gummies. Because that's the most logical, rational, and likely explanation. The crazy explanation - a talking horse with a Ph.D. - requires deeply solid evidence. (And more than one well-qualified person agreeing that they're seeing it, too!)
Let’s say the horse is actually doing quantum mechanics. We look from every angle and it really is writing down equations that are sometimes wrong, but often correct, and often both correct and novel. We have no way of reading the horse’s mind, but every person on the face of the earth, horse psychologists included, is seeing the same thing. Nobody knows how the horse does it, because we simply can’t read its mind.
Then you come along, genius that you are, and say the horse is just a stochastic parrot! The writing is an illusion and the horse doesn’t really understand anything! And you have no evidence to show for it, but you make that steadfast claim anyway. The horse continues to write working equations, you have no explanation for how, and yet you say it doesn’t know quantum mechanics.
That’s what you are. You’re making a claim, but you have no evidence for the claim. It’s similar to religion: you simply decide to believe in something, and then use endless analogies in your arguments without realizing that analogies aren’t evidence for anything.
> None. But they all claim they don’t understand it.
Do any claim it is likely that LLMs are conscious? Or do they agree with me?
> Look at yourself. When did I claim the LLM is conscious? Never did. I said it’s a possibility but mostly we don’t know wth is going on.
Look at yourself. When did I claim it’s impossible? Never did. I said it’s unlikely.
> Let’s say the horse is actually doing quantum mechanics.
But we aren’t at that point with LLMs. Hence, I say it’s unlikely. Not impossible.
You’re so wound up you’re projecting and doing exactly what you falsely accuse others of doing.
> Do any claim it is likely that LLMs are conscious? Or do they agree with me?
Overall, no claim was made by me or anyone that they are likely conscious. No claim was made that they are unconscious either. That is in line with my claim and in total agreement that we don’t know.
Your claim is that LLMs are extremely likely to be unconscious, and the answer to that claim is NO. The general sentiment is not in agreement with you on that. There is no firm consensus that we know either way.
> Look at yourself. When did I claim it’s impossible? Never did. I said it’s unlikely.
Did I say you said it’s impossible? I didn’t. More hallucinations.
> But we aren’t at that point with LLMs. Hence, I say it’s unlikely. Not impossible.
We are. The LLMs are displaying output and behavior consistent with that of conscious people. And we have zero insight as to why. That is the point we are at. There is zero evidence lending credence to either a low probability or a high probability that an LLM is conscious. But the LLM is outputting text that is indistinguishable from text produced by beings who ARE conscious.
> You’re so wound up you’re projecting and doing exactly what you falsely accuse others of doing.
No, I didn’t. You’re hallucinating this. I am 100 percent referring to your statement that there is a low probability that an LLM is conscious. My claim is that you have zero evidence to support that claim. There is no information or knowledge available for you to logically come to that conclusion.
> That is in line with my claim and in total agreement that we don’t know.
For certain? No. But that's a https://news.ycombinator.com/item?id=44652248
> The LLMs are displaying output and behavior consistent with that of conscious people.
Your failing the Turing test doesn't mean we all do.
> And we have zero insight as to why.
Sure we do. It's explicitly built to do that. It's supposed to be confusingly like a human, because it's a probability generator based on oodles of real human input.
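To make "probability generator" concrete, here's a minimal toy sketch of the generation loop. The vocabulary and probabilities are invented for illustration; this is not any real model's internals, just the shape of the mechanism:

    import random

    # Toy stand-in for a trained model: at each step it maps the text so far
    # to a probability distribution over possible next tokens. The numbers
    # here are made up for illustration.
    def next_token_probs(context):
        return {"a": 0.4, "trained": 0.3, "parrot": 0.2, "mind": 0.1}

    def generate(context, steps=5):
        # Sample one token at a time in proportion to its probability,
        # then feed the result back in. That loop is the whole trick.
        for _ in range(steps):
            probs = next_token_probs(context)
            token = random.choices(list(probs), weights=probs.values())[0]
            context += " " + token
        return context

    print(generate("The LLM is"))

The only difference in a real LLM is that the distribution comes from a network trained on oodles of human text, which is exactly why the output reads so confusingly human.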
> There is no information or knowledge available for you to logically come to that conclusion.
Sure there is. I've talked with LLMs. It's very apparent they aren't conscious. As with cooking, I don't have to be a Michelin chef to know a plate of poop tastes bad. I'd love to be wrong about them, just like I'd be happy to find poop surprisingly tasty. But I'm very, very comfortable with my position here until provided with very, very solid evidence to the contrary.
(To be clear: "very very solid evidence" is not a rando on the Internet pulling a widely-flagged HN Don Quixote.)