
Comment by ninetyninenine

14 days ago

Right, but if the horse turns around and starts talking to your wife and doing quantum mechanics, are you going to refuse to believe it because it’s not consistent with your old reality?

There are people who agree with you, for sure. But none of these people are experts or work deeply on this stuff in academia. The overwhelming majority of people who truly know this stuff and are on the cutting edge do not agree with you.

The people who do agree with you are, at best, applied ML engineers, but mostly they are armchair experts who don’t work deeply with this stuff.

> Right, but if the horse turns around and starts talking to your wife and doing quantum mechanics, are you going to refuse to believe it because it’s not consistent with your old reality?

I will, first, tend to presume it's a David Blaine-style illusion, or that I took the wrong gummies. Because that's the most logical, rational, and likely explanation. The crazy explanation - a talking horse with a Ph.D. - requires deeply solid evidence. (And more than one well-qualified person agreeing that they're seeing it, too!)

> There are people who agree with you, for sure. But none of these people are experts or work deeply on this stuff in academia. The overwhelming majority of people who truly know this stuff and are on the cutting edge do not agree with you.

Go on. Which deep academic experts are saying LLMs are conscious?

  • > Go on. Which deep academic experts are saying LLMs are conscious?

    None. But they all claim they don’t understand it.

    Look at yourself. When did I claim the LLM is conscious? Never did. I said it’s a possibility but mostly we don’t know wth is going on. Your statement here was not an intentional lie, but it was incorrect and made up. Aka a hallucination, which makes you not much different from an LLM.

    > I will, first, tend to presume it's a David Blaine-style illusion, or that I took the wrong gummies. Because that's the most logical, rational, and likely explanation. The crazy explanation - a talking horse with a Ph.D. - requires deeply solid evidence. (And more than one well-qualified person agreeing that they're seeing it, too!)

    Let’s say the horse was in actuality doing quantum mechanics. We looked at it from every angle, and it’s actually writing down equations that are sometimes wrong but often correct, and often both correct and novel. We have no way of reading the horse’s mind, but every person on the face of the earth, horse psychologists included, is seeing the same thing. Yet nobody knows how the horse does it, because we simply can’t read the horse’s mind.

    Then you come along. Genius that you are. And say the horse is just a stochastic parrot! The writing is an illusion and the horse doesn’t really understand anything! You have no evidence to show for it, but you make that steadfast claim anyway. The horse keeps writing working equations that you can’t explain, yet you say it doesn’t know quantum mechanics.

    That’s what you are. You’re making a claim, but you have no evidence for the claim. Similar to religion. You simply decide to believe in something and then use endless analogies in your arguments, not realizing that analogies aren’t evidence for anything.

    • > None. But they all claim they don’t understand it.

      Do any claim it is likely that LLMs are conscious? Or do they agree with me?

      > Look at yourself. When did I claim the LLM is conscious? Never did. I said it’s a possibility but mostly we don’t know wth is going on.

      Look at yourself. When did I claim it’s impossible? Never did. I said it’s unlikely.

      > Let’s say the horse was in actuality doing quantum mechanics.

      But we aren’t at that point with LLMs. Hence, I say it’s unlikely. Not impossible.

      You’re so wound up you’re projecting and doing exactly what you falsely accuse others of doing.
