Comment by antonvs

1 day ago

> * It's intelligent! *Except that it makes shit up sometimes*

How is this different from humans?

> * It's conscious! *Except it's not*

Probably true, but...

> and never will be

To make this claim you need a theory of consciousness that essentially denies materialism. Otherwise, if humans can be conscious, there doesn't seem to be any particular reason that a suitably organized machine couldn't be - it's just that we don't know exactly what might be involved in achieving that, at this point.

> How is this different from humans?

Humans will generally not do this, because being made to look stupid (aka social pressure) incentivizes not doing it. That doesn't mean humans never lie or are never wrong, of course, but I don't know about you - I don't make shit up nearly to the degree an LLM does. If I don't know something, I just say so.

> To make this claim you need a theory of consciousness that essentially denies materialism.

I did not say "a machine would never be conscious," I said "an LLM will never be conscious," and I fully stand by that. I think machine intelligence is absolutely something that can be built; I just don't think ChatGPT will ever be that.

  • > I don't know about you, I don't make shit up nearly to the degree an LLM does. If I don't know something I just say that.

    We're a sample of two, though. Look around you, read the news, etc. Humans make a lot of shit up. When you're dealing with other people, this is something you have to watch out for if you don't want to be misled, manipulated, conned, etc.

    (As an aside, I haven't found hallucination to be much of an issue in coding and software design tasks, which is what I use LLMs for daily. I think focusing on their hallucinations involves a bit of confirmation bias.)

    > I did not say "a machine would never be conscious," I said "an LLM will never be conscious" and I fully stand by that.

    Ah ok. Yes, I agree that seems likely, although I think it's not really possible to make definitive statements about this sort of thing, since we don't have any robust theories of consciousness at the moment.

• The difference between a hallucination and a lie is important, though: a hallucination is a lie with no motivation, which can make it significantly harder to detect.

If you went to a hardware store and asked for a spark plug socket without knowing the size, and a customer service person recommended an imperial set of three even though your vehicle is metric, that would be akin to an LLM's hallucination: it didn't happen for any particular reason; it just filled in information where none was available. An actual person, even one not terribly committed to their job, would ask what size or, failing that, what year of car.
