Comment by digitaltrees

14 hours ago

Sensory data is a specific data set that corresponds to phenomena in the world. But to say that LLMs don’t have senses merely because they are linguistic or computational doesn’t follow when they can take in data from the world that similarly reflects something about the world.

They don't have senses because they don't have a body. It's just a program. Do weights on a hard drive have consciousness? Does my installation of StarCraft have consciousness? It doesn't make any sense.

  • Bodies aren’t necessary for senses. I can send a picture to Claude. I can send a series of pictures. That’s usually called a sense of vision. I could connect it to a pressure sensor and that would be touch.

  • There are robots with AI controlling them, so the claim that they don't have bodies doesn't hold for all of them. They can see, they can move.

    (I'm still not sure that that makes them conscious, or if we can even determine that at all, but I don't think that's a fair argument.)

  • > They don't have senses because they don't have a body

    Surely "having senses" is predicated more on "being able to sense the world around you" than "having a body."

    > Does my installation of starcraft have consciousness?

    Can your installation of StarCraft take in information about the world and then reason about its own place in that world?

  • The weights on your hard drive might have consciousness if they can respond to stimuli in the ways other conscious brains do. That's the whole point of the Turing test: it's a criterion for when the threshold of reasonable interpretation is crossed.