Comment by goatlover

2 years ago

The Chinese Room argument always made sense to me. Machine translation only understands the rules for translating X to Y. It does not understand what X and Y mean, as in the way humans apply language to the world and themselves. How could it?

LLMs are a step beyond that, though, in that they do encode language meanings in their weights. But they still aren't connected to the world itself. Things are only meaningful in word relations, because that's how humans have created the language.

How do you know I understand X and Y and am not just applying some mechanistic rules for producing this text? Even in the Chinese Room, to make it reasonably efficient, you'd need some shortcuts, some organization, some algorithm to do it. How is that different from some kind of understanding?

  • Because we have bodies that interact with the world and each other, and that's what language is based on. It's like computer science people completely forget how we evolved and created languages. Or how kids learn.

  • > Even in the Chinese Room, to make it reasonably efficient

    That's the point - the brain isn't a Chinese room...

What if I gave you the complete description of how the brain of a person who speaks both Chinese and English is organised? You could then simulate what happens when that person reads Chinese after being told to translate it to English. Does that mean that that person cannot translate from Chinese to English just because you could (in theory, of course) do it without speaking Chinese yourself?

Yes, the algorithm is much more complicated, and we obviously don't have the capacity to map a brain like that, but implying that anything other than the laws of physics governs it is... well, not very scientific.

  • I never said the system couldn't translate Chinese to English, only that it doesn't understand the meanings of the words it's translating, because they're ungrounded symbols. Words have meanings because they're about something. Searle never said a machine in principle couldn't understand, only that symbol manipulation isn't enough.

    Obviously if we made something like Data from Star Trek, it would understand language.