Comment by EGreg
14 days ago
Actually, the LLMs made me realize John Searle’s “Chinese room” doesn’t make much sense.
Because languages share many of the same concepts, the operator inside the Chinese room can understand nearly all of them without speaking Chinese.
And an LLM can translate to and from any language trivially; it’s the inner layers that do the actual understanding of concepts.
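A minimal sketch of that point, assuming the sentence-transformers library and its multilingual model paraphrase-multilingual-MiniLM-L12-v2 (both my choices for illustration, not anything the comment specifies): the same concept expressed in different languages lands close together in embedding space, while a different concept in the same language lands far away.

```python
# Minimal sketch: multilingual embeddings of the same concept cluster
# together regardless of surface language. Assumes the sentence-transformers
# package and the paraphrase-multilingual-MiniLM-L12-v2 model (illustrative
# choices, not from the original comment).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

sentences = [
    "The cat is sleeping on the sofa.",   # English
    "Le chat dort sur le canapé.",        # French, same concept
    "猫在沙发上睡觉。",                    # Chinese, same concept
    "The stock market crashed today.",    # English, different concept
]

embeddings = model.encode(sentences)
scores = util.cos_sim(embeddings, embeddings)

# If the inner layers really represent language-independent concepts,
# the cross-language pairs (0,1) and (0,2) should score much higher
# than the same-language, different-concept pair (0,3).
print(f"EN vs FR (same concept):      {scores[0][1].item():.2f}")
print(f"EN vs ZH (same concept):      {scores[0][2].item():.2f}")
print(f"EN vs EN (different concept): {scores[0][3].item():.2f}")
```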