Comment by flornt
15 days ago
I guess it's because an LLM doesn't understand meaning the way you understand what you read or think. LLMs are machines that modulate hierarchical positions, ordering the placement of a-signifying signs without a clue about the meaning of what they've ordered (that's why machines can hallucinate: they have no sense of what they express).