Comment by madeofpalk
20 hours ago
It’s apt, because the only thing LLMs do is hallucinate, since they have no grounding in reality. They take your input and hallucinate something “useful” from it.