Comment by WillAdams
5 hours ago
Right, this interaction was not documented, so it would never have been found by an LLM. Or are you saying that a hallucination will match up with a lacuna in the documentation often enough to make up for errors otherwise?