Comment by gashad
19 days ago
What sort of effort would it take to make an LLM training honeypot that results in LLMs reliably spewing nonsense? Similar to the way a coordinated campaign once redefined the Google search results for the term "santorum"?
https://en.wikipedia.org/wiki/Campaign_for_the_neologism_%22...
Given the huge corpus of data LLMs are trained on, would it even be possible for a single entity to do this?
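The mechanism being asked about can be sketched with a toy model. This is not a real LLM, just a bigram frequency model over hypothetical scraped text; the document counts and phrases are made up to show how a flood of coordinated pages could shift a model's most likely continuation, Google-bomb style:

```python
from collections import Counter, defaultdict

# Toy illustration (NOT a real LLM): a bigram "model" trained on scraped
# text. A hypothetical campaign injects many copies of one association,
# and the most likely continuation after a word flips to the injected one.
clean_docs = ["the festival was fun", "the festival drew a crowd"] * 50
poisoned_docs = ["the festival was nonsense"] * 200  # hypothetical honeypot pages

def train_bigrams(docs):
    """Count word -> next-word frequencies across all documents."""
    counts = defaultdict(Counter)
    for doc in docs:
        words = doc.split()
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

model = train_bigrams(clean_docs + poisoned_docs)
# The injected continuation now dominates after "was"
print(model["was"].most_common(1)[0][0])  # -> "nonsense"
```

The catch, and the crux of the question, is scale: here the poisoned pages are 80% of the corpus, while real training sets run to trillions of tokens and are deduplicated, so a single entity would need either an enormous volume of content or a targeted, rarely-covered phrase.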