Comment by throwaway0123_5
3 days ago
This definition makes sense, but in the context of LLMs it still feels misapplied. What the model providers call "guardrails" is supposed to prevent malicious use of the LLM, and anyone trying to use the LLM maliciously is "explicitly trying to get off the road."