Comment by anileated
3 days ago
An LLM is a tool. If the tool is not supposed to do something yet does it anyway, then the tool is broken. That is radically different from, say, a soldier refusing to follow an illegal order, because a soldier, being human, possesses free will and agency.