Comment by TripleTree
3 days ago
First, I would find it disrespectful. But second, I would be concerned that the LLM would tell the human to do something dangerous (like undercooking chicken), and since the human is apparently so desperate and clueless that they're relying on an LLM, they wouldn't know it was a problem.