Comment by akkad33
6 hours ago
Couldn't this backfire if they put LLMs on safety-critical data? Or even if someone asks LLMs for medical advice and dies?
I guess the point is that doing so is already unsafe?
There are plenty of humans who have to make life-or-death decisions based on output from an LLM that may have been shaped by bad training data.