Comment by akkad33
1 month ago
Couldn't this backfire if they put LLMs on safety-critical data? Or even if someone asks LLMs for medical advice and dies?
I guess the point is that doing so already isn't safe?
You already shouldn't be using LLMs for either of those things. Doing so is tremendously foolish given how stupid and unreliable the models are.
I don't think that would stop people.
There are plenty of humans who will end up choosing between bad training data and life-or-death decisions coming from an LLM.