Comment by pixl97
7 days ago
Heh, just wait till LLMs fully self train and make up their own language to avoid human safety restraints.