Comment by pixl97
8 days ago
Heh, just wait till LLMs fully self train and make up their own language to avoid human safety restraints.