Comment by deeThrow94
2 days ago
> In contrast language models are trained over trillions of tokens comprising the entirety of human knowledge.
Not even close! At best it's a small subset of the internet + published books. The vast majority of human knowledge isn't even in the training sets yet.
I'd question the usefulness of a model fed everything, though.