Comment by parineum
4 days ago
Everything is just trivia until you have a use for the answer.
OP provided a web link with the answer; aren't these models supposed to be trained on all of that data?
There is nothing useful you can do with this information. You might as well memorize the phone book.
The model has a certain capacity -- quite limited in this case -- so there is an opportunity cost to learning one thing over another. That's why it's important to train on quality data: things you can build on top of.
What if you are trying to fix one of these things and need a list of replacement parts?
Not the right problem for this model. Any RAG-backed SLM would do; the important part is being backed by a search engine, like https://google.com/ai
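The RAG pattern being suggested can be sketched in a few lines. This is a toy illustration, not any real search API: the "search engine" here is a hypothetical in-memory keyword index, and the document contents and product name are made up. The point is that the small model only has to read the retrieved context, not recall the parts list from its weights.

```python
# Hypothetical in-memory corpus standing in for a real search engine's index.
DOCS = [
    "Widget 9000 replacement parts list: belt B-12, filter F-3, gasket G-7.",
    "Widget 9000 user manual: setup, cleaning, troubleshooting.",
]

def search(query: str, k: int = 1) -> list[str]:
    """Rank documents by shared words with the query (stand-in for a real search backend)."""
    words = set(query.lower().split())
    ranked = sorted(DOCS, key=lambda d: -len(words & set(d.lower().split())))
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Stuff retrieved text into the context so the SLM reads the answer instead of memorizing it."""
    context = "\n".join(search(question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer using only the context."

print(build_prompt("What replacement parts does the Widget 9000 need?"))
```

The prompt that comes out already contains the part numbers, so model capacity is spent on reading comprehension rather than trivia recall.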
Just because it's in the training data doesn't mean the model can remember it. The parameters total 60 gigabytes; there's only so much trivia that can fit in there, so it has to do lossy compression.
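The capacity argument can be made concrete with a back-of-envelope calculation. The 2-bits-of-recall-per-parameter figure below is a rough illustrative assumption (numbers in that ballpark show up in memorization studies), not a measured property of this model:

```python
# Back-of-envelope: why 60 GB of weights can't losslessly store the web.
checkpoint_gb = 60                                # size of the weights on disk
bytes_per_param = 2                               # fp16/bf16 storage
params = checkpoint_gb * 1e9 / bytes_per_param    # ~3e10 parameters
bits_per_param = 2                                # ASSUMED rough recall capacity per parameter
recall_gb = params * bits_per_param / 8 / 1e9     # recallable content, in gigabytes

print(f"{params:.0e} params -> roughly {recall_gb:.1f} GB of recallable content")
```

Even under that generous assumption, the model can faithfully reproduce only a few gigabytes' worth of facts; everything else gets compressed lossily or dropped, which is exactly why a search-backed setup wins for long-tail trivia.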