Comment by CamperBob2
2 years ago
The problem is that if an LLM hasn't been pretrained on the specific idea, it won't have a grasp of what the correct concepts are to make the combination
And this isn't true of humans?