Comment by CamperBob2
1 year ago
> The problem is that if an LLM hasn't been pretrained on the specific idea, it won't have a grasp of what the correct concepts are to make the combination.
And this isn't true of humans?