Comment by ben_w
1 year ago
> And how much data can you give it?
128,000 tokens, which is about the same as a decent-sized book.
Their other models can also be fine-tuned, which is kinda unbounded, but that has scaling issues too, so presumably "a significant percentage of the training set" before diminishing returns set in.
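As a rough sanity check on the "decent sized book" comparison, here's a sketch assuming the common rule of thumb of roughly 0.75 English words per token (an approximation, not an exact figure):

```python
# Rough estimate: is a 128k-token context window about the size of a book?
# The ~0.75 words-per-token ratio is a common rule of thumb for English
# prose; actual ratios vary by tokenizer and text.
context_tokens = 128_000
words_per_token = 0.75  # assumed average, not exact
approx_words = int(context_tokens * words_per_token)
print(approx_words)  # ~96,000 words, in the range of a full-length novel
```

Typical novels run around 70,000–120,000 words, so 128k tokens does land in "decent sized book" territory.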