Comment by lawlessone
6 months ago
> underlying LLMs was like using works to train any person to read and write
I don't think humans learn via backprop or in rounds/batches; our learning is more "online".
If I input text into an LLM, it doesn't learn from that text unless the creators deliberately include it in the next round of training their model.
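A minimal sketch of that distinction, using a hypothetical toy linear model (nothing here is from the comment itself; `batch_train` and `online_update` are illustrative names):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # toy inputs
y = X @ np.array([1.0, -2.0, 0.5])     # toy targets

def batch_train(X, y, epochs=50, lr=0.1):
    """LLM-style: weights change only during a discrete training round."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient over the whole batch
        w -= lr * grad
    return w  # frozen afterwards; new text at inference time changes nothing

def online_update(w, x, target, lr=0.1):
    """"Online" learning: update immediately from each individual example."""
    return w - lr * (x @ w - target) * x

w_batch = batch_train(X, y)

w_online = np.zeros(X.shape[1])
for x, t in zip(X, y):                 # learns as the data streams in
    w_online = online_update(w_online, x, t)
```

The batch model is fixed between training rounds, while the online one adjusts after every example it sees.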
Humans also don't require samples of every text in history to learn to read and write well.
Hunter S. Thompson didn't need to ingest the Harry Potter books to write.