Comment by NiloCK
4 days ago
I could be misinterpreting your claim here, but I'll point out that LLM weights don't literally encode the entirety of the training data set.
I guess you could consider it a lossy encoding.
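A quick back-of-envelope sketch of why a literal encoding is implausible (the parameter count, token count, and bytes-per-token figures below are illustrative assumptions, not numbers from any particular model):

    # Rough comparison of weight storage vs. training-corpus size.
    # All numbers are illustrative assumptions, not figures for a specific model.
    params = 70e9            # assumed parameter count (70B-class model)
    bytes_per_param = 2      # fp16/bf16 storage
    weight_bytes = params * bytes_per_param

    tokens = 15e12           # assumed training-corpus size in tokens
    bytes_per_token = 4      # rough average raw-text bytes per token
    corpus_bytes = tokens * bytes_per_token

    print(f"weights: {weight_bytes / 1e9:.0f} GB")
    print(f"corpus:  {corpus_bytes / 1e12:.0f} TB")
    print(f"ratio:   ~{corpus_bytes / weight_bytes:.0f}x more training data than weight storage")

Under those assumptions the corpus comes out a few hundred times larger than the weights, so "lossy encoding" is about the strongest version of the claim that holds up.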