Comment by thesz
12 days ago
You are right that "LLMs improving themselves" is impossible. In fact, LLMs make themselves worse; this is called knowledge collapse [1].
That paper again.
LLMs have been trained on synthetic outputs for quite a while since that paper came out, and they do get better.
Turns out there's more to it than that.