Comment by sosodev
1 month ago
Nope. Pretraining runs have been moving forward with internet snapshots that include plenty of LLM content.
Sure, but not all labs are careless enough to keep doing that while watching the model degrade, if it indeed does.