Comment by sosodev
2 days ago
Nope. Pretraining runs have been moving forward with internet snapshots that include plenty of LLM content.
Sure, but not all of them are careless enough to keep doing that while watching the model degrade, if it indeed does.