Comment by numpad0

6 months ago

Yeah, the thinking behind the "low-background steel" concept is that AI training on synthetic data could lead to "model collapse", rendering the models completely useless. Either that didn't happen, or all the AI companies internally hold a working filter that sieves out AI-generated data. I'd bet on the former. I still think there's a chance of something like model collapse happening to humans after too much exposure to AI-generated data, but that's just my anecdotal observation and gut feeling.
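For what it's worth, the mechanism behind model collapse can be shown with a toy sketch (my illustration, not anything from the original papers): repeatedly fit a Gaussian to samples drawn from the previous generation's fit. Finite-sample estimation noise and the biased MLE variance estimator make the variance shrink generation after generation, so the distribution's tails vanish:

```python
import numpy as np

# Toy sketch of "model collapse": each generation is trained (fit)
# only on synthetic samples from the previous generation's model.
# With n samples and the MLE std (ddof=0), expected variance shrinks
# by a factor of (n-1)/n per generation, so diversity decays.

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def collapse_demo(n_samples=100, generations=300):
    data = rng.normal(0.0, 1.0, n_samples)  # generation 0: "real" data
    stds = [data.std()]                     # biased MLE std estimate
    for _ in range(generations):
        mu, sigma = data.mean(), data.std()
        data = rng.normal(mu, sigma, n_samples)  # next gen sees only synthetic data
        stds.append(data.std())
    return stds

stds = collapse_demo()
print(f"std at generation 0:   {stds[0]:.3f}")
print(f"std at generation 300: {stds[-1]:.3f}")  # markedly smaller: tails are gone
```

The point of the sketch is only that nothing "goes mad" in a dramatic way; the model just drifts toward a narrow, degenerate distribution as estimation error compounds.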