Comment by dcreater
4 hours ago
It's already here. It's called GEO, and there are Silicon Valley startups already pumping out crap to feed next-gen models so you can ensure your product is baked into the weights.
The next generation of models is going to need very strict sanitising of input articles, because the sheer volume of GPT SEO spam is going to be, or already is, staggering. Model collapse might not be the outcome, but a dilution of training-data quality certainly will be.
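To make the sanitising point concrete, here's a minimal sketch of the kind of heuristic pre-filter a data pipeline might run over candidate articles. The spam phrases, weights, and threshold are all illustrative assumptions, not anything a real lab has published; a production pipeline would use trained classifiers and dedup at scale, but the shape of the filtering step is the same.

```python
import re
from collections import Counter

# Hypothetical marker phrases common in LLM-generated SEO filler (illustrative only).
SPAM_PHRASES = [
    "in today's fast-paced world",
    "unlock the power of",
    "in this article, we will explore",
    "game-changer",
]

def spam_score(text: str) -> float:
    """Crude heuristic score in [0, 1]; higher means more likely SEO filler."""
    lowered = text.lower()
    phrase_hits = sum(lowered.count(p) for p in SPAM_PHRASES)

    # Repetition signal: duplicated sentences suggest templated, generated content.
    sentences = [s.strip() for s in re.split(r"[.!?]+", lowered) if s.strip()]
    counts = Counter(sentences)
    repeated = sum(c - 1 for c in counts.values() if c > 1)
    rep_ratio = repeated / max(len(sentences), 1)

    # Arbitrary placeholder weights; a real filter would learn these.
    return min(1.0, 0.15 * phrase_hits + 0.7 * rep_ratio)

def keep_article(text: str, threshold: float = 0.5) -> bool:
    """Keep an article for the training corpus only if it scores below the cutoff."""
    return spam_score(text) < threshold
```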
This looks like a workflow problem more than a model problem: when inputs aren't controlled, scaling up amplifies noise faster than it adds signal. The tools keep improving, but deciding what actually makes it into the training set stays the bottleneck.