Comment by partiallypro
2 years ago
> With JPEG the intent is to produce an image indistinguishable from the original.
Not necessarily, and even if so, if you continuously opened and saved a JPEG image it would eventually turn into a potato-quality image; Xerox machines do the same thing. It happens all the time with memes and old homework assignments. What I fear is this happening to GPT, especially once people start outright using its content and putting it on sites. That content then becomes part of what GPT is trained on later, but what it had previously learned was wrong, so it just progressively gets more and more blurred, with people using the new models to produce content, in a feedback loop that starts to blur truth and facts entirely.
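As a rough illustration of that generation-loss effect, here is a minimal sketch that repeatedly decodes and re-encodes the same JPEG with Pillow; the filenames, quality setting, and loop count are assumptions for the demo, not anything from the comment itself:

    # Minimal sketch of JPEG generation loss: decode and lossily
    # re-encode the same image many times. Filenames, quality, and
    # iteration count are hypothetical.
    from PIL import Image

    img = Image.open("original.png").convert("RGB")  # assumed starting image
    for generation in range(100):
        img.save("copy.jpg", quality=75)   # lossy re-encode
        img = Image.open("copy.jpg")       # decode the degraded copy
    # Quantization error can accumulate across passes, showing up as
    # blocking and smearing -- the "potato quality" described above.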
Even if you tie it to search results, like Microsoft is doing, eventually GPT-generated content is going to rise to the top of organic results because of SEO mills using GPT content to goose traffic... then all the top results agree with the already wrong AI-generated answer, or state actors begin gaming the system and feeding the model outright lies.
This happens with people too, sure, but in small subsets, not in monolithic fashion with hundreds of millions of people relying on the information being right. I have no idea how they can solve this eventual problem, unless they are just supervising what it's learning all the time; but at that point it can become incredibly biased and limited.