
Comment by Legend2440

21 hours ago

I think we are just going to have to accept that realistic images can be easily fabricated now.

Seeing is not believing anymore, and I don't think SynthID or anything like it can restore that trust in images.

It's going to mess up accountability.

Some politician will be recorded doing something, and he'll have his people release a thousand fake photos/videos of him committing crimes. Then they'll say: look, it's a smear campaign.

This is just one stupid example, but people will have better schemes.

There will also be globally coordinated releases of fake content, and hyper-targeted, possibly abusive content. Virtual kidnapping scams will take off, automated and scaled.

  • Some politician will be recorded doing something, and he'll have his people release a thousand fake photos/videos of him committing crimes. Then they'll say: look, it's a smear campaign.

    And his enemies will do the same, hopefully resulting in less blind trust across the whole population, which can only be a good thing.

    • I would pause image models for now, until we've better educated our less-savvy neighbors.

Hopefully the arms race will balance out with improved AI image detection, but I can see that reliable detection may never be guaranteed.