
Comment by raincole

7 hours ago

There is no solution. I don't know why people discuss this subject as if there were a technical solution, as if there are fairies or souls hidden in the pixels that help us tell what is AI-generated and what is not.

If you want to make an AI-generated image but don't want other people to know that it's AI, the most obvious solution is to not use Gemini. SynthID is watermarking. It's only ever going to be useful to good actors, who want an AI-generated image and aren't trying to hide the fact that it's AI-generated.

Sure there is a solution: you're just looking at it the wrong way. Make non-AI images provably unaltered with a cryptographic signature from the device (e.g. the camera) that took them.

  • That's pretty much impossible though.

    One workflow some artists use is to draw with ink on paper, scan the page, and then color digitally. Nothing prevents someone from generating line art with generative AI, printing it, scanning it, and coloring it.

    And what if someone just copy-pastes something into Photoshop or imports layers? That's what you'd do for composites that mix multiple images together. Can one paste screenshots into a multi-layer composition, or is that verboten and does it taint the final image?

    And what about multi-program workflows? Say I import a photo, denoise it in DxO, retouch it in Affinity Photo, resize it programmatically with ImageMagick, and optimize it with pngcrush. What metadata is left at the end?

  • If the premise is that everyone would just agree on the same protocol, I have an even more unbreakable solution: every image has to be uploaded to a blockchain the moment it is (claimed to be) created. Otherwise it's AI.

    If only everyone would just agree with me.

  • Next comes registering your camera with the government to ensure you're not doing "bad" things with it.

  • Which works for about 5 minutes until someone leaks a manufacturer's private key or extracts it from a device...
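The key-extraction point in the last reply can be sketched in a few lines. This is a toy model, not any real provenance scheme: a real system (e.g. C2PA-style Content Credentials) would use asymmetric keys such as Ed25519, but Python's standard library has no asymmetric signing, so an HMAC stands in for "the device's signature primitive" here. The key observation survives the simplification: once the factory-provisioned key leaks, a forged signature is indistinguishable from a genuine one.

```python
import hashlib
import hmac

# Hypothetical key baked into a camera at the factory. In a real scheme
# this would be the private half of an asymmetric key pair; an HMAC key
# is a stdlib stand-in so the sketch stays self-contained.
DEVICE_KEY = b"factory-provisioned-secret"

def sign(image_bytes: bytes) -> bytes:
    """What the camera would do at capture time."""
    return hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).digest()

def verify(image_bytes: bytes, sig: bytes) -> bool:
    """What a platform would do to check provenance."""
    return hmac.compare_digest(sign(image_bytes), sig)

real_photo = b"...raw sensor data..."
ai_image = b"...diffusion model output..."

# Genuine capture checks out.
assert verify(real_photo, sign(real_photo))

# But once DEVICE_KEY leaks or is extracted from the hardware, an
# attacker can sign anything: the AI image now "proves" it came from
# a camera, and the verifier cannot tell the difference.
assert verify(ai_image, sign(ai_image))
```

The scheme's security therefore rests entirely on hardware key storage, which is exactly the single point of failure the comment describes.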
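The multi-program metadata question above is easy to demonstrate concretely for PNG. Provenance information rides in ancillary chunks (tEXt, iTXt, eXIf, ...), whose type names start with a lowercase letter; critical chunks (IHDR, IDAT, IEND) start uppercase. A minimal stdlib-only sketch, with a hand-built 1x1 PNG standing in for a real photo: one strip-ancillary pass, of the kind optimizers and re-encoders commonly perform, silently drops the provenance text.

```python
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Assemble one PNG chunk: length, type, data, CRC."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

SIG = b"\x89PNG\r\n\x1a\n"

# A minimal 1x1 grayscale PNG carrying a provenance-style tEXt chunk.
png = (SIG
       + chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
       + chunk(b"tEXt", b"Source\x00Camera XYZ, signed capture")
       + chunk(b"IDAT", zlib.compress(b"\x00\x00"))
       + chunk(b"IEND", b""))

def chunks(png_bytes: bytes):
    """Yield (type, raw_chunk) pairs from a PNG byte stream."""
    pos = 8  # skip the 8-byte signature
    while pos < len(png_bytes):
        (length,) = struct.unpack(">I", png_bytes[pos:pos + 4])
        yield png_bytes[pos + 4:pos + 8], png_bytes[pos:pos + 12 + length]
        pos += 12 + length

def strip_ancillary(png_bytes: bytes) -> bytes:
    """Keep only critical chunks (uppercase first letter), roughly what
    an optimizer's metadata-stripping pass does."""
    return SIG + b"".join(raw for ctype, raw in chunks(png_bytes)
                          if ctype[:1].isupper())

print([c.decode() for c, _ in chunks(png)])                   # includes tEXt
print([c.decode() for c, _ in chunks(strip_ancillary(png))])  # metadata gone
```

So after a pipeline like the one described, any signature or provenance record stored as metadata is typically gone, and a scheme that re-signed edits at every step would require every tool in the chain to cooperate.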