Comment by wizofaus

3 years ago

AI-based image generation is surely already good enough that a single digital photo can't count as evidence on its own. But your scenario doesn't make much sense to me - are you suggesting AI will have reached a point where it's stored and trained on images of almost everyone's faces, to the point that it could accurately and undetectably substitute a blurry face with a detailed version of an actual individual's face it happens to think is similar? I'd be far more worried about deliberate attempts to construct fake evidence - it seems inevitable that we'll eventually have the technology to cheaply construct high-quality video and audio that, by current standards of evidence, could incriminate almost anyone the framer wanted to.

Look similar to a celebrity? Your face gets replaced, because the number of photos of the celebrity in the corpus outweighs the number of photos of you. And when those doctored photos end up in the corpus, the weighting shifts even further towards the celebrity, so people who look even less like the celebrity get replaced too, because according to the AI it's almost certainly them. It feeds back until everyone gets replaced by a celebrity face, and then the popular celebrities' faces start replacing the less well-known celebrities'. We end up with true anonymity, with everyone's face replaced by John Malkovich.
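(The runaway loop I'm describing can be sketched as a toy simulation - purely illustrative, with made-up corpus counts and an assumed "always pick the best-represented identity" rule, not how any real enhancement model works:)

```python
def simulate(corpus, rounds=100):
    """corpus: dict mapping identity -> number of photos in the corpus.

    Each round, a blurry face is "enhanced" into whichever identity is
    best represented in the corpus, and the doctored photo is added back,
    reinforcing the bias. Returns the corpus after all rounds.
    """
    for _ in range(rounds):
        guess = max(corpus, key=corpus.get)  # the model's most likely match
        corpus[guess] += 1                   # doctored photo re-enters corpus
    return corpus

corpus = {"you": 5, "me": 7, "John Malkovich": 50}
result = simulate(dict(corpus))
print(max(result, key=result.get))  # prints "John Malkovich"
```

Once any identity has a lead, every round widens it - the minority identities never gain photos, so the corpus collapses toward a single face.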