
Comment by ben_w

15 hours ago

A lot of people are now struggling to detect which images are AI generated, and inferring reality from illusions.

To an extent this was already the case with many other things, including material expressly labelled as fiction. But recalling the old quote about fooling all of the people some of the time, and some of the people all of the time: it is now easier to fool more people all of the time, and to fool all of the people an increasing fraction of the time.

This isn't limited to fake pics of kids, but kids are weak and struggle to defend themselves, and in this context the tools faking them seem to me likely to increase rates of harm against them.

in this context the tools faking them seem to me likely to increase rates of harm against them.

Why does it seem this way to you?

  • The history of age of consent laws, including Pitcairn Island; the observed results of sexualised deepfakes made by students in classrooms; and the observation that, according to sexual therapists, "fetishisation" is a sexual response that develops and hardens into a requirement over the course of repeated exposure, rather than an innate tendency a person is born with.

    • I read in a book about a study in which researchers were able to condition people to be aroused by money by pairing it with erotic images. But the subjects lost the association pretty quickly afterwards.

      This doesn't contradict what you are saying, and the study could be like most psychology (unreplicated), but it suggests the impact is minor... Then again, a minor impact across billions of people could still be terrible for a few of them.