Comment by tgv
7 days ago
It's not only the excess, it's the ease of access. Kids can produce lewd pics of classmates and make their lives hell. This technology is fundamentally evil.
I know it's not the same - but I remember the "bubbling" phase a few years back. It was a bit messy and fortunately faded away pretty quickly.
This argument is repeated relatively often but I can't take it seriously. Just how???
Given how easy and widespread it is, I wonder how long before the general assumption will be that nude pics and videos are fakes, and they'll lose their power. It will be just another AI porn on the mountain of other shitty AI porn.
While I agree that the default assumption might be that any given nude is actually a fake, I’m curious if it’ll lose any power over someone. If someone generates an incredible AI image of a lightning strike, I don’t care that it’s fake. It’s beautiful and I want to stare at it. I’m going to share it with my friends.
I fear the same would be true for high-quality (real or otherwise) nude images, and I'd wager this would still draw unwanted attention towards the subject. Hopefully there wouldn't be negative consequences from employers and schools, since the assumption would be that the image was fake.