
Comment by enaaem

13 hours ago

The problem is that you can undress real people, and that is extremely harmful and dangerous. One kid took his life after an AI sextortion scam [1]. Imagine the damage cyberbullies, scammers, and stalkers can do.

[1] https://www.cbsnews.com/news/sextortion-generative-ai-scam-e...

Imagine how freeing it will be when people stop caring about this stuff because anyone can see anyone else naked in about 5 seconds. We're basically already at realistic hardcore porn videos of anyone fucking anyone else in a few minutes. No point in worrying about it, and it even serves as a shield for real leaked revenge porn - just claim it's AI.

  • This take is so bleak man.

    It's creepy and uncomfortable when someone says out loud that they're imagining you doing sex acts!

    Even if everyone knows that you're not actually doing sex acts and it's just some guy imagining it!

    Now everyone has to see what these creeps are imagining, but it's fine because it's AI? Like actually are you out of your mind?

Yeah like I said. With consent of the people involved.

There must be a way to do that, especially with all the facial recognition chops these days. Also, you could simply refuse requests using existing images. I don't see why they wouldn't refuse those, because that's a pretty narrow use case with very few benign purposes.

> Imagine the damage cyberbullies, scammers and stalkers can do?

They already can. There's open-source models out there.

This was fixed months ago. From reading Reddit, Grok is now really conservative about what it will let you do with uploaded images. But you can still get it to draw X-rated porn images and videos that start from AI images it creates itself.

> The problem is you can undress real people and that is extremely harmful and dangerous.

But... that's not something you can do. It's impossible.

You can imagine what real people look like naked. That's not a new thing.

https://www.youtube.com/watch?v=p7FCgw_GlWc

  • Imagining what someone looks like in your mind is far different than actively sharing fake nude images online. This cannot be a serious comparison.

    • Actively sharing fake nude images online has always been legal. It's not even a close question. The practice is neither harmful nor dangerous. Did you look at that link?

    • Yes, but the genie is out of the bottle, as they say. Deepfakes and AI generation are here to stay. We can try to go after every tool out there, but it'll be just as effective as the 'war on drugs'.

      We'll just have to adapt as a society and realise that what you see is not what you get anymore; in other words, most of what we're going to see is false.