Comment by ACCount37

18 hours ago

I'd rather the "AI safety" of the kind you want didn't exist.

The catastrophic AI risk isn't "oh no, people can now generate pictures of women naked".

Why would you rather it not exist?

In a vacuum, I agree with you that there's probably no harm in AI-generated nudes of fictional women per se; it's the rampant use to sexually harass real women and children[0], while "causing poor air quality and decreasing life expectancy" in Tennessee[1], that bothers me.

[0]: https://arstechnica.com/tech-policy/2026/01/x-blames-users-f...

[1]: https://arstechnica.com/tech-policy/2025/04/elon-musks-xai-a...

  • Because it's just a vessel for the puritans and the usual "cares more about feeling righteous than about being right" political activists. I have no love for either.

    The whole thing with "AI polluting the neighborhoods" falls apart on closer examination. As it turns out, xAI put its cluster in an industrial area that already has a defunct coal power plant, an operational steel plant, and an operational 1 GW grid-scale natural gas power plant that powers the steel plant - that last one sitting across the road from xAI's cluster.

    It's quite hard for me to imagine a world where it's the AI cluster that moves the needle on local pollution.

    • > Because it's just a vessel for the puritans and the usual "cares more about feeling righteous than about being right" political activists. I have no love for either.

      People are (or were, as of a couple of weeks ago) having nude images generated from pictures of them and posted publicly, by a company worth billions. You think the outrage around that is just puritan hand-wringing? Not that real people are actively being harassed?