
Comment by logicprog

8 hours ago

Yeah, the disapproval/disgust I'm seeing everywhere, from pretty much every side I keep an eye on, about OpenAI enabling erotica generation with ChatGPT is so frustrating, because it seems like just Puritanism and censorship, and a desire to treat adults like children, as you say.

The issues that these pseudo-relationships can cause have barely begun to be discussed, let alone studied and understood.

We know that they exist, and not only for people with known mental health issues. And that's all we know. But the industry will happily brush that aside in order to drive up those sweet MAU and MRR numbers. One of those "I'm willing to sacrifice [a percentage of the population] for market share and profit" situations.

Edits: grammar

  • People already form parasocial relationships with AI, even with content restrictions in place. It seems to me that that is a separate issue entirely.

  • That's kind of a patronizing position, or maybe a conservative one (in US terms). There can be harm and there can be good; nobody can say for sure at this moment which outweighs the other.

    Do you feel the same about, say, alcohol and cigarettes? We allow those, heck, we even encourage them in some situations for adults, yet they destroy whole societies (look at Russia with alcohol, or at Indonesia with cigarettes if you haven't been there).

    I see a lot of points here to discuss and study, but nothing in the parent's topic that warrants a ban.

    • We did finally come around to restricting the advertising and sale of cigarettes, and to limiting where you can smoke, to the point where smoking is much less prevalent in today's generation than in earlier ones.

      The issue is it becoming ubiquitous in an effort to make money.

Looks like OpenAI can do anything it desires, but if an indie artist tries to take money for NSFW content, or even just makes it publicly for free, they get barred from using payment processors and such.

It is not bad per se, but in my opinion it shows that OpenAI is desperately trying to stop bleeding money.

  • I mean, their issue isn't that not enough users are using ChatGPT and they need to enable new user modalities to draw more people in; they already have something like 800 million MAU. Their issue is that most of their tokens are generated for free right now, both by those users and by things like CoPilot, and they're building stupidly huge, unnecessary data centers to scale their way to "AGI." So yeah, everyone says this looks like a sign of desperation, but I just don't see it at all, because it would solve a problem they don't actually have (not enough people finding GPT useful).

    • If you recalibrate from any lofty idea of their motives to "get investor money now", this and other moves/announcements make more sense: anything that could look good to an investor.

      User count going up? Sure.

      New browser that will deeply integrate ChatGPT into users' lives and give OAI access to their browsing/shopping data? Sure.

      Several new hardware products that are totally coming in the next several months? Sure.

      We're totally going to start delivering ads? Sure.

      We're making commitments to all these compute providers because our growth is totally going to warrant it? Sure.

      Oh, since we're investing in all of that compute, we're also going to become a compute vendor! Sure.

      None of it is particularly intentional, strategic, or sound. OAI is a money pit; they can always see the end of the runway and must secure funding now. That is their perpetual state.