
Comment by Aachen

19 days ago

Energy usage is another. What would happen to world power consumption if 1% of WhatsApp chats were fed to ChatGPT?

A third reason, besides privacy, would be the purpose. Is the purpose generating automatic replies? Or automatic summaries because the recipient can't be bothered to read what I wrote? That would be a dick move and a good reason to object as well, in my opinion.

> What would happen to world power consumption if 1% of WhatsApp chats were fed to ChatGPT?

The same thing that happens now, when 100% of power consumption is fed to other purposes. What's the problem with that?

  • Huh? It's additional power draw in the midst of an energy transition. That power isn't currently being used for something else; it would be new demand. What do you mean, what's the problem with that?

    Also don't forget it's just one of three aspects I can think of off the top of my head. This isn't the only issue with LLMs...

    Edit: while typing this reply, I remembered a fourth: I've seen many people object morally/ethically to the training method, in terms of taking other people's work for free and replicating it. I don't know where I stand on that one myself yet (it is awfully similar to a human learning and replicating creatively, but clearly on an inhuman scale, so idk), but that's yet another possible reason not to want this.

    • If people need additional power, they pay for it. If they want to pay for extra power, why would we gatekeep whether their need is legitimate or not?
