Comment by latexr

7 months ago

Ah, yes, “just” run every comment from 275k users through an error-prone system, while paying for every API call, to host something they already run at a loss.

Ah yes, "just" run every new comment through an AI system which costs peanuts for a binary response, and then covers them for having a moderation policy.

  • > which costs peanuts

    Peanuts aren’t free. Buying many peanuts adds up. And again, they are already operating at a loss. Additionally, it does not cover them when the system inevitably makes a mistake, especially considering that the OP’s fear is precisely that a disgruntled user would target them, meaning it would only be a matter of time before someone bypassed the LLM.

    • Bypassing, errors, etc. is all process stuff that can be explained to the authorities. And what do the cheap AI models cost, like $5 for a few million tokens? Hardly going to break the bank.

      Still an overreaction.
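
For rough scale, a back-of-the-envelope calculation that takes the thread's own "$5 for a few million tokens" ballpark at face value; the tokens-per-comment and daily volume figures below are assumptions for illustration, not numbers from the thread.

```python
# Back-of-the-envelope moderation cost, using the "$5 for a few million tokens"
# ballpark from the comment above (read here as roughly $5 per 2M tokens).
# Tokens per comment and daily volume are illustrative assumptions.
price_per_token = 5 / 2_000_000   # dollars per token (assumed reading of the ballpark)
tokens_per_comment = 300          # comment text plus moderation prompt (assumption)
comments_per_day = 50_000         # illustrative volume, not a figure from the thread

daily_cost = comments_per_day * tokens_per_comment * price_per_token
print(f"Roughly ${daily_cost:.2f} per day")  # about $37.50 at these assumptions
```

At those assumed numbers the bill is tens of dollars a day: small, but not zero for a site already running at a loss, which is roughly where the two commenters disagree.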