Comment by orf

4 days ago

They are asking me to author my contributions in a way that they approve of. The essence of the request is the same as asking someone to author them whilst standing on their head.

Except they don’t, won’t and can’t control that: the very request is insulting.

I’ll make a change any way I choose: upright, sideways, or using AI. My choice, not theirs.

Their choice is to accept it or reject it based purely on the change itself, because that’s all there is.

If you’re going to lie and say there was no LLM involved, what else are you going to lie about? Copying code from another codebase with incompatible license terms, perhaps?

I would say people should be wary of any contributions whatsoever from a filthy fucking liar.

  • > what else are you going to lie about?

    Nothing? Everything? Does it fucking matter? Assigning trust across a boundary like this is stupid, and that’s my point.

    Oh, would you just accept my blatantly, verbatim copied-from-another-codebase-and-relicensed PR just because I said “I solemnly swear this is not blatantly, verbatim copied from another codebase and relicensed”?

    That’s on you for stupidly assigning any trust to the author of the change. It’s the internet: nobody knows you’re a dog.

    • > Oh, would you just accept my blatantly, verbatim copied-from-another-codebase-and-relicensed PR just because I said “I solemnly swear this is not blatantly, verbatim copied from another codebase and relicensed”?

      At that point you've proven intent, meaning you'll get the chance to argue your viewpoint in front of a judge.


So, "might makes right", essentially?

  • No, just a normal reaction to someone trying to force their beliefs on you.

    • Instead of arguing for violating the boundaries of a "slow, bespoke" no-LLM project, you can simply start one that enjoys all the benefits of LLMs by NOT having that boundary. Very simple solution.