Comment by sarchertech
2 days ago
That’s the difference. In practice a human has to commit fraud to do this.
But a human just using an LLM to generate code will do it accidentally, because regurgitation of training text is a documented failure mode of LLMs.
And there’s no way for the human using it to be aware it’s happening.
You cannot accidentally sign your name saying "this code is GPL compliant."
If you can’t be sure, don’t sign.
I’m not gonna. A lot of other people now will.
Well yes, people break the law and expose themselves to liability every day. Nothing new there.