Comment by staticman2
11 days ago
Sorry, this doesn't make sense to me.
Any human contributor can also plagiarize closed source code they have access to. And they cannot "transfer" said code to an open source project as they do not own it. So it's not clear what "elephant in the room" you are highlighting that is unique to A.I. The copyrightability isn't the issue as an open source project can never obtain copyright of plagiarized code regardless of whether the person who contributed it is human or an A.I.
A human can still be held accountable, though; GitHub Copilot running amok, less so.
If you pay for Copilot Business/Enterprise, they actually offer IP indemnification and support in court, if needed, which is more accountability than you would get from human contributors.
https://resources.github.com/learn/pathways/copilot/essentia...
I think the fact that they felt the need to offer such a service says everything; it's basically an admission that LLMs plagiarize and violate licenses.
1 reply →
9 lines of code came close to costing Google $8.8 billion
How much use do you think these indemnification clauses will be if training ends up being ruled not fair use?
3 replies →
Does that cover any random contribution claimed to be AI-generated?
1 reply →
Human beings can create copyrightable code.
Per the US Copyright Office, LLMs can never create copyrightable code.
Humans can create copyrightable code from LLM output if they use their human creativity to significantly modify the output.