Comment by sarchertech
13 hours ago
Sure we could change the law. It would be a stupid change to allow individuals, organizations, and companies to completely shield themselves from the consequences of risky behaviors (more than we already do) simply by assigning all liability to a fall guy.
What law exactly are you suggesting needs to be changed? How is this any different from what already happens right now, today?
Right now it's very easy not to infringe on copyrighted code if you write the code yourself. In the vast majority of cases, if you infringed it's because you did something wrong that you could have prevented (and in the case where you didn't do anything wrong, independent creation is an affirmative defense against copyright infringement).
That is not the case when using AI generated code. There is no way to use it without the chance of introducing infringing code.
Because of that, if you tell a user they can use AI-generated code, and they introduce infringing code, that was a foreseeable outcome of your action. In the case where you are the owner of a company, or the head of an organization, that benefits from contributors using AI code, your company or organization could be liable.
So it's a bit as if the Linux organization told its contributors: you can bring in infringing code, but you must agree you are liable for any infringement?
But if a lawsuit were later brought, who would be sued? The individual author or the organization? In other words, can an organization reduce its liability if it tells its employees "You can break the law as long as you agree you are solely responsible for such illegal actions"?
It would seem to me that the employer would be liable if they "encourage" this way of working.
It’s a foreseeable outcome that humans might introduce copyrighted code into the kernel.
I think you’re looking for problems that don’t really exist here. You seem committed to an anti-AI stance where none is justified.
> Right now it's very easy not to infringe on copyrighted code if you write the code yourself.
Humans routinely produce code similar to or identical to existing copyrighted code without direct copying.
In this case, the "fall guy" is the person who actually introduced the code in question into the codebase.
They wouldn't be some patsy that is around just to take blame, but the actual responsible party for the issue.
Imagine you're a factory owner and you need a chemical delivered from across the country, but the chemical is dangerous, and if the tanker truck drives faster than 50 miles per hour it has a 0.001% chance per mile of exploding.
You hire an independent contractor and tell him that he can drive 60 miles per hour if he wants to but if it explodes he accepts responsibility.
He does, and it explodes, killing 10 people. If the families of those 10 people have evidence you created the conditions that caused the explosion in order to benefit your company, you're probably going to lose in civil court.
Linus benefits from the increased velocity of people using AI. He doesn't get to put all the liability on the people contributing.
Cool analogy! Which has nothing to do with the topic at hand.
That is a nonsensical analogy on multiple levels, and doesn't even support your own argument.