Comment by codedokode
6 months ago
It is not wrong at all. The author decides what to do with their work. AI companies are rich and can simply buy the rights or hire people to create works.
I could agree with exceptions for non-commercial activity like scientific research, but AI companies exist to extract profit, not to do research.
> AI companies shouldn't pirate, but if they pay for your work, they should be able to use it however they please, including training an LLM on it.
It doesn't work that way. Buying a movie doesn't mean you can sell merchandise featuring its characters.
> then you have not been harmed.
I am harmed because fewer people will buy the book if they can simply get an answer from an LLM. Fewer people will hire me to write code if an LLM trained on my code can do it. Maybe instead of books we should start making applications that protect the content and do not allow copying text or making screenshots. And instead of open-source code we should provide binary WASM modules.
If you reproduce material from a work you've purchased, then of course you're in violation of copyright, but that's not what an LLM does (and where it does, I've already conceded it's in violation and should be stopped). An LLM that doesn't "sell goods with movie characters" is not in violation.
And the harm you describe is not a recognized harm. You don't own information; you own creative works in their entirety. If your work is simply a reference, then the fact being referenced isn't something you own, so you are not harmed if that fact is shared elsewhere.
It is an abuse of the courts to attempt to prevent people who have purchased your works from using those works to train an LLM. It's morally wrong.
> It is worse than ineffective; it is wrong too, because software developers should not exercise such power over what users do. Imagine selling pens with conditions about what you can write with them; that would be noisome, and we should not stand for it. Likewise for general software. If you make something that is generally useful, like a pen, people will use it to write all sorts of things, even horrible things such as orders to torture a dissident; but you must not have the power to control people's activities through their pens. It is the same for a text editor, compiler or kernel.
Sorry for the long quote, but basically this, yeah. A major point of free software is that creators should not have the power to impose arbitrary limits on the users of their works. It is unethical.
It's why the GPL allows the user to disregard any additional conditions, why it's viral, and why the FSF spends so much effort on fighting "open source but..." licenses.
To load a printed book into a computer, one has to reproduce it in digital form without authorization. That's making a copy.
Making a digital copy of a physical book is fair use under every legal structure I am aware of.
When you do it for a transformative purpose (turning it into an LLM), it's certainly fair use.
But more importantly, it's ethical to do so, as the agreement you've made with the person you've purchased the book from included permission to do exactly that.
> Maybe instead of books we should start making applications that protect the content and do not allow copying text or making screenshots.
https://en.wikipedia.org/wiki/Analog_hole
That would be "circumvention of DRM".