Comment by palmotea
6 months ago
> Simply, if the models can think then it is no different than a person reading many books and building something new from their learnings.
No, that's fallacious. Using anthropomorphic words to describe a machine does not give it the same kinds of rights and affordances we give real people.
The judge did use some language that analogized the training to human learning. I don't read the legal judgment as resting on anthropomorphizing the LLM, though, but rather as reasoning that if it would be legal for a human to do the same thing, then it is legal for a human to use a computer to do it.
[1] https://authorsguild.org/app/uploads/2025/06/gov.uscourts.ca...
Yeah, I see the point, but I still think there is a difference between human learning and machine learning when it comes to creativity; see my post above connected to the parent.
Actually, it does, at least for this case. The judge just said so.
People have rights; machines don't. Otherwise, should we give machines the right to vote, for example?
This case is more like:
If a human uses a voting machine, they still have a right to vote.
Machines don't have rights. The human using the machine does.
If I can use my brain to learn, I as a human can use my computer to learn.
It's like taking notes, or Google Image Search caching thumbnails. Honestly, we don't even need the learning metaphor to see this is obviously not an infringement.