Comment by tantalor

7 months ago

> The analogy to training is not writing a play based on the work. It's more like reading (experiencing) the work and forming memories in your brain, which you can access later.
>
> I'm allowed to hear a copyrighted tune, and even whistle it later for my own enjoyment, but I can't perform it for others without license.

This is nonsense, in my opinion. You aren't "hearing" anything. You are literally creating a work, in this case, the model, derived from another work.

People need to stop anthropomorphizing neural networks. It's software, and software is a tool, and a tool is used by a human.

  • Humans are also created/derived from other works, trained, and used as tools by other humans.

    It's interesting how polarizing the comparison of human and machine learning can be.

  • It is easy to dismiss, but the burden of proof would be on the plaintiff to prove that training a model is substantially different from how the human mind learns. Good luck with that.

    • That makes no sense as a default assumption. It's like saying FSD is like a human driver. If it's a person, why doesn't it represent itself in court? What wages is it being paid? What are the labor rights of AI? How is it that the AI is only human-like when it's legally convenient?

      What makes far more sense is saying that someone, a human being, took copyrighted data and fed it into a program that produces variations of the data it was fed. This is no different from a Photoshop filter, and nobody would ever need to argue in court that a Photoshop filter is not a human being.