
Comment by hrimfaxi

3 months ago

Why should the user be liable? They didn't reproduce the copyrighted work and the machine is totally capable of denying output (like it already does for other categories of material).

At the very least, the users being liable instead of OpenAI makes no sense. Like arresting only drug users and not dealers.

There are countries where drug consumption/possession is penalized too. There is a similar example in another area: in Sweden, Norway, and Belize, selling sex (i.e., prostitution) is legal, but buying it is not. So your example actually exists in world legislation.

I'm just asking where are we going to put the line and why.

  • You had originally said the user should be liable instead of OpenAI being liable.

    > However, the lyrics are shown because the user requested them, shouldn't the user be liable instead?

    I would imagine the sociological rationale for allowing sex work would not map to a multi-billion-dollar company.

    And to add, the social network example doesn't map because the user is producing the content and sharing it with the network. In OpenAI's case, they are creating and distributing copyrighted works.

    • No, the edited wording still conveys the same meaning. My edit was to fix another grammar typo.

      The social networks are distributing such content AND benefiting from selling ads on them. Adding ads on top is a derivative work.

      Personally I'm on the side of penalizing the side that provides the input, not the output:

      - OpenAI training on copyrighted works.
      - Users requesting custom works based on copyrighted IP.

      That is my opinion on how it should be layered, that's it. I'm happy to discuss why it should or shouldn't be that way. As I put in another comment, my concern is that mandating copyright filtering on each generative tool would end up propagating to every single digital tool, which, as a society, we don't really want.
