Comment by kouteiheika

1 day ago

So you'd prefer that only rich megacorporations and criminals have access to this technology, and not normal people and researchers?

How is that surprising? The advent of modern AI tools has resulted in most people being heavily pro-IP. Everyone now talks about who has the copyright to something and so on.

  • Yes, people are now very pro-IP because it's the big corporations that are pirating stuff and harvesting data en masse to train their models, and not just some random teenagers in their basements grabbing an mp3 off LimeWire. So now the IP laws, instead of being draconian, are suddenly not adequate.

    But what is frustrating to me is that the second-order effects of making the law more restrictive will do us all a big disservice. It will not stop this technology; it will just make it more inaccessible to normal people and put more power into the hands of the big corporations that the "they're stealing our data!" people would like to stop.

    Right now I (a random nobody) can go on HuggingFace, download a model that is more powerful than anything that was available 6 months ago, and run it locally on my machine, unrestricted and private.

    Can we agree that's, in general, a good thing?

    So now if you make the model creators liable for misuse of the models, or make the models a derivative work of their training data, or anything along those lines - what do you think will happen? Yep. The model on HuggingFace is gone, and the only thing you'll have access to is a paywalled, heavily filtered and censored version of it provided by a megacorporation, while the megacorporation itself has unlimited, unfiltered internal access to that model.

    • >Can we agree that's, in general, a good thing?

      The models come from overt piracy, and are often used to make fake news, slander people, or produce other illegal content. Sure it can be funny, but the fruit of a poisoned tree is always going to be overt piracy.

      I agree research is exempt from copyright, but people cashing in on unpaid artists' works for commercial purposes is a copyright violation predating the DMCA/RIAA.

      We must admit these models require piracy, and can never be seen as ethical. =3

      '"Generative AI" is not what you think it is'

      https://www.youtube.com/watch?v=ERiXDhLHxmo

  • Intellectual property isn't going to save us. It's a flimsy retort, like the water-usage complaints.

    • This covers the data center resource green-washing rhetoric, and most taxpayers will be paying more for energy now regardless of what they think:

      '"Generative AI" is not what you think it is'

      https://www.youtube.com/watch?v=ERiXDhLHxmo

      And this research demonstrated the absurd outcomes behind the bubble's hype:

      'Researchers Built a Tiny Economy. AIs Broke It Immediately'

      https://www.youtube.com/watch?v=KUekLTqV1ME

      It is true that bubbles driven by the irrational can't be stopped, but one may profit from people's delusions... and likely get discount GPUs when the economic fiction inevitably implodes. Best of luck =3

Why not? I don’t think normal people have very many good uses for deepfake tech.

  • Who is a normal person? Someone non-creative? Deepfakes have immense creative potential.

    • I don’t really see it to be honest. I feel like their best and most natural use is scams.

      Maybe a comparison you would agree with is Stingrays, the devices that track cell phones. Ideally nobody would have them, but as is, I'm glad they're not easily available for any random person to abuse.

The studios did already rip off Mark Hamill of all people.

Arguing regulatory capture versus overt piracy is a ridiculous premise. The "AI" firms have so much liquid capital now... they could pay the fines indefinitely in districts that cap damages, and they have already settled with larger copyright holders as if it were just another nuisance fee. =3