
Comment by jimmySixDOF

2 years ago

And I strongly agree that a low-hanging fruit for "good" regulation is strict, clear attribution laws requiring any AI-generated content to be labeled with its source. That's a sooner-the-better, easy-win no-brainer.

Why would we do this? And how would this conceivably even be enforced? I can't see this being useful or even well-defined past cartoonishly simple special cases of generation like "artist signatures for modalities where pixels are created."

Requiring attribution categorically across the vast domain of generative AI...can you please elaborate?

  • > Why would we do this?

    I think it's a reasonable ask to enforce attribution of AI-generated content. We enforce food labels, why not content labels?

    I would go further and argue that AI-generated content should not be granted the same copyright as human-generated content; in exchange, AI-generated content produced from existing copyrighted training data would not violate copyright either.

    • > We enforce food labels, why not content?

      Regulation isn't always a drag on productivity, but it often is. Food labels make total sense because the negative consequences of not having them outweigh the drag of providing them.

      I'm not at all convinced that the benefits of mandatory AI labeling would outweigh the near-impossible task of policing and enforcing it.

      I'm thinking of the cookie consent policy in Europe. I hate it and almost always just click through, because so many websites work around it by making "reject cookies" a real pain.

    • If you use an AI spell checker, will your resulting text lose its copyright?

      If you use an AI coding assistant, will the written code be without copyright? Or will the code require a disclaimer saying that some parts of it are AI-generated?

      You're also going to have to be very precise in defining what "AI" means. To most people a compiler is as magical as AI. They might even consider it AI, especially if it performs automatic optimizations; after all, the optimized output isn't the code the user wrote.

Where is the line drawn? My phone uses math to post-process images. Do those need to be labeled? What about filters placed on photos that do the same thing? What about changing the hue of a color in Photoshop to make it pop?

  • Generative AI. Anything that can create detailed content out of a broad / short prompt. This currently means diffusion for images, large language models for text. That may change as multi-modality and other developments play out in this space.

    This capability is clearly different from the examples you list.

    Just because there may be no precise engineering definition does not mean that we cannot arrive at a suitable legal/political definition. The ability to create new content out of whole cloth is quite separate from filters, cropping, and generic "pre-AI" image post-processing. Ditto for spellcheck and word processors for text.

    The line actually is pretty clear here.

    • How do you expect to regulate this and prove generative models were used? What stops a company from purchasing art from a third party where they receive a photo from a prompt, where that company isn't US based?


  • Yes to all of the above, and airbrushed pictures in old magazines should have been labeled too. I'm not saying undisclosed photo editing should be a crime, but I don't see any good reason why news outlets, social media sites, phone manufacturers, etc. need to be secretive about it.
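For what it's worth, the labeling mechanism itself is the easy part; the hard part is the legal definition debated above. A toy sketch of what a machine-readable provenance label might look like (the field names here are invented for illustration, not any standard such as C2PA):

```python
import json


def label_content(content: str, generator: str, model: str) -> str:
    """Wrap generated content in a JSON envelope carrying a provenance label.

    This is a toy illustration; a real scheme would need signing so the
    label can't simply be stripped or forged.
    """
    record = {
        "content": content,
        "provenance": {
            "ai_generated": True,   # the disclosure itself
            "generator": generator, # who ran the model
            "model": model,         # which model produced it
        },
    }
    return json.dumps(record)


def is_ai_labeled(blob: str) -> bool:
    """Return True if the blob carries an ai_generated provenance label."""
    try:
        record = json.loads(blob)
    except json.JSONDecodeError:
        return False
    if not isinstance(record, dict):
        return False
    return bool(record.get("provenance", {}).get("ai_generated", False))
```

Note that an unlabeled plain string passes through `is_ai_labeled` as False, which is exactly the enforcement problem raised in this thread: nothing forces anyone to attach the label in the first place.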

Please define "AI-generated content" in a clear and legally enforceable manner, because I suspect you don't understand basic US constitutional law, including the vagueness doctrine and the limits on compelled speech.