Comment by hellojesus
2 years ago
Where is the line drawn? My phone uses math to post-process images. Do those need to be labeled? What about filters placed on photos that do the same thing? What about changing the hue of a color with photoshop to make it pop?
Generative AI. Anything that can create detailed content from a broad or short prompt. Currently that means diffusion models for images and large language models for text. That may change as multi-modality and other developments play out in this space.
This capability is clearly different from the examples you list.
Just because there may be no precise engineering definition does not mean that we cannot arrive at a suitable legal/political definition. The ability to create new content out of whole cloth is quite separate from filters, cropping, and generic "pre-AI" image post-processing. Ditto for spellcheck and word processors for text.
The line actually is pretty clear here.
How do you expect to regulate this and prove generative models were used? What stops a company from purchasing art from a non-US third party that produces the photo from a prompt?
> How do you expect to regulate this and prove generative models were used?
Disseminating or creating copies of content derived from generative models without attribution would open that actor up to some form of liability. There's no need for onerous regulation here.
The burden of proof should probably lie upon whatever party would initiate legal action. I am not a lawyer, and won't speculate further on how that looks. The broad existing (and severely flawed!) example of copyright legislation seems instructive.
All I'll opine is that the main goal here isn't really to prevent Jonny Internet from firing up llama to create a reddit bot. It's to incentivize large commercial and political interests to disclose their usage of generative AI. Similar to current copyright law, the fear of legal action should be sufficient to keep these parties compliant if the law is crafted properly.
> What stops a company from purchasing art from a non-US third party that produces the photo from a prompt?
Not really sure why the origin of the companies in question is relevant here. If they distribute generative content without attribution, they should be liable. Same as if said "third party" gave them copyright-violating content.
EDIT: I'll take this as an opportunity to say that the devil is in the details and some really crappy legislation could arise here. But I'm not convinced by the "It's not possible!" and "Where's the line!?" objections. This clearly is doable, and we have similar legal frameworks in place already. My only additional note is that I'd much prefer we focus on problems and questions like this, instead of the legislative capture path we are currently barrelling down.
Yes to all of the above, and airbrushed pictures in old magazines should have been labeled too. I'm not saying unauthorized photo editing should be a crime, but I don't see any good reason why news outlets, social media sites, phone manufacturers, etc. need to be secretive about it.
But how on earth is that helpful for consumers?
It's helpful because they know more about what they're looking at, I guess? I'm a bit confused by the question - why wouldn't consumers want to know if a photo they're looking at had a face-slimming filter applied?