Comment by _trampeltier
7 hours ago
It is very different. It is YOUR 3D printer; no one else is involved. If you print a knife and kill somebody with it, you go to jail, and no third party is involved.
If you use a service like Grok, you are using somebody else's computer. X owns the computer that produced the CP, so of course X is at least partly liable for producing it.
How does that mesh with all the safe harbour provisions we've depended on to make the modern internet, though?
The safe harbor provisions largely protect X from the content that users post (within reason). Here, though, Grok/X was actually producing the objectionable content: users made gross requests, and an LLM owned by X, running on X servers and X code, would generate the illegal material and post it to the website. The entity responsible is no longer the user but the company itself.
So, if someone hosts an image editor as web app, are they liable if someone uses that editor to create CP?
I honestly don't follow it. People who create nudes of others and use the Internet to distribute them can be sued for defamation, sure. But I don't think the people hosting the service should be liable themselves, just as people hosting Tor nodes shouldn't be liable for what users of the Tor network do.
Yes, and that was a very stupid product decision. They could have put the image generation into the post editor, shifting responsibility to the users.
I'd guess Elon is responsible for that product decision.
Note that this is a US law, not a French one.
Also, safe harbor doesn't apply because this is published under the @grok handle! It's being published by X under one of their brand names, it's absurd to argue that they're unaware or not consenting to its publication.
It's not like the world benefited from safe harbor laws that much. Why not just amend them so that server-side algorithms and platforms that recommend things are not eligible?
If you are thinking of Section 230, it only applies to user-generated content, so not to server-side AI or timeline algorithms.
Before, a USER created the content, so the user was (and is) liable. Now an LLM owned by a company creates the content, so the company is liable.
I'm not trying to make excuses for Grok, but how exactly isn't the user creating the content? Grok doesn't create images of its own volition; the user is still required to give it some input, thereby "creating" the content.
This might be an unpopular opinion, but I've always thought we might be better off without the Web 2.0 model in which site owners aren't held responsible for user content.
If you're hosting content, why shouldn't you be responsible? Because your business model is impossible if you're held to account for what's happening on your premises?
Without safe harbor, people might have to jump through the hoops of buying their own domain names and hosting content themselves. Would that be so bad?
Any app allowing any communication between two users would be illegal.
What about webmail, IM, or any other sort of web-hosted communication? Do you honestly think it would be better if Google were responsible for whatever content gets sent to a Gmail address?
You know this site would not be possible without those protections, right?