Comment by fullshark
4 days ago
Treat algorithmic feeds as "publications" by machines. Treat these social media companies as publishers and allow them to be sued for libel, with damage amounts based on reach.
If there's no algorithmic feed and the company is truly just a self-publishing utility, then keep the Section 230 protections.
Yup, I absolutely don't understand how they're able to get away with choosing material to promote while still not calling themselves publishers.
They're acting as editors for a publication. Hold them accountable like the publication companies they are.
Want to continue getting safe-harbor exemptions for user submitted content? No fucking algorithmically chosen feeds.
CDA 230 was written specifically to overturn a defamation ruling that held online platforms responsible for user content; that ruling, Stratton Oakmont v. Prodigy, came about because Jordan Belfort's brokerage - the firm from The Wolf of Wall Street - sued to censor negative opinions of its fraudulent investment offerings.
Prior to that lawsuit, existing defamation law held that you could hold a newspaper accountable for what it printed, but not the newsstand selling the newspaper. The courts of the time categorized online services based on their moderation policy: if you published literally anything sent to you, you were the newsstand[0]; if you decided not to publish certain things, you were a newspaper.
In case it isn't obvious, this is an unacceptable legal precedent for running any sort of online service. The only services you could legally run would be either the most free-wheeling, where everything gets published, including spam and bullshit, or the most censorious, where everything has to be pre-checked by a team of lawyers for risk and only a small amount of speech ever gets published.
To make things worse, there is also standing precedent in Mavrix v. LiveJournal regarding the DMCA safe harbor[1]: the use of human curation or moderation strips you of your copyright safe harbor. The only thing DMCA 512 protects is machine-generated feeds (algorithmic or chronological).
So let's be clear: removing CDA 230 safe harbor from a feature of social media you don't like doesn't mean that feature goes away. It means that feature gets more and more censored at the whims of whichever private citizens decide to sue that day. The social media companies are not going to get rid of algorithmic feeds unless you explicitly say "no algorithmic feeds", because those feeds make the product more addictive, which is how they make money.
The "slop trough" design of social media is optimal for profit because of a few factors; notably the fact that social media companies have monopolistic control over the client software people use. Even browser extensions intended to hide unwanted content on Facebook have to endure legal threats, because Facebook does not want you using their service as anything other than a slop trough.
So if you want to kill algorithmic feeds, what you want to do is kill Facebook's control over Facebook. That means you want legal protections for third-party API clients, antitrust scrutiny on all social media platforms, and legally mandated interoperability so that when a social media platform decides to turn into a slop trough, anyone so interested can just jump ship to another platform without losing access to their existing friends.
[0] Ignore the fact that this is not how newsstands work. You can't go to any newsstand, put your zine on it, and demand they sell it or face defamation risk.