Comment by javascriptfan69

4 days ago

I genuinely think we will look back at the algorithmic content feed as being on par with leaded gasoline or cigarettes in terms of societal harm.

Maybe worse since it is engineered to be as addictive as possible down to an individual level.

Then again maybe I'm being too optimistic that it will be fixed before it destroys us.

I think it's worse; cigarettes never threatened democracy.

the solution is real easy: section 230 should not apply if there's a recommendation algorithm involved

treat the company as a traditional publisher

because they are, they're editorialising by selecting the content

vs, say, the old-style Facebook wall (a raw feed from the user's friends), which should qualify for section 230

  • > cigarettes never threatened democracy

    Off topic, but I bet a book on tobacco cultivation/history would be fascinating. Tobacco cultivation relied on the slave labor of millions and the global tobacco market influenced Jefferson and other American revolutionaries (who were seeing their wealth threatened). I've also read that Spain treated sharing seeds as punishable by death? The rare contrast that makes Monsanto look enlightened!

    • Mm, definitely. I think it's probably the cash crop that has historically been the most intertwined with politics, even more so than sugar.

      Central America, the Balkans, the Levant. The Iroquois and Algonquians. Cuba. The Medicis and the Stuarts. And, as you say, revolutionary Virginia and Maryland. Lots of potential there for a grand narrative covering 600 years or more!

      (And, to gp: yes, it absolutely did threaten governments, empires, and entire political systems!)


    • Something like The Prize for the tobacco industry could be very interesting!

  • The problem with this is that section 230 was specifically created to promote editorializing. Before section 230, online platforms were loath to engage in any moderation because they feared that a hint of moderation would jump them over into the realm of "publisher" where they could be held liable for the veracity of the content they published and, given the choice between no moderation at all or full editorial responsibility, many of the early internet platforms would have chosen no moderation (as full editorial responsibility would have been cost prohibitive).

    In other words, that filter that keeps Nazis, child predators, doxing, etc. off your favorite platform only exists because of section 230.

    Now, one could argue that the biggest platforms (Meta, Youtube, etc.) can, at this point, afford the cost of full editorial responsibility, but repealing section 230 under this logic only serves to put up a barrier to entry to any smaller competitor that might dislodge these platforms from their high, and lucrative, perch. I used to believe that the better fix would be to amend section 230 to shield filtering/removal, but not selective promotion, but TikTok has shown (rather cleverly) that selective filtering/removal can be just as effective as selective promotion of content.

    • Platforms routinely underinvest in trust and safety.

      T&S is markedly more capable in the dominant languages (English is ahead by far).

      Platforms make absurd margins when compared to any other category of enterprise known to man.

      They operate at scales where a 0.001% error rate is still far beyond human capability to manually review.

      Customer support remains a cost center.

      Firms should be profitable and have a job to do.

      We do not owe them that job. Firms are vehicles to find the best strategies and tactics given societal resources and goals.

      If rules to address harms result in current business models becoming unviable, then this is not a defense of the current business model.

      Currently we are socializing costs and privatizing profit.

      Having more customer support, more transparency, and more moderation will be a cost of doing business.

      Our societies have more historical experience thinking about government capture than flooding the zone style private capture of speech.

America developed the FDA, and every country has rules on how food hygiene must be maintained.

      People still can start small, and then create medium or large businesses. Regulation is framed for the size of the org.

      Many firms fail - but failure and recreation are natural parts of the business cycle.

    • This is the first time I've ever heard somebody claim that section 230 exists to deter child predators.

      That argument is of course nonsense. If the platform is aware of apparent violations including enticement, grooming etc. they are obligated to report this under federal statute, specifically 18 USC 2258A. Now if you think that statute doesn't go far enough then the right thing to do is amend it, or more broadly, establish stronger obligations on platforms to report evidence of criminal behavior to the authorities. Either way Section 230 is not needed for this purpose and deterring crime is not a justification for how it currently exists.

      The final proof of how nonsensical this argument is, is that even if the intent you claim was true, it failed. Facebook and Instagram are the largest platforms for groomers online. Nazi and white supremacy content are everywhere on these websites as well. So clearly Section 230 didn't work for this purpose. Zuck was happy to open the Nazi floodgates on his platforms the moment a conservative President got elected. That was all it took.

      The actual problem is that Meta is a lawless criminal entity. The mergers which created the modern Meta should have been blocked in the first place. When they weren't, Zuck figured he could go ahead and open the floodgates and become the largest enabler of CSAM, smut and fraud on earth. He was right. The United States government has become weak. It doesn't protect its people. It allows criminal perverts like the board of Meta and the rest of the Epstein class to prey on its people.


    • Even if they can't afford it... Too bad for them?

      I am kind of rooting for the AI slop because the status quo is horrific, maybe the AI slop cancer will put social media out of its misery.


    • Section 230 being repealed doesn't mean that any moderation will be treated as publication. The ambient assumptions have changed a lot in the past 30 years. Now nobody would think that removing spam makes you liable as a publisher.

      Algorithmic feeds are, prima facie, not moderation, not user-created content and do not fall under the purview of section 230.

      We all know why they're really doing it, though.

  • > As interpreted by some courts, this language preserves immunity for some editorial changes to third-party content but does not allow a service provider to "materially contribute" to the unlawful information underlying a legal claim. Under the material contribution test, a provider loses immunity if it is responsible for what makes the displayed content illegal.[1]

    I'm not a lawyer, but idk that seems pretty clear cut. If you, the provider, run some program which does illegal shit then 230 don't cover your ass.

    [1] https://www.congress.gov/crs-product/IF12584

  • THIS, EXACTLY!

If there is an algorithm, the social media platform is exactly as responsible for the content as any publisher.

If it is only a straight chronological feed of posts by actually-followed accounts, the social media platform gets Section 230 protections.

    The social media platforms have gamed the law, gotten legitimate protections for/from what their users post, but then they manipulate it to their advantage more than any publisher.

    >>the solution is real easy, section 230 should not apply if there's an recommendation algorithm involved

    >>treat the company as a traditional publisher

    >>because they are, they're editorialising by selecting the content

    >>vs, say, the old style facebook wall (a raw feed from user's friends), which should qualify for section 230

  • You can draw a fairly clear line from the corporate response to cigarettes being regulated through to the strategy for climate change and social media/crypto etc.

The Republicans are basically a coalition of corporate interests that want to get you addicted to stuff that will make you poor and unhealthy, and to undermine any collective attempt to help.

The previous vice-president claimed cigarettes don't give you cancer, and the current president thinks wind turbines and the health problems caused by asbestos are both hoaxes. This is not a coincidence.

The two big times the Supreme Court flexed its powers were to shut down cigarette regulation by the FDA and Obama's Clean Power Plan. Again, not a coincidence.

That's because our country (the USA) is owned. As Carlin said, "It's a big club. And you ain't in it."[0]

But what isn't properly addressed when people link to this is that the real issue he's discussing is our failing educational system. It's not a coincidence that the Right attacks public schools and the orange man appointed a wrestling lady to dismantle the Department of Education.[1]

      0. https://www.youtube.com/watch?v=sNXHSMmaq_s

      1. The Trump Administration Plot to Destroy Public Education - https://prospect.org/2026/01/13/trump-mcmahon-department-edu...

      Aside: I was in the audience for this show (his last TV special). Didn't know it'd be shot for TV. Kind of sucked, actually, cause they had lights on the audience for the cameras and one was right in my eyes. Anyway, a toast to George Carlin who was ahead of his time and would hate how right he's been.

  • They fought a civil war over the labor required to produce tobacco.

    • > cigarettes never threatened democracy

      "Democracy" itself was not at stake in the American Civil War because both sides practiced it. The Confederacy was/would have been a democracy analogous to ancient Athens--one where slaves (and women) were excluded from political participation. The vast majority of Confederate politicians, including Jefferson Davis, came from the "Democratic Party"--which, true to its name, championed enfranchisement for the "common (white) man" as opposed to control by elites.

      Perhaps a better example is the "Tobacco War" of 1780 in the American Revolution, where Cornwallis and Benedict Arnold destroyed massive quantities of cured tobacco to try to cripple the war financing of the colonies.

      Control of tobacco in Latin/South America since the 1700s (Spain's second-largest source of imperial revenue after precious metals) also had a directly stifling effect on democratic self-governance.


  • Social media cannot "threaten democracy". Democracy means that we transfer power to those who get the most votes.

    There's nothing more anti-democratic than deciding that some votes don't count because the people casting them heard words you didn't like.

The kind of person for whom "feed ranking threatens democracy" is even a coherent thought believes the role of the public is to rubber-stamp policies a small group decides are best. If the public hears unapproved words, it might have unapproved thoughts, vote for unapproved parties, and set unapproved policy. Can't have that.

    • That trivial definition sees limited use in the real world. Few countries that are popularly considered democratic have direct democracy. Most weigh votes geographically or use some sort of representative model.

Most established definitions of democracy go something like this, heavily simplified:

      1. Free media

      2. Independent judicial system

      3. Peaceful system for the transfer of power

      The most popular model for implementing (3) is free and open elections, which has yielded pretty good results in the past century where it has been practiced.

Considering that social media pretty much is the media for most people, it is a heavily concentrated power, and if there can be any suspicion of it being in cahoots with established political power (and thus non-free), surely that is a threat to democracy almost by definition.

      Let's be real here: It has been conclusively shown again and again that social media does influence elections. That much should be obvious without too much in the way of academic rigor.


  • > never threatened democracy

    The beautiful part is how non-partisan this is. It cooks all minds regardless of tribe.

  • Why change section 230? You can just make personalized algorithmic feeds optimized for engagement illegal instead, couldn't you? What advantage does it have to mess with 230, wouldn't the result be the same in practice?

230 is an obvious place to say “if you decide something is relevant to the user (based on criteria they have not explicitly expressed to you), then you are a publisher of that material and are therefore not a protected carriage service.”

    • The solution must be a social one: we must culturally shun algorithmic social media, scold its proponents, and help the addicted.

      We aren't going to be able to turn off the AI content spigot or write laws that control media format and content and withstand (in the US) 1st amendment review. But we can change the cultural perception.


If your tree is so weak that a single breeze can knock it over, why blame the wind? Disclaimer: I hate social media of all kinds; it's just that you're missing the forest.

    • The force of social media these past 20 years has been massive. We're talking radical change to the structure of information flow in society. That's not just a small breeze.

The breeze is more like a 2-ton harvester expertly engineered to knock your tree down.

> we will look back at the algorithmic content feed as being on par with leaded gasoline or cigarettes in terms of societal harm

I agree 100%.

However, I think the core issue is not the use of an algorithm to recommend or even to show stuff.

I think the issue is that the algorithm is optimized for the interests of a platform (max engagement => max ad revenue) and not for the interests of a user (happiness, delight, however you want to frame it).

And there's way too much of this, everywhere.
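For the curious, the misalignment is easy to make concrete: the same ranking machinery produces opposite feeds depending on which objective you hand it. This is a toy sketch; every function and field name below is made up for illustration, not any real platform's API.

```python
# Toy sketch of objective misalignment in feed ranking.
# All names and numbers are hypothetical.

def rank_feed(posts, score):
    """Order candidate posts by a scoring function, highest first."""
    return sorted(posts, key=score, reverse=True)

# Platform-aligned objective: predicted engagement (clicks, watch time),
# which is what an ad-funded feed is pushed to maximize.
def engagement_score(post):
    return post["predicted_clicks"] + 0.5 * post["predicted_watch_minutes"]

# User-aligned objective: stated interests and predicted satisfaction,
# which is rarely what is actually maximized.
def user_interest_score(post):
    return post["matches_stated_interests"] + post["predicted_satisfaction"]

posts = [
    {"id": "rage-bait", "predicted_clicks": 9, "predicted_watch_minutes": 12,
     "matches_stated_interests": 0, "predicted_satisfaction": 1},
    {"id": "friend-update", "predicted_clicks": 2, "predicted_watch_minutes": 1,
     "matches_stated_interests": 1, "predicted_satisfaction": 8},
]

# Same posts, same ranker, opposite orderings under the two objectives.
by_engagement = [p["id"] for p in rank_feed(posts, engagement_score)]
by_interest = [p["id"] for p in rank_feed(posts, user_interest_score)]
print(by_engagement)  # ['rage-bait', 'friend-update']
print(by_interest)    # ['friend-update', 'rage-bait']
```

The point being: nothing about the ranking code itself is the problem; the objective it's handed is.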

We live in a society that only values money, so why should anyone optimise for anything else?

    • This frames society as some exogenous entity that we have no influence over.

      It also assumes that the society is homogenous, in the sense that everyone cares about the same thing. I don't think that's true at all.


If anything the algorithmic dopamine drip is just getting started. We haven't even entered the era of intensely personalized ai-driven individual influence campaigns. The billboard is just a billboard right now, but it won't be long before the billboard knows the most effective way to emotionally influence you and executes it perfectly. The algorithm is mostly still in your phone.

That's not where it stops.

It’s crazy (but true) to think that by slowly manipulating someone’s feed, Zuck and Musk could convert people’s religions, political leanings, personal values, etc. with little work. In fact, I would be surprised if there was NOT some part of Facebook’s and Twitter’s admin or support pages where a user’s “preferences” could be modified, e.g. “over the next 8 months, convert the user to a staunch evangelical Christian.”

Yeah, it might not ever get fixed. It is the perfect tool for mass influence and surveillance of the people. The powers that be would never let it go.