
Comment by slg

6 days ago

>People like their recommendation engines.

People liked cigarettes too.

>They want Netflix to show them more similar shows.

Perhaps that example was a little too revealing on your end. Netflix doesn't have/need Section 230 protections and they're doing fine.

I'm not suggesting these algorithms should be illegal, just that Section 230 protections were defined too broadly because they predated the feasibility of these types of algorithms. These platforms would be free to continue algorithmic promotion, but I believe these algorithms would be less harmful if the platforms had to worry about potential legal liability.

Think YouTube and copyright for comparison. The DMCA is far from perfect, but we have YouTube as an example of a platform that survived and even thrived in the transition from a world that didn't care about copyrighted internet video to one in which it needed to moderate with copyright in mind.

> People liked cigarettes too.

Cigarettes weren’t made illegal. Cigarette companies are not liable for their users’ choice to consume them. What’s your point?

> Perhaps that example was a little too revealing on your end. Netflix doesn't have/need Section 230 protections and they're doing fine.

Perhaps it was a little too revealing on your end that you conveniently ignored my other example of Reddit.

If you need to cherry pick to make your point it doesn’t look very strong.

I still don’t see consistency in your argument that Section 230 should still apply to Hacker News but not, for example, Reddit, simply because one of them allows users to personalize the content they see.

  • > Cigarette companies are not liable for their user’s choice to consume them.

    They kind of were. Not completely liable, but partially. Because... um, well, uh, yeah, they are. They are literally liable.

    If you produce cigarettes, you are partially responsible for people smoking. Smoking is also not a "choice", come on now. The only people who believe that are people trying to sell you cigarettes or people who have never smoked.

    That's why you can't advertise cigarettes anywhere anymore and they're very hard to find. And, when you do find them, the box tells you "hey please don't smoke this". R.J. Reynolds didn't do that by fucking choice, we forced them.

    • > They kind of were. Not completely liable, but partially. Because... um, well, uh, yeah, they are. They are literally liable.

      Cigarette companies are not legally liable for the consequences their users encounter.

      It’s really hard to have an actual discussion about anything when people are just making up their own definitions.


  • This is the type of comment that suggests you aren't engaging with what I'm saying beyond a superficial level. My argument is consistent. I'm not cherry-picking examples. The differentiator I'm criticizing is the personalized nature of the algorithms. But rather than engaging with the merit of that distinction, you're acting as if there is no distinction at all. I'm not sure there is much point in continuing the conversation from there.

    • I think the other person's issue with your position is that the distinction is entirely arbitrary. You're not giving any reasons why the demarcation line for which feed algorithms are OK and which are not is there instead of anywhere else. It seems to be just "Facebook and TikTok are bad; Their feeds are personalized recommendation engines; Therefore personalized recommendation engines are bad, and other feed algorithms are OK".


  • Of course Section 230 would apply to both sites, but only to the user-generated part of each site, because that's what Section 230 says.