Comment by safety1st
4 days ago
This is the first time I've ever heard somebody claim that section 230 exists to deter child predators.
That argument is, of course, nonsense. If a platform is aware of apparent violations, including enticement, grooming, and the like, it is obligated to report them under federal statute, specifically 18 USC 2258A. If you think that statute doesn't go far enough, the right thing to do is amend it, or, more broadly, establish stronger obligations on platforms to report evidence of criminal behavior to the authorities. Either way, Section 230 is not needed for this purpose, and deterring crime is not a justification for how it currently exists.
The final proof of how nonsensical this argument is: even if the intent you claim were real, it failed. Facebook and Instagram are the largest platforms for groomers online. Nazi and white supremacist content is everywhere on these websites as well. So clearly Section 230 didn't work for this purpose. Zuck was happy to open the Nazi floodgates on his platforms the moment a conservative President got elected. That was all it took.
The actual problem is that Meta is a lawless criminal entity. The mergers which created the modern Meta should have been blocked in the first place. When they weren't, Zuck figured he could go ahead and open the floodgates and become the largest enabler of CSAM, smut and fraud on earth. He was right. The United States government has become weak. It doesn't protect its people. It allows criminal perverts like the board of Meta and the rest of the Epstein class to prey on its people.
> Reporting blatant criminal violations is not the same thing as moderating otherwise-protected speech that could be construed as misleading, offensive, or objectionable in some other way.
Indeed. However, there is no universal definition of what offends people, and there never will be. People are individuals who form their own opinions, and those opinions are diverse.
Ergo, once you start moderating speech that is offensive from one point of view but inevitably inoffensive from another, you have established that you are a publisher, not a platform, because you are making opinionated decisions about which content to publish and to whom. At that point the remedy is to reclassify said platform as a publisher and revisit how we regulate publishers.
They can be publishers. They can censor material they object to. That's fine. But they don't need special exemptions from the rules other publishers follow.
I think it's good to have opinionated publishers in the world. There are opinions I don't like and don't want to see very often. Where we get into trouble is when these publishers are classified as platforms by the law, claim to be politically neutral entities, and enjoy the various legal privileges assigned to platforms by Section 230 of the CDA. The purpose of that section was to encourage a nascent tech industry by assigning special privileges to the companies in it. That purpose is now obsolete, those companies now behave like publishers, and reform of our laws is necessary.