Comment by softwaredoug
5 months ago
I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism sceptical accounts?
> the government and/or a big tech company shouldn't decide what people are "allowed" to say.
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.
> What if they started banning tylenol-autism sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
> There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
It really depends. I remember after the Christchurch mosque shootings, there was a scramble to block the distribution of the shooter's manifesto. In some countries, the government could declare the content illegal directly, but in others, such as Australia, they didn't have pre-existing laws sufficiently wide to cover that, and so what happened in practice is that ISPs "proactively" formed a voluntary censorship cartel, acting in lockstep to block access to all copies of the manifesto, while the government was working on the new laws. If the practical end result is the same - a complete country block on some content - does it really matter whether it's dressed up as public or private censorship?
And with large tech companies like Alphabet and Meta, it is a particularly pointed question given how much the market is monopolized.
It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.
The line should be what is illegal, which, at least in the US, is fairly permissive.
The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.
The middle ground is when a company becomes a utility. The power company can't simply disconnect your electricity because they don't feel like offering it to you, even though they own the power lines. The phone company can't disconnect your call because they disagree with what you're saying, even though they own the transmission equipment.
The thing is, people will tell you it wasn't actually censorship, because for them it was only the government being a nosy busybody, telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.
There's a literal world of literature, both contemporary and classical, which points to the idea that concentrations of power in politics and concentrations of wealth and power in industry aren't dissimilar. I think there are limits to this, as recent commentaries by guys like Zizek seem to suggest that the "strong Nation-State" is a positive legacy of the European Enlightenment. I think this is true "when it is."
Power is power. Wealth is power. Political power is power. The powerful should not control the lives or destinies of the less powerful. This is the most basic description of contemporary democracy but becomes controversial when the Randroids and Commies alike start to split hairs about how the Lenins and John Galts of the world have a right to use power to further their respective political objectives.
https://www.gutenberg.org/files/3207/3207-h/3207-h.htm (Leviathan by Hobbes)
https://www.gutenberg.org/ebooks/50922 (Perpetual Peace by Kant)
https://www.heritage-history.com/site/hclass/secret_societie...
> the government and/or a big tech company shouldn't decide what people are "allowed" to say
This throws out spam and fraud filters, both of which are content-based moderation.
Unfortunately, "nobody moderates anything" isn't a functional option. Particularly if the company has to sell ads.
As with others, I think your "and/or" between government and "big tech" is problematic.
I think government censorship should be strictly prohibited. I think "company" censorship is just the application of the first amendment.
Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.
I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.
We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.
Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."
My fear is that this is incredibly uncontroversial until it's not-- when push comes to shove we start having debates about what are "legitimate" concentrations of power (wealth) and how that legitimacy in itself lets us "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:
"all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."
This used to be "social democracy," whereas these days the Democratic Party in the United States' motto is more "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.
No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
No, they ban your account and exclude you from the market commons if they don't like what you say.
If the furry smut people became the dominant force in literature and your company was driven out of business fairly for not producing enough furry smut would that too constitute censorship?
I want to see how steep this hill you're willing to die on is. What's that old saying-- that thing about the shoe being on the other foot?
This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.
What you are arguing for is a dissolution of HN and sites like it.
Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
Let's say that in the future the dominant form of entertainment is X-rated animal snuff films, for whatever reason. Would a lack of alternative content constitute an attack on your right to choose freely or speak? Given your ethical framework I'd have to say "no," but even as your discursive opponent I would have to admit that if you as a person are averse to "X-rated furry smut," I would sympathize with you as the oppressed if it meant your ability to live and communicate had been stifled or called into question. Oppression has many forms and many names. The Johnny Conservatarians want to reserve certain categories of cruelty as "necessary" or "permissible" by creating frameworks like "everything is permitted just as long as some social condition is met..."
At the crux of things the libertarians and the non-psychos are just having a debate on when it's fair game to be unethical or cruel to others in the name of extending human freedom and human dignity. We've fallen so far from the tree.
I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?
I have a consortium of other website owners who refuse to crosslink your materials unless you put our banner on your site. Is this oppression? Oppression goes both ways, has many names, and takes many forms. Its most insidious form being the Oxford Comma.
The government told me to.
Is andy99's personal webpage a de-facto commons where the public congregates to share and exchange ideas?
> a big tech company shouldn't decide what people are "allowed" to say
On their platform, that’s exactly what they are entitled to do. When you type into the box in the Facebook app, that’s your speech. But unless the platform wants to add your contribution to their coherent speech product, they have every right to reject it.
Otherwise, the government is deciding what people can say, and you’d be against that, right?
Further, if I wanted to start a social media platform called thinkingtylenolcausesautismisstupid.com, wouldn’t restricting my right to craft my product defeat the whole point of my business?
Giving platforms the ability to moderate their output to craft a coherent speech product is the only reason we have multiple social networks with different rules, instead of one first-mover social network with no rules where everyone is locked in by network effects.
> There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think
But even that dumb stuff aside: there are two ways for a government to silence the truth: censorship, and propaganda.
We've got LLMs now, letting interested parties (government or not) overwhelm everyone with an endless barrage of the worst, cheapest, lowest-quality AI slop, the kind that makes even AI proponents like me go "ah, I see what you mean about it being autocomplete". Even the worst of that by quality is still able to bury any bad news story just as effectively as any censorship. Too much noise and not enough signal is already why I'm consuming far less YouTube these days, and why I gave up on Twitter when it was still called that, etc.
And we have AI that's a lot better at holding a conversation than just the worst, cheapest, lowest quality AI slop. We've already seen LLMs are able to induce psychosis in some people just by talking to them, and that was, so far as we can tell, accidental. How long will it be before a developer chooses to do this on purpose, and towards a goal of their choice? Even if it's just those who are susceptible, there's a lot of people.
What's important is the freedom to share truth, no matter how uncomfortable, and especially when it's uncomfortable for those with power. Unfortunately, what we humans actually share the most is gossip, which is already a poor proxy for truth and is basically how all the witch hunts, genocides, and other moral-panic-induced horrors of history happened.
It is all a mess; it is all hard; don't mistake the proxy (free speech in general) for the territory (speak truth to power, I think?); censorship is simultaneously bad and the only word I know for any act which may block propaganda which is also bad.
My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.
Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.
It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?
Libraries are typically run by the government. Governments aren't supposed to censor speech. Private platforms are a different matter by law.
Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.
SCOTUS precedent (Bantam Books, Inc. v. Sullivan) holds that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.
This was a publicly announced motivation for Elon Musk buying Twitter, and it's because of that purchase that we know the extent of this illegal behavior.
Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.
Crazy how fast we got from “please remove health misinformation during a pandemic” (bad) to “FCC chair says government will revoke broadcast licenses for showing comedians mocking the president” (arguably considerably worse).
I think the feeling of silencing comes from it being a blacklist and not a whitelist.
If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.
If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.
Also allowing it to be posted initially for a period of time before being taken down feels worse than simply preventing it from ever being published on your platform to begin with.
Of course they would never check things before allowing them to be posted because there isn’t any profit in that.
I'd certainly consider an ISP refusing to route my packets to be silencing. Is YouTube so different? Legally, sure, but practically?
If we were still in the age of personal blogs and phpbb forums, where there were thousands of different venues - the fact the chess forum would ban you for discussing checkers was no problem at all.
But these days, when you can count the forums on one hand even if you're missing a few fingers, and they all have extremely similar (American-style) censorship policies? To me it's less clear than it once was.
No, because you are perfectly capable, technically, of setting up your own servers in a colo and distributing your video.
Yes... because YouTube is not your ISP. A literal, massive difference. RE: net neutrality.
At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. PHPbb) in preference to forums like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.
Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.
“Covid” related search results were definitely hard-coded or given a hand-tuned boost. Wikipedia was landing on the 2nd or 3rd page which never happens for a general search term on Google.
I’d even search for “coronavirus” and primarily get “official” sites about Covid-19 even tho that’s just one of many coronaviruses. At least Wikipedia makes the front page again, with the Covid-19 page outranking the coronavirus page…
> My refusing to distribute your work is not "silencing."
That distinction is a relic of a world of truly public spaces used for communication— a literal town square. Then it became the malls and shopping centers, then the Internet— which runs on private pipes— and now it's technological walled gardens. Being excluded from a walled garden now is effectively being "silenced" the same way being excluded from the town square was when whatever case law you're thinking of was decided.
> No one owes you distribution unless you have a contract saying otherwise.
Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.
I 100% agree with your sentiment here, Jensson, but in Googling "common carrier law" what I get are the sets of laws governing transportation-service liability:
https://en.wikipedia.org/wiki/Common_carrier
Is there perhaps another name for what you're describing? It piques my interest.
It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.
Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.
I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."
If you refuse to distribute some information, you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violation.
To me that sounds like only a fair trade. You editorialize content; you are liable for all content. In every possible way.
Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s
You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.
The US has pretty much given up on antitrust enforcement. That's the big problem.
The federal government was literally pressuring ABC to take Kimmel off the air. Even Ted Cruz and other prominent republicans said that was a bridge too far.
So you're saying that YouTube is a publisher and should not have Section 230 protections? They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.
https://www.techdirt.com/2020/06/23/hello-youve-been-referre...
Section 230 does not work like you think it does. In fact it is almost opposite of what you probably think it does. The whole point was to allow them to have it both ways.
It makes sites not count as the publisher or speaker of third party content posted to their site, even if they remove or moderate that third party content.
YouTube’s business model probably wouldn’t work if they were made responsible for all the content they broadcast. It would be really interesting to see a world where social media companies were treated as publishers.
Might be a boon for federated services—smaller servers, finer-grained units of responsibility…
I agree. People today are far more anti-vaccine than they were a few years ago which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.
It's often a lot better to just let kooks speak freely.
> It's often a lot better to just let kooks speak freely.
They have always been able to speak freely. I still see vaccine conspiracies on HN to this day. It was rampant during COVID as well.
It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.
There is nobody more confident in themselves than the middle-class.
That didn't happen in a vacuum; there was also a _lot_ of money going into pushing anti vaccine propaganda, both for mundane scam reasons and for political reasons: https://x.com/robert_zubrin/status/1863572439084699918?lang=...
>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
It isn't hard to find that randomized controlled trials and large meta-analyses show that COVID vaccines are highly effective. No need to rely on media. You can point to one or two observational re-analyses that show otherwise but overall they are not particularly convincing given the large body of easily accessible other evidence.
I appreciate you.
People have become more anti-vax because the Covid vaccines were at best ineffective and, as you said, anything contra-narrative is buried or ignored.
If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.
More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.
This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove something. The theorist makes no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few unremarkable randos who somehow have the truth.
The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them. The typical example of sampling time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be measured by a screening, giving a correlation between screening and survivablility. So you get a time effect where more fast-acting cancers do not end up in the measurement, biasing the data.
But in measurements such that one outcome or the other does not bias the odds of that outcome being sampled, there can be no measurement time effect, which is why it's not corrected for in studies like this. The authors do not explain why measurement time effects would have anything to do with detecting or not detecting death rates in the abstract, or anywhere else in the paper, because they are quacks, who apply arbitrary math to get the outcome they want.
As another commenter pointed out, randomized controlled trials -- which cannot possibly have this made-up time effect -- often clearly show a strongly positive effect for vaccination.
I did not read the second paper.
Please stop posting conspiracy theory garbage.
A sibling comment read your first link and noted the problems with it. I read just the abstract of the second link, and it's clear their methodology and description of what they're measuring can't actually support their conclusions.
> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Nah, the same grifters who stand to make a political profit off turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
I agree. Again the vast majority would have gotten the vaccine.
There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.
> They've completely taken over public discourse on a wide range of subjects
Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point you are now holding a minority opinion you should consider whether "they" are right or wrong and why so many people believe what they do.
If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this, we'll converge towards truth. If you think talking and debate aren't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).
It's more that people in general* connect to personal stories far more than to impersonal factual data. It's easier to connect to seeing people say they had adverse reactions to a vaccine than to statistical data showing it's safer to get vaccinated than not. It's also easier to believe conspiracies; it's easier to think bad things happen due to the intent of bad people than because the world is a complex, hard-to-understand place with no intent behind things happening.
These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.
*And this skews more toward the less educated and less intelligent.
The issue is that we weren't/aren't even allowed to question the efficacy or long-term side effects of any vaccine.
> one of the only things that actually worked to stop people dying was the roll out of effective vaccines
"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)
"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)
"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)
"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
[flagged]
If that were the case, wouldn’t we see vaccine skepticism in poorly educated, racist non-Western nations?
> I think the anti-vax thing is mostly because the average Western education level is just abysmal.
What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.
Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.
We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.
And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.
No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe peas cause cancer, and then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat-Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on, funded by big agriculture? You can see how this type of thinking happens.
The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.
Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186
Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.
Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.
Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.
James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.
James Lyons-Weiler, “Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.
NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.
Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.
Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.
Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.
Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.
I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.
If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.
Spot on. At least in the UK, anyone who thinks fake news will just "fizzle out" on social media hasn't been paying attention to the increasing frenzy being whipped up by the alt right on Twitter and Telegram, and consequences like the Southport riots.
It's poorly thought out logic. Everyone sees how messy the process of getting to a truth backed by data and science can be, and how mistakes get made along the way, so they somehow conclude that allowing misinformation to flourish will solve the problem, instead of leading to a slow decline of morality/civilization.
Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.
There's a journey that every hypothesis makes on the route to becoming "information", and that journey doesn't start at top-down official recognition. Ideas have to circulate, get evaluated and rejected and accepted by different groups, and eventually grasp their way towards consensus.
I don't believe Trump's or Kennedy's ideas about COVID and medicine are the ones that deserve to win out, but I do think that top-down suppression of ideas can be very harmful to truth seeking and was harmful during the pandemic. In North America I believe this led to a delayed (and ultimately minimal) social adoption of masks, a late acceptance of the aerosol-spread vector, an over-emphasis on hand washing, and a far-too-late restriction on international travel and mass public events, well past the point when it could have contributed to containing the disease (vs Taiwan's much more effective management, for example).
Of course there's no guarantee that those ideas would have been accepted in time to matter had there been a freer market for views, and of course it would have opened the door to more incorrect ideas as well, but I'm of the view that it would have helped.
More importantly I think those heavy restrictions on pre-consensus ideas (as many of them would later become consensus) helped lead to a broader undermining of trust in institutions, the fallout of which we are observing today.
I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.
Look at 4chan and its derivatives: minimal algorithms, and they're the shitholes of ideas on the internet.
Is there any consideration of this with regard to Section 230? e.g., you're a passive conduit if you allow something to go online, but you're an active publisher if you actively employ any form of algorithm to publish and promote?
Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.
Community Notes are better than nothing, but each note relates only to a single tweet. So if one tweet with misinformation gets 100k likes, a community note might show up correcting it.
But if 100 tweets each get 1000 likes, they're never singularly important enough to community note.
Glad to see this, was going to make a similar comment.
People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".
Online, sure. But online doesn't mean YouTube or Facebook.
Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.
The first amendment was written in the 1700s...
I feel that this is the right approach. The liability and toxicity of the platforms isn't due to them being communication platforms; in most practical or technical senses they are not. They are deliberate behavior modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit, in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.
In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.
Right, engagement algorithms are like giving bad takes a rocket ship.
The words of world-renowned epidemiologists who were, to be frank, boring and unentertaining could never possibly compete with crunchymom44628 yelling about how Chinese food causes covid.
Bad takes have the advantage of the engagement of both the people who vehemently agree and the people who vehemently disagree. Everyone is incentivized to be a shock jock. And the shock jocks are then molded by the algorithm to be ever more shock jockish.
Especially at a time when we were all thrown out of the streets and into our homes and online.
And here I'll end this by suggesting everyone watch Eddington.
I think it made sense as a tactical choice at the moment, just like censorship during wartime - I dont think it should go on forever, because doing so is incompatible with a free society.
It didn't even make sense at the time. It cast everything under a cloud of suspicion that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing to see people (not you, but in these comments) still trying to paint it as a good choice.
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew it had a negligible effect on spread. When platforms of "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living in the consequences. Where we have a worm-addled halfwit directed medicine for his child-rapist pal.
>It massively amplified the nuts. It brought it to the mainstream.
>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
In theory, I agree, kind of.
But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.
IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale with which Russian (or similar) troll factories manufacture grassroots content and fund influencers are not something that can be combated at an individual level.
It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, often backing both sides of the same conflict.
The problem is the algorithm.
Content that makes people angry (extreme views) brings views.
Algorithms optimise for views -> people get recommended extreme views.
You can test this with a fresh account, it doesn't take many swipes on Youtube Shorts to get some pretty heinous shit if you pretend to be a young male to the algorithm.
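The feedback loop described above (extreme content → engagement → more recommendations) can be sketched in a toy simulation. Everything here is invented for illustration (the click-rate formula, the exploit/explore split, the outrage scores); it is not any real platform's ranking model, only a demonstration that a ranker optimizing purely for clicks, with no model of content at all, ends up surfacing the most outrage-heavy posts:

```python
import random

random.seed(0)

# Toy model of an engagement-optimized feed. Each post has an
# "outrage" score in [0, 1]; its click probability rises with it.
# All numbers are made up for illustration.
posts = [{"id": i, "outrage": random.random(), "clicks": 0} for i in range(100)]

def click_prob(post):
    # Assumed: 10% baseline click rate, plus up to 40% more for outrage.
    return 0.1 + 0.4 * post["outrage"]

for step in range(2000):
    # The ranker exploits whatever already gets clicks (top 5) and
    # explores a few random posts -- it never judges content.
    exploit = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:5]
    explore = random.sample(posts, 5)
    for post in exploit + explore:
        if random.random() < click_prob(post):
            post["clicks"] += 1

top10 = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:10]
avg_top = sum(p["outrage"] for p in top10) / len(top10)
avg_all = sum(p["outrage"] for p in posts) / len(posts)
print(f"avg outrage, top 10 by clicks: {avg_top:.2f}")
print(f"avg outrage, all posts:        {avg_all:.2f}")
```

Running it, the average outrage score of the ten most-clicked posts ends up well above the population average, even though the ranker only ever looked at click counts.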
The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other.
>The best disinfectant is sunlight.
Is it? How does that work at scale?
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.
What's your alternative? The opposite is state dictated censorship and secrecy and those have turned very wrong every single time.
> The best disinfectant is sunlight
Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.
The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.
Also, by the time you get your argument out many of the people who saw the piece you are refuting and believed it won't even see your argument. They've moved on to other topics and aren't going to revisit that old one unless it is a topic they are particularly interested in. A large number will have noted the original misinformation, such as some totally unsafe quack cure for some illness that they don't currently have, accepted it as true, and then if they ever find themselves with that illness apply the quack cure without any further thought.
The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.
Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.
And not letting the disease spread to begin with is better than any disinfectant.
>> The best disinfectant is sunlight.
Trump thought so too.
How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
Well, people literally died. So, I think we all know how it played out.
The same thing that has happened since time immemorial will continue to occur: the educated and able will physically remove themselves from risk, and others will suffer by their own volition, by association, or by lot.
I agree, and I'm pro vaccines, but I want the choice of if/when to vaccinate my kids. I believe there were election discrepancies, but I'm not sure it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.
You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.
In California, it is required for public schools and many private schools also require it, so effectively it isn't much of a choice.
I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. When people stop using tech they don't control and stop supporting people or systems that do not have their best interests at heart, only then will we see real change.
There is no conspiracy. It’s all emergent behavior by large groups of uncoordinated dunces who can’t keep even the most basic of secrets.
It's a known strategy that happens all the time with corrupt individuals in governments around the world. It's so pervasive, I'm not even going to bother posting links.
I used to think like you, believing that, on average, society would purge the craziness, but the last decade and the effect of social media and echo chambers made me see that I was completely wrong.
It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
If 'silencing people' doesn't work, does that mean online platforms aren't allowed to remove anything? Is there any limit to this philosophy? Do you think platforms can't remove:
Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.
> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
1) They are public corporations and legal creations of the state, and they benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not.
2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
Re: 1 - one protection of the state that they benefit from is the US Constitution, which as interpreted so far forbids the government from impairing their free speech rights. Making a private actor host content they disagree with violates their right of free speech! That's what the 1st Amendment is all about.
2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri
3. What market is Youtube a monopoly in?
Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...
In this case it wasn't a purely private decision.
"Where's the limiting principle here?"
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
Except many people don't roll their eyes at it; that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.
We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"
Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.
> But I think we have to realize silencing people doesn't work
It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.
- https://www.nature.com/articles/s41586-024-07524-8
- https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1...
- https://dl.acm.org/doi/abs/10.1145/3479525
- https://arxiv.org/pdf/2212.11864
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
> But I think we have to realize silencing people doesn't work.
Doesn't it though? I've seen this repeated like it's fact, but I don't think it's true. If you disallowed some randomly chosen conspiracy theory from YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.
Many of these things arrived out of nothing and can disappear just as easily.
It's basic human nature that simply hearing things repeated over and over embeds it into your consciousness. If you're not careful and aware of what you're consuming then that becomes a part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.
The issue for me is that kids are on YouTube, and I think there should be some degree of moderation.
> But I think we have to realize silencing people doesn't work.
We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.
For some reason, that didn't work either.
What is going to work? And what is your plan for getting us to that point?
Algorithmic Accountability.
People can post all sorts of crazy stuff, but the algorithms do not need to promote it.
Countries can require Algorithmic Impact Assessments and set standards of compliance with recommended guidelines.
This seems unlikely to be constitutional in the US.
These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...
I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.
To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)
It works 99% of the time and you are overindexing on the 1% of the time it doesn’t to draw your conclusion.
Silencing people is the only thing that works is what I’ve learned on the internet.
Yes we should be allowed to bully idiots into the ground.
No, letting misinformation persist is counterproductive because of the illusory truth effect: the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up."
Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
I wish someone could have seen the eye roll I just performed reading this comment.
Silencing absolutely works! How do you think disinformation metastasized!?
Funny thing, several person that counter responded and disagreed got grayed out (aka negative downvoted ... as in censored).
Reality is, i have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that its harder to disprove a negative with a positive, then people realize.
The moment you are into the youtube, tiktok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response from "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news and over time, people believe it.
That is the result of uncensored access because most people do not have the time to really look up a scientific study. The amount of negative channels massive out way positive / fact based channels because the later is "boring". Its the same reason why your evening news is 80% deaths, corruptions, thefts, politicians and taxes or other negative world news. Because it has been proven that people take in negative news much more. Clickbait titles that are negative draw in people.
There is a reason why holocaust denial is illegal in countries. Because the longer some people can spew that, the more people actually start to believe it.
Yes, i am going to get roasted for this but people are easily influenced and they are not as smart as they think themselves are. We have platforms that cater to people's short attention span with barely 1~3 min clips. Youtube video's longer then 30min are horrible for the youtubers income as people simply do not have the attention span and resulting lost income.
Why do we have laws like seatbelt, speed limits, and other "control" over people. Because people left to their own devices, can be extreme uncaring about their own family, others, even themselves.
Do i like the idea of censorship for the greater good, no. But when there are so many that spew nonsense just to sell their powders, and their homemade vitamine C solutions (made in China)... telling people information that may hurt or kills themselves, family or others.
Where is the line of that unbridled free speech? Silencing people works as in, your delaying the flow of shit running down a creek, will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yea ...
We only need to look at platforms like X, when "censorship" got removed (moderation). Full of free speech, no limits and it turned into a soak pit extreme fast (well, bigger soak pit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax content did to my family. Even to this day, that damage is still present. A person who never had an issue with vaccinations, who never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccination-related. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with higher death tolls over the same time periods. And yet not a single person was ever charged for this; everyone simply accepted it and never looked back. As if it were natural that people's grandparents and family members died who did not need to die.
People have given up, and now accept letting those who often have financial interests spew nonsense as much as they like. Well, it's "normal".
I weep for the human race because we are not going to make it.
> silencing people doesn't work
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
Slow down our algorithmic hell hole. Particularly around elections.
>Slow down our algorithmic hell hole.
What are your suggestions for accomplishing this while also staying compatible with the idea that government and big tech should not control ideas and speech?
If the government asks private companies to do that, then that's a violation of the 1st Amendment, isn't it?
This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.
"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".
Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents into key positions at big tech companies?
Censorship is a tool to combat misinformation.
It's taking a sword to the surgery room where no scalpel has been invented yet.
We need better tools to combat dis/mis-information.
I wish I knew what that tool was.
Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?
Easy solution: Repeal Section 230.
Social media platforms in the United States rely heavily on Section 230 of the Communications Decency Act, which provides them immunity from liability for most user-generated content.
You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.
If you were on the early Internet, you were self-policing with the help of admins all the time. The difference was that you had niche populations with a stake in keeping the peace and culture of a given board.
We broke those boundaries down, though, and now pit strangers against strangers for clicks and views, resulting in daily stochastic terrorism.
Police the damn speech.
For inciting violence. Sure. Free speech isn’t absolute.
But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.
We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.
(And I believe those experts actually did about as best they could given the circumstances)
Try to post a meme here, see how long it stays up.
More seriously, it's just not this simple man. I know people really want it to be, but it's not.
I watched my dad get sucked down a rabbit hole of QAnon, Alex Jones, anti-vax nonsense, and God knows what other conspiracy theories. I showed him point-blank evidence that QAnon was bullshit, and he just flat-out refuses to believe it. He's representative of a not-insignificant part of the population. And you can say it doesn't do any damage, but those people vote, and I think we can see clearly it has done serious damage.
When bonkers ass fringe nonsense with no basis in reality gets platformed, and people end up in that echo chamber, it does significant damage to the public discourse. And a lot of it is geared specifically to funnel people in.
In more mainstream media, climate change is a perfect example. The overwhelming majority of the scientific community has known for a long time that it's an issue. There was disagreement over cause or severity, but not over whether it was a problem. The media elevated dissenting opinions and gave the impression that it was somehow an even split: that the people who disagreed about climate change were as numerous and as well informed, which they most certainly weren't, not by a long shot. And that has done irreparable damage to society.
Obviously these are very fine lines to be walked, but even throughout US history, a country where free speech is probably more valued than anywhere else on the planet, we have accepted certain limitations for the public good.
If I were trying to govern during a generational, world-stopping, epochal event, I would also not waste time picking through the trash for opinions.
I would put my trust in the people I knew were trained for this and adjust from there.
I suspect many of these opinions are born from hindsight.
The "debate" ended up doing nothing but spreading misinformation.
Society as a whole has a responsibility to not do that kind of shit. We shouldn't be encouraging the spread of lies.
> We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.
We've had these debates for decades. The end result is stuff like Florida removing all vaccine mandates. You can't debate a conspiracy or illogical thinking into going away; you can only debate it into validity.
Really, discussion was limited? Or blatant lies were rightly excluded from discourse?
There's a big difference, and in any healthy public discourse there are severe reputational penalties for lies.
If school reopening couldn't be discussed, could you point to that?
It's very odd how as time goes on my recollection differs so much from others, and I'm not sure if it's because of actual different experiences or because of the fog of memory.
> Police the damn speech.
What happens when the "police" disagree with and silence what you believe is true? Or when they allow the propagation of what you believe to be lies?
Who gets to decide what’s the truth vs. lies? The “police”?
>Who gets to decide what’s the truth vs. lies? The “police”?
This keeps coming up on this site. It seems like a basic premise for a nuanced and compassionate worldview. Humility is required. Even if we assume the best intentions, the fallible nature of man places limits on what we can do.
Yet we keep seeing posters appealing to Scientism and "objective truth". I'm not sure it is possible to have a reasonable discussion where basic premises diverge. It is clear how these themes have been used in history to support some of the worst atrocities.
Policing speech for civility or spam is very different from policing speech for content you disagree with. I was on the early internet, and on the vast majority of forums, policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.
Depends on who is doing the policing. In this case, the White House was telling Google whom to ban.
I think it was even slightly worse. The White House was effectively delegating the decision of whom to ban to the NIH/NIAID, an organization that was funding novel coronavirus research in Wuhan.
It's easy to see how at minimum there could be a conflict of interest.
You've missed the point entirely.
It's not about whether Google can decide what content they want on YouTube.
The issue here is that the Biden White House was pressuring private companies to remove speech they would otherwise have hosted.
That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.
They claim that the Biden admin pressured them to do it, except that they had been voluntarily doing it even during Trump's first presidency.
The current administration has been openly threatening companies over anything and everything they don't like, it isn't surprising all of the tech companies are claiming they actually support the first amendment and were forced by one of the current administration's favorite scapegoats to censor things.
There is no mass Marxist movement in the USA. There is a left wing crippled by worse than useless identity politics.
No. This perspective is wrong in both directions: (1) it is bad medicine, and (2) the medicine doesn't treat the disease. If we could successfully ban bad ideas (assuming that "we" could agree on what they are), then perhaps we should. If the damage incurred by banning ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work, and it brings harm. Note that the keepers of "correct speech" doing the banning today (e.g. under Biden) can quickly become the ones being banned another day (e.g. under Trump).

It's true that drowning out the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-side solutions (controlling speech itself, rather than training good listeners) run afoul of the same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.
What do you think about measures that stop short of banning? Like down-ranking, demonetizing, or even hellbanning that just isolates cohorts who consistently violate the rules?
No. You are objectively wrong. It's great medicine that works -- in Germany, in the US, and elsewhere, it has historically stemmed the flow of violent extremism from the KKK and the Nazis. You can't even become a citizen if you have been a Nazi. Even on a small scale, like reddit: banning /r/fatpeoplehate was originally subject to much handwringing and weeping by the so-called free-speech absolutists, but guess what -- it all went away, and the edgelords and bullies went back to 4chan to sulk. The result was that the bullshit was not normalized and made part of polite society.
If you want to live in a society where absolutely anything goes at all times, then could I recommend Somalia?
Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but nothing happening in the US is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.
The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.
Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.
We read the history, and a lot of it rhymes. Conservatives failed, and traded their values for a populist outsider to maintain power (see Franz von Papen). The outsider demeans immigrants and 'sexual deviants'. The outsider champions nationalism. He pardons the people who broke the law to support him. He condemns violence against the party while ignoring the more common violence coming from those aligned with the party. He encourages the language of enemies when discussing political opponents and protesters.
Nazi has a lot more connotations than genocide, so I'm not sure it's worth nitpicking over. Even if you tone it down to fascist or authoritarian, there will be pushback.
How can you say that banning Nazis has worked well considering everything so far this year?
Europe is sliding, but has done ok so far. Crossing fingers.
Well it would if we would actually ban Nazis instead of platform them. They haven't been banned. That's the problem.
Perhaps not the wisest comment to make in light of recent events
The government created this problem when it enacted Section 230. This is at the root of misinformation and disinformation: social media companies are not responsible for the harm.
The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for it. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.
The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.
The New York Times has published plenty of stories you could call controversial. Just this morning the top headline was that Trump lied at the UN. Trump has sued the Times for defamation, yet the paper stands by its reporting. That’s how publishing works: if you can’t defend what you publish, don’t publish it. The Section 230 debate is about whether large online platforms such as Facebook should bear similar accountability for the content they distribute. I think they should. That's the only way we can control misinformation and disinformation.
It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people hadn't tried so hard to shut them down.
For that matter, why is it even such a crazy, wild idea for anybody to dare to question medicines and the motives of pharmaceutical companies? Or to question elections?
Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.
The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly, who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably being most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so uptake was lower among those groups. So there was really no need for the massive, sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and been done with it.
With medicine there's pushback because, the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage was done to all sorts of towers), ivermectin, hydroxychloroquine, and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and they often end up making things worse by waiting too long or causing further damage.
With questioning elections, I think Jan 6 is a pretty good indication of why it wasn't appropriate. That is not how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president, without any substantiated evidence, refused to accept the result.
> With medicine there's pushback because, the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage was done to all sorts of towers), ivermectin, hydroxychloroquine, and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and they often end up making things worse by waiting too long or causing further damage.
Pushback on what? There's always been new-age hippy garbage, Chinese medicine, curing cancer with berries, and that kind of thing. I don't see that causing much damage, and certainly not enough to warrant censorship. People can easily see through it, and in the end they believe what they want to believe.
Far, far more dangerous, and the cause of real damage that I have seen, is the pharmaceutical industry and its captured regulators: bribing medical professionals, unconscionable public advertising practices, conspiring to push opioids on the population, lying about the cost of producing medications, and on and on. There is a massive list of disasters these greedy corporations and their spineless co-conspirators among government regulators have caused.
Good thing we can question them, their motives, their products.
> With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate?
I don't understand your question. Can you explain why you think Jan 6 would be a pretty good indication that discussion and disagreement about elections should be censored?
> This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.
I never quite followed exactly what the legal issues around that election were. Trump was alleged to have tried to illegally influence some election process and/or to have obstructed the legal transfer of power. Additionally, there was a riot of people who thought Trump won, some of whom broke into Congress and tried to intimidate lawmakers.
I mean, take the worst possible scenario: Trump knew he lost, was scheming to seize power, and was secretly transmitting instructions to this mob to enter the building and take lawmakers hostage, or something like that. Or any other scenario you like; let your imagination go wild.
I still fail to see how that could possibly justify censoring the people and prohibiting them from questioning the government or its democratic processes. In fact, the opposite: a government official went rogue and committed a bunch of crimes, so therefore... the people should not be permitted to question or discuss the government and its actions?
There are presumably laws against rioting, insurrection, and so on. If the guilty can be prosecuted for those crimes, why should the innocent pay with the destruction of their human rights, in a way that wouldn't even solve the problem and could easily enable worse atrocities to be committed by the government in the future?
Should people who question the 2024 election be censored? Should people who have concerns with the messages from the government's foremost immigration and deportation "experts" be prohibited from discussing their views or protesting the government's actions?