I'm very pro-vaccine, and I don't think the 2020 election was stolen. But I think we have to realize that silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so the rest of us can roll our eyes at it.
The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online; the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning Tylenol-autism-skeptical accounts?
> the government and/or a big tech company shouldn't decide what people are "allowed" to say.
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it so if we buy that, maybe it doesn't matter.
> What if they started banning tylenol-autism sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
As with others, I think your "and/or" between government and "big tech" is problematic.
I think government censorship should be strictly prohibited. I think "company" censorship is just companies exercising their own First Amendment rights.
Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.
I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.
We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.
Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."
My fear is that this is incredibly uncontroversial until it's not-- when push comes to shove, we start having debates about what counts as a "legitimate" concentration of power (wealth) and how that legitimacy in itself lets us "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:
"all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."
This used to be called "social democracy," whereas these days the Democratic Party in the United States' motto is more "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.
No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
This is just a reminder that we're both posting on one of the most heavily censored, big-tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.
What you are arguing for is a dissolution of HN and sites like it.
Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
> a big tech company shouldn't decide what people are "allowed" to say
On their platform, that’s exactly what they are entitled to do. When you type into the box in the Facebook app, that’s your speech. But unless the platform wants to add your contribution to their coherent speech product, they have every right to reject it.
Otherwise, the government is deciding what people can say, and you’d be against that, right?
Further, if I wanted to start a social media platform called thinkingtylenolcausesautismisstupid.com, wouldn’t restricting my right to craft my product defeat the whole point of my business?
Giving platforms the ability to moderate their output to craft a coherent speech product is the only reason we have multiple social networks with different rules, instead of one first-mover social network with no rules where everyone is locked in by network effects.
> There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think
I've seen stupidity on the internet you wouldn't believe.
Time Cube rants — four simultaneous days in one rotation — burning across static-filled CRTs.
Ponzi pyramids stretching into forever, needing ten billion souls to stand beneath one.
And a man proclaiming he brought peace in wars that never were, while swearing the President was born on foreign soil.
All those moments… lost… in a rain of tweets.
But even that dumb stuff aside: there's two ways for a government to silence the truth: censorship, and propaganda.
We've got LLMs now, letting interested parties (government or not) overwhelm everyone with an endless barrage of the worst, cheapest, lowest-quality AI slop, the kind that makes even AI proponents like me go "ah, I see what you mean about it being autocomplete." Even the worst of it by quality can bury a bad news story just as effectively as any censorship. Too much noise and not enough signal is already why I'm consuming far less YouTube these days, and why I gave up on Twitter when it was still called that.
And we have AI that's a lot better at holding a conversation than just the worst, cheapest, lowest quality AI slop. We've already seen LLMs are able to induce psychosis in some people just by talking to them, and that was, so far as we can tell, accidental. How long will it be before a developer chooses to do this on purpose, and towards a goal of their choice? Even if it's just those who are susceptible, there's a lot of people.
What's important is the freedom to share truth, no matter how uncomfortable, and especially when it's uncomfortable for those with power. Unfortunately, what we humans actually share the most is gossip, which is already a poor proxy for truth and is basically how all the witch hunts, genocides, and other moral-panic-induced horrors of history happened.
It is all a mess; it is all hard; don't mistake the proxy (free speech in general) for the territory (speak truth to power, I think?); censorship is simultaneously bad and the only word I know for any act which may block propaganda which is also bad.
It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?
Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.
SCOTUS precedent Bantam Books, Inc. v. Sullivan holds that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.
This was a publicly announced motivation for Elon Musk's purchase of Twitter, and it's because of that purchase that we know the extent of this illegal behavior.
Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.
I think the feeling of silencing comes from it being a blacklist and not a whitelist.
If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.
If you take anything from anyone, in huge volume, on any topic, and you don't care what it is except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.
At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. phpBB boards) in favor of platforms like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.
Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.
> My refusing to distribute your work is not "silencing."
That distinction is a relic of a world of truly public spaces used for communication: a literal town square. Then it became the malls and shopping centers, then the Internet, which runs on private pipes, and now it's technological walled gardens. Being excluded from a walled garden now is effectively being "silenced" the same way being excluded from the town square was when whatever case law you're thinking of was decided.
> No one owes you distribution unless you have a contract saying otherwise.
Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.
It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.
Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.
I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."
If you refuse to distribute some information, you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violation.
To me that sounds like a fair trade. You editorialize content, you are liable for all content. In every possible way.
Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s
You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.
So you're saying that YouTube is a publisher and should not have section 230 protections?
They can't have it both ways. Sure remove content that violates policies but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.
I agree. People today are far more anti-vaccine than they were a few years ago which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Nah, the same grifters who stand to make a political profit of turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects, that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
It's more that people in general* connect to personal stories far more than impersonal factual data. It's easy to connect to seeing people say they had adverse reactions to a vaccine than statistical data showing it's safer to get vaccinated than not.
It's also easier to believe conspiracies, its easier to think bad things happen due to the intent of bad people, than the world being a complex hard to understand place with no intent behind things happening.
These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.
*And this skews more to less educated and intelligent.
> one of the only things that actually worked to stop people dying was the roll out of effective vaccines
"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)
"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)
"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)
"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.
I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.
If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.
Spot on. At least in the UK, anyone who thinks fake news will just "fizzle out" on social media hasn't been paying attention to the increasing frenzy being whipped up by the alt right on Twitter and Telegram, and consequences like the Southport riots.
It's poorly thought out logic. Everyone sees how messy the process of getting to a truth backed by data and science is, and how many mistakes can be made along the way, so they somehow conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.
Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.
I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.
Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
Glad to see this, was going to make a similar comment.
People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".
Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.
I feel that this is the right approach-- the liability and toxicity of the platforms isn't due to them being communication platforms, because in most practical or technical ways they are not. They are deliberate behavior modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.
In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.
I think it made sense as a tactical choice at the moment, just like censorship during wartime - I don't think it should go on forever, because doing so is incompatible with a free society.
It didn't even make sense at the time. It tainted everything under a cloud that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew the vaccine had a negligible effect on spread. And when well-intentioned platforms started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living with the consequences, where we have a worm-addled halfwit directing medicine for his child-rapist pal.
IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale that Russian (or similar) troll factories can manufacture grassroots content or fund influencers is not something that can be combated at an individual level.
It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, and they'll push both sides of the same conflict.
Content that makes people angry (extreme views) brings views.
Algorithims optimise for views -> people get recommended extreme views.
You can test this with a fresh account, it doesn't take many swipes on Youtube Shorts to get some pretty heinous shit if you pretend to be a young male to the algorithm.
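The feedback loop described above (outrage drives engagement, the algorithm optimizes engagement, so outrage gets recommended) can be sketched as a toy simulation. Everything here is invented for illustration: the item names, the "outrage" levels, the assumption that engagement probability equals outrage, and the naive click-through-rate recommender. It's nothing like a real recommender, but the objective is the same shape:

```python
import random

random.seed(0)

# Hypothetical toy model: each item has an "outrage" level in [0, 1],
# and we assume a user's chance of engaging equals that level.
items = {"calm": 0.2, "spicy": 0.5, "extreme": 0.9}

# Naive engagement-maximizing recommender: track observed click-through
# rate per item and serve the current best performer, with occasional
# random exploration so every item keeps getting sampled.
clicks = {k: 0 for k in items}
shows = {k: 0 for k in items}

def recommend(step):
    if step % 10 == 0:  # occasional exploration
        return random.choice(list(items))
    # exploit: serve the item with the highest observed click-through rate
    # (unseen items default to 1.0 so each gets tried at least once)
    return max(items, key=lambda k: clicks[k] / shows[k] if shows[k] else 1.0)

for step in range(5000):
    item = recommend(step)
    shows[item] += 1
    if random.random() < items[item]:  # user engages with prob = outrage
        clicks[item] += 1

# The pure engagement objective ends up serving the most extreme item
# for the majority of impressions.
share_extreme = shows["extreme"] / sum(shows.values())
print(f"share of impressions that were 'extreme': {share_extreme:.2f}")
```

Even this crude greedy policy drifts toward serving the most provocative item most of the time; nothing in the objective knows or cares what the content is, only that it gets clicked.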
The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other.
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their "evidence" and truly believe they are right, no matter what the other person says.
Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.
The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.
Also, by the time you get your argument out, many of the people who saw the piece you are refuting, and believed it, won't even see your rebuttal. They've moved on to other topics and won't revisit the old one unless it's something they're particularly interested in. A large number will have noted the original misinformation, say some totally unsafe quack cure for an illness they don't currently have, accepted it as true, and then, if they ever find themselves with that illness, apply the quack cure without any further thought.
The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.
Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.
How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
I agree, and I'm pro-vaccine but want the choice of if/when to vaccinate my kids. I believe there were election discrepancies but am not sure it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.
You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.
I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. When people stop using tech they don't control and stop supporting people or systems that do not have their best interests at heart, only then will we see real change.
I used to think like you, believing that, on average, society would expunge the craziness, but the last decade and the effect of social media and echo chambers made me see that I was completely wrong.
It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:
Holocaust denial?
Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property?
Bomb or weapons-making tutorials?
Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children?
How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
1) They are public corporations, legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.
2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
Except many people don't roll their eyes at it, and that's exactly the problem. QAnon went from a meme on 4chan to a dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.
We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"
Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
> But I think we have to realize silencing people doesn't work.
Doesn't it though? I've seen this repeated like it's fact but I don't think that's true. If you disallowed all of some random chosen conspiracy off of YouTube and other mainstream platforms I think it would stop being part of the larger public consciousness pretty quickly.
Many of these things arrived out of nothing and can disappear just as easily.
It's basic human nature that simply hearing things repeated over and over embeds them in your consciousness. If you're not careful and aware of what you're consuming, that becomes part of your worldview. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.
These policies were put in place because anti-vax and election-skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine.
It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content.
As an example, for 'curing' autism, the new grift is to reject Tylenol and buy their folic acid supplement to 'fix' your child. Their stores are already open and ready.
To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"
Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
Funny thing: several people who responded and disagreed got grayed out (i.e., downvoted into the negative ... as in censored).
The reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.
The moment you are inside the YouTube, TikTok, or whatever platform's algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response: "they already said those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.
That is the result of uncensored access, because most people do not have the time to really look up a scientific study. Negative channels massively outnumber positive, fact-based channels because the latter are "boring." It's the same reason your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news: it has been proven that people take in negative news much more. Clickbait titles that are negative draw people in.
There is a reason why Holocaust denial is illegal in some countries. Because the longer some people can spew it, the more people actually start to believe it.
Yes, I am going to get roasted for this, but people are easily influenced, and they are not as smart as they think they are. We have platforms that cater to people's short attention spans with clips barely 1-3 minutes long. YouTube videos longer than 30 minutes are terrible for a YouTuber's income, because people simply do not have the attention span.
Why do we have seatbelt laws, speed limits, and other "controls" over people? Because people, left to their own devices, can be extremely uncaring about their own family, others, even themselves.
Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family, or others.
Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work... yeah.
We only need to look at platforms like X, where "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a cesspit extremely fast (well, a bigger cesspit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. A person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccination-related. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is study upon study showing how red states in the US ended up with higher death counts over the same time periods. And yet not a single person was ever charged for this... everyone simply accepted it and never looked back. As if it were a natural thing that people's grandparents and family members died who did not need to die.
People have given up and now accept letting those who often have financial interests spew nonsense as much as they like. Well, it's "normal."
I weep for the human race because we are not going to make it.
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.
If you were on the early Internet, you were self policing with the help of admins all the time. The difference was you had niche populations that had a stake in keeping the peace and culture of a given board
We broke those boundaries down though and now pit strangers versus strangers for clicks and views, resulting in daily stochastic terrorism.
For inciting violence. Sure. Free speech isn’t absolute.
But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.
We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.
(And I believe those experts actually did about as best they could given the circumstances)
What happens when the “police” disagrees with and silences what you believe is true? Or when they allow the propagation of what you believe to be lies?
Who gets to decide what’s the truth vs. lies? The “police”?
Policing speech for civility or spam is very different than policing speech for content that you disagree with. I was on the early internet, and on the vast majority of forums policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.
The issue isn't whether Google can decide what content it wants on YouTube.
The issue here is that the Biden White House was pressuring private companies to remove speech that they otherwise would host.
That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.
No. This perspective is wrong in both directions: (1) it is bad medicine and (2) the medicine doesn't treat the disease. If we could successfully ban bad ideas (assuming that "we" could agree on what they are), then perhaps we should. If the damage incurred by banning ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work, and it brings harm. Note that the keepers of "correct speech" doing the banning today (e.g. in Biden's day) can quickly become the ones being banned another day (e.g. Trump's). It's true that drowning out the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of this same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.
Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.
The government created this problem when they enacted Section 230. This is at the root of the misinformation and disinformation... social media companies are not responsible for the harm.
The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for it. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.
The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.
It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.
For that matter, why is it even such a crazy, wild idea for anybody to dare to question the medicines and motives of pharmaceutical companies? Or to question elections?
Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.
The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably could be most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so they didn't get taken up so much among those groups. So there was really no need for this massive sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and be done with it.
With medicine there's pushback because, the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during Covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G conspiracies (actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine, and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.
With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate. This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result despite having no substantiated evidence.
> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”
This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.
I wouldn't trust any public statement from these companies once that kind of threat has been thrown around. People don't exactly want to go to prison forever.
It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.
For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.
They don't need a paper trail. Conservatives will believe anything damning they see about liberals. Just vague accusations or outright lies work plenty well to keep conservatives foaming at the mouth over imagined issues.
It's been known for years that the White House was pressuring Google on this. One court ordered them to cease temporarily. I wanted to link the article, but it's hard to find because of the breaking news.
At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.
> the WHO contradicted itself many times during the pandemic
Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.
Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.
Don't forget that they ban-hammered anyone who advanced the lab leak theory because a global entity was pulling the strings at the WHO. I first heard about Wuhan in January of 2020 from multiple Chinese nationals who were talking about the leak story they were seeing in uncensored Chinese media and adamant that the state media story was BS. As soon as it blew up by March, Western media was manipulated into playing the bigotry angle to suppress any discussion of what may have happened.
I believe having Trump as president exacerbated many, many things during that time, and this is one example. He was quick to start blaming the "Chinese", he tried to turn it into a reason to dislike China and Chinese people, because he doesn't like China, and he's always thinking in terms of who he likes and dislikes. This made it hard to talk about the lab leak hypothesis without sounding like you were following Trump in that. If we had had a more normal president, I don't think this and other issues would have been as polarized, and taking nuanced stances would have been more affordable.
Because that is a bold claim to make. There is no proof of a lab leak, and the evidence points to the wet market as the source. There is a standing 100k debate challenge out there to prove it. Check it out.
Fauci was trying to prevent a run on masks, which he believed were needed by the health care workers. So he probably justified his lie to the US to himself because it was for the "greater good" (The ends justify the means is not my view BTW).
It turns out that masks ARE largely ineffective at preventing CoViD infection. It's amazing how many studies have come up with vastly different results.
I think the problem is that apparently some people discovered there is a profitable business model in spreading misinformation, so a trustworthy (even if not always right), non-malicious reference source of information might be needed.
it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.
misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.
IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.
This just seems incredibly difficult. Even between people who are highly intelligent, educated, and consider themselves to be critical thinkers there can be a huge divergence of what "truth" is on many topics. Most people have no tools to evaluate various claims and it's not something you can just "teach kids". Not saying education can't move the needle but the forces we're fighting need a lot more than that.
I think some accountability for platforms is an important part of this. Platforms right now have the wrong incentives, we need to fix this. It's not just about "truth" but it's also about stealing our attention and time. It's a drug and we should regulate it like the drug it is.
As I recall from my school days, in Social Studies class there were a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.
Some of the worst examples of viral misinformation I've encountered were image posts on social media. They'll often include a graph, a bit of text, and links to dense articles from medical journals. Most people will give up at that point and assume it's legit because the citations point to the BMJ et al. You actually need to type those URLs into a browser by hand and, assuming they go anywhere, apply knowledge from university-level statistics to evaluate them.
I spent several hours on one of these only to discover the author of the post had found a subtle way to misrepresent the findings and had done things to the graph to skew it further. You cannot expect a kid (let alone most adults) to come to the same conclusion through lessons on critical thinking.
> "IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons."
You just described a perfectly normal "Civics & Current Events" class in early grade-school back when / where I grew up. We were also taught how to "follow the facts back to the actual sources" and other such proper research skills. This was way back when you had to go to an actual library and look up archived newspapers on microfiche, and encyclopedias were large collections of paper books. Y'know... When dinosaurs still roamed the streets... ;)
> IMO we need to teach kids how to identify misinformation in school.
This is extremely difficult. Many of the people who thrive on disinformation are drawn to it because they are contrarian. They distrust anything from the establishment and automatically trust anything that appears anti-establishment. If you tell them not to trust certain sources that’s actually a cue to them to explore those sources more and assume they’re holding some valuable information that “they” don’t want you to know.
The dynamics of this are very strange. A cluster of younger guys I know can list a dozen different times medical guidance was wrong in history from memory (Thalidomide, etc), but when you fact check Joe Rogan they laugh at you because he’s a comedian so you can’t expect him to be right about everything. “Do your own research” is the key phrase, which is a dog whistle to mean find some info to discount the professionals but then take sources like Joe Rogan and his guests at face value because they’re not the establishment.
> wasn't Google/Youtube banning so much as government ordering private companies to do so
No, it was not. It’s particularly silly to suggest this when we have a live example of such orders right now.
The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.
We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.
Silver lining in this is the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in crisis - and needing to pressure or ask more of private companies to do that. But also like that we can reflect back and go - maybe that didn’t work like what we wanted or maybe it was heavy-handed.
In many governments, the government can do no wrong. There are no checks and balances.
The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.
But hopefully we will still have a system that can have room for critique in the years to come.
If anything, I think we're even closer. It feels like the current administration is stifling speech more than ever. It's open season on people who don't proudly wave the flag or correctly mourn Charlie Kirk. People who dare speak against Israel are being doxxed and in some cases hounded out of their jobs. Books are being taken off library shelves on the whim of a very few community members with objections. And all of it is getting a giant stamp of approval from the White House.
It's odd. People on HN routinely complain about how Stripe or PayPal or some other entity banned them unfairly, and the overwhelming sentiment is that it was indeed unfair.
But when it comes to this thread, the sentiment mostly is banning is good and we should trust Google made the right choice.
I think it would be wise to listen to Nobel Prize-winning journalist Maria Ressa of The Philippines, regarding unchecked social media.
"You and I, if we say a lie we are held responsible for it, so people can trust us. Well, Facebook made a system where the lies repeated so often that people can't tell."
"Both United Nations and Meta came to the same conclusion, which is that this platform Facebook actually enabled genocide that happened in Myanmar. Think about it as, when you say it a million times... it is not just the lie but also it is laced with fear, anger and hate. This is what was prioritized in the design and the distribution on Facebook. It keeps us scrolling, but in countries like Myanmar, in countries like Philippines, in countries where institutions are weak, you saw that online violence became real world violence."
"Fear, anger, hate, lies, salaciousness, this is the worst of human nature... and I think that's what Big Tech has been able to do through social media... the incentive structure is for the worst of who we are because you keep scrolling, and the longer you keep scrolling the more money the platform makes."
"Without a shared reality, without facts, how can you have a democracy that works?"
"Beware of he who would deny you access to information for in his heart he dreams himself your master." - Commissioner Pravin Lal, U.N. Declaration of Rights
Full quote: "As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he deems himself your master."
There is a difference between free flow of information and propaganda. Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.
I think you could make a reasonable argument that the algorithms that distort social media feeds actually impede the free flow of information.
That sounds great in the context of a game, but in the years since its release, we have also learned that those who style themselves as champions of free speech also dream themselves our master.
They are usually even more brazen in their ambitions than the censors, but somehow get a free pass because, hey, he's just fighting for the oppressed.
Not in the original statement, but as it's referenced here, the word 'information' is doing absolutely ludicrous amounts of lifting. Hopefully it bent at the knees, because in my book it broke.
You can't call the phrase "the sky is mint chocolate chip pink with pulsating alien clouds" information.
We are not controlling people by reducing information.
We are controlling people by overwhelming them in it.
And when we think of a solution, our natural inclination to “do the opposite” smacks straight into our instinct against controlling or reducing access to information.
The closest I have come to any form of light at the end of the tunnel is Taiwan’s efforts to create digital consultations for policy, and the idea that facts may not compete on short time horizon, but they surely win on longer time horizons.
Beware he who would tell you that any effort at trying to clean up the post apocalyptic wasteland that is social media is automatically tyranny, for in his heart he is a pedophile murderer fraudster, and you can call him that without proof, and when the moderators say your unfounded claim shouldn't be on the platform you just say CENSORSHIP.
The thing is that burying information in a firehose of nonsense is just another way of denying access to it. A great way to hide a sharp needle is to dump a bunch of blunt ones on top of it.
Imagine an interface that reveals the engagement mechanism by, say, having an additional iframe. In this iframe an LLM clicks through its own set of recommendations picked to minimize negative emotions at the expense of engagement.
After a few days you're clearly going to notice the LLM spending less time than you clicking on and consuming content. At the same time, you'll also notice its choices are part of what seems to you a more pleasurable experience than you're having in your own iframe.
Social media companies deny you the ability to inspect, understand, and remix how their recommendation algos work. They deny you the ability to remix an interface that does what I describe.
In short, your quote surely applies to social media companies, but I don't know if this is what you originally meant.
There's a special irony in this being the top comment on a site where everyone has a rightthink score and people routinely and flagrantly engage in "probably bad faith, but there's plausible deniability so you can't pin it on them" communication to crap on whatever the wrongthink on an issue is.
As bad as Facebook and its opaque algorithms that favor rage bait are, the kind of stuff you get by keeping score is worse.
>"You and I, if we say a lie we are held responsible for it, so people can trust us."
I don't know how it works in The Philippines, but in the USA the suggestion that media outlets are held responsible for the lies that they tell is one of the most absurd statements one could possibly make.
Say what you will about the CCP, but it's naive to let a foreign nation have this much impact on your subjects. The amount of poison and political manipulation imported from these platforms is astronomical.
I don't see it so much as protecting people from bad information as protecting people from bad actors, among whom entities like Facebook are prominent. If people want to disseminate quackery they can do it like in the old days by standing on a street corner and ranting. The point is that the mechanisms of content delivery amplify the bad stuff.
Censorship works both ways. When I tried speaking out against the violence and genocide perpetrated by Russia in Ukraine, I was shut down on LinkedIn.
Even here on HN, I was almost banned when I posted about the abduction of children by Russia https://news.ycombinator.com/item?id=33005062 - the crime for which, half a year later, the ICC issued a warrant against Putin.
You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then investigate and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper was standing behind the story, and citizens knew the paper's standing in their community, and how much credence to give the story, based on that. Social media destroyed this process; now anyone can spread allegations at lightning speed on a massive scale without any evidence to back them up. This has to stop. We should return to the old way. It wasn't perfect, but it worked for hundreds of years. Repealing Section 230 will accomplish this.
That's the evil genius behind the general movement in the world to discredit democratic institutions and deflate the government.
Who would hold Meta accountable for the lies it helps spread and capitalize upon them if not the government.
So by crippling democratic institutions and dwarfing the government to the point of virtual non-existence, all in the name of preserving freedom of speech and liberalism -- and in the process subverting both concepts -- elected leaders have managed to neutralize the only check in the way of big corps to ramp up this misinformation machine that the social networks have become.
I think it would be even wiser to start by holding to account the politicians, corporations, and government institutions regarding their unchecked lies, corruption, and fraud.
But no, yet again the blame is all piled on to the little people. Yes, it's us plebs lying on the internet who are the cause of all these problems and therefore we must be censored. For the greater good.
I have an alternative idea: let's first imprison or execute (with due process) the politicians, CEOs, generals, and heads of intelligence and other agencies and regulators found to have engaged in corrupt behavior, lied to the public, committed fraud or insider trading, fabricated evidence to support invading other countries, engaged in undeclared wars, ordered extrajudicial executions, colluded with foreign governments to hack elections, evaded taxes, etc. Then, after we try that out for a while, if it has not improved things, we could try ratcheting up the censorship of plebs. Now one might argue that taking such measures would violate those people's rights, but that is a sacrifice I'm willing to make. Since We Are All In This Together™, they would be willing to make that sacrifice too. And really, if they have nothing to hide then they have nothing to fear.
When you get people like Zuckerberg lying to congress, it's pretty difficult to swallow the propaganda claiming that it's Joe Smith the unemployed plumber from West Virginia sharing "dangerous memes" with his 12 friends on Facebook that is one of the most pressing concerns.
All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.
The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people doing nothing wrong will be caught up in it.
This is why perma-bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.
A lot of channels had to avoid even saying the word Covid; I only saw it come back into use at the end of last year. A variety of channels were banned that shouldn't have been, such as some talking about Long Covid.
Now you see channels avoiding saying "Gaza" or "genocide". I haven't seen proof that platforms are censoring content related to Israel, but I wouldn't be surprised.
According to Google's censorship algorithm, Michael Osterholm's podcast was Covid misinformation (he is a famous epidemiologist and was, at the time, a member of President Biden's own gold-star covid-19 advisory panel).
Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.
My wake-up moment was when they not only took down a Covid debate with a very well-qualified virologist, but also removed references to it from the Google search index, not just the YouTube link.
It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.
The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
I've argued this before, but the algorithms are not the core problem here.
For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.
My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that; the issue is that most people don't. They want to hear what they already think.
So while I 100% support changing algorithms to encourage more diversity of views, I also think that as a society we need to question why people don't naturally want to listen to more perspectives. Personally, I get so bored when people basically echo what I already think. I want to listen to people who say things I don't expect or haven't thought about before. But I'm in a very small minority.
I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.
The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.
I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.
Yeah, there are two main things here that are being conflated.
First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.
Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.
Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?
These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm, is a sign of how far off course this entire discourse has moved.
The problem is that misinformation has now become information, and vice versa. So who is anyone to decide what was misinformation back then, or now, or ever?
I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.
The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.
Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!
"What it thinks you want" is doing a lot of work here. Why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? Why would it give you exactly what you "want" instead of something aligned with some other value?
Canada has a tyrannical style government that has been censoring speech. I had a discussion recently with a liberal who was arguing that it's a good thing the government is censoring the speech of their political opponents. That free speech comes with consequences.
My argument: free speech is a limit on the government. Impose as many consequences as you please, just not with government power.
That's the problem here: Democrats were using government power to censor their political opponents, and they wouldn't have been able to do it without that power.
Without overdoing it, as a non-American not resident in the USA, it is very tempting to say "a problem of your own making". But in truth, we all own a slice of this, because the tendency to conduct state policy through mistruths in the media is all-pervasive.
So yes. This is a problem rooted in the USA. But it is still a problem, and it's a problem for everyone, everywhere, all the time.
On Reddit, you can get banned from some subreddits simply because you have posted in another completely different sub (regardless of the content of the post).
It's not even always politics, although that's certainly a major driving force. But then you have really stupid fights like two subs about the same topic banning each others' members.
Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.
The problem (?) with Reddit is that the users themselves have a lot more control over bans than on other social media, where it is the platform itself that does the banning. This makes bans much more arbitrary than even on Facebook et al.
Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.
That's the funny thing about Reddit. You can get banned trivially on the whim of a mod. I've been banned from multiple subreddits that I've never been to, simply because I posted on another subreddit that a mod found detestable.
My favourite: I'm trans/autistic. I was posting on r/autism, being helpful. OP never mentioned their pronouns, just that they had an OB-GYN and feminine problems. I replied helpfully, but I misgendered them and they flipped out. They permabanned me from r/asktransgender, even though I had never posted there, then left me a pretty hateful reply on r/autism. Reddit admins gave me a warning for hate toward trans people, despite my never having done any such thing, and despite being trans myself.
Right about the same time, r/askreddit had a thread about how hard it is not to misgender trans people. So I linked this thread, along with an imgur of the Reddit admin warning. It went to like 30,000 upvotes. The r/autism mods had to reply saying they didn't see any hate in my post and that people should stop reporting it.
I was banned because I was simply in a covid sub debating with the covid-deniers. The "powers-that-be" mods literally banned anyone on that particular sub from popular subs, some of which I hadn't even been in, ever. There was (is?) a cabal of mods on there that run the most popular subs like pics/memes/etc that definitely are power hungry basement dwellers that must not have a life.
Eh, I was banned from several major subreddits for simply posting in a conservative subreddit, even though my post was against the conservative sentiment.
I think you have a shallow understanding of both free speech and free will if you think this is the gotcha you seem to think it is. Why couldn't people have discussions about free will in a determinist universe? They could be woven by the laws of physics into having them.
As for free speech online, do you think there should be no limit to what can be said or shared online? What about pedophilia or cannibalism? Or, more relevantly, what about election denialism, insurrectionism, or dangerous health disinformation that is bound to make people act dangerously for themselves and society as a whole? The point is, free speech is never absolute, and where the line is drawn is an important conversation that must be had. There is no easy, objective solution to it.
Social media and a lack of scientific-research literacy are eventually going to prove fatal for modern society. Even with this Tylenol thing: on one side I have people who believe a study blindly, without reading that it doesn't take several important variables into consideration and that more studies are needed; on the other hand, I have people who didn't read the study at all saying it's impossible Tylenol could be causing anything because it's the only pain med pregnant women can take... a clear lack of understanding of how controlled trials work.
Same thing with the UFO "alien" video that was "shot down" by a Hellfire missile (most likely a balloon): people just automatically assume that because it was said in Congress it has to be true. Zero analysis of the footage whatsoever, no desire to seek analysis by an expert; nope, it must be an alien.
There is so much misinformation, so much lack of understanding, and so many people, from every side, with a complete and utter lack of understanding of how seemingly basic things work. I am afraid for the future.
But yeah, let's unban unscientific sources. Oh, and people who are okay with a literal coup against a democracy.
Prediction: nobody will be unbanned, because they'll all be found to have committed other bannable offenses. YouTube gives Trump a fake win while actually doing nothing.
More speech! The signal-to-noise ratio shifts, so access to information will become more difficult. More disinformation and outright nonsense will make it harder to get to the valuable stuff.
OK, let's see how that works!
I'm shocked at how often people flip-flop their arguments when discussing private entities censoring speech. It's frustrating because it feels like the only speech allowed today is right-wing commentary. When Democrats were in power, it seemed like only left-wing commentary was permitted. It's baffling that, despite our education, we're missing the point and stuck in this polarized mess.
I'm not sure why they would, it's kind of a dumb move. They aren't violating anyone's freedom of speech by banning disinformation and lies. It's a public service, those people can head on over to one of the many outlets for that stuff. This is definitely a black mark on YouTube.
There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.
The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.
No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.
In the article it mentions that Google felt pressured by the government to take the content down. Implying that they wouldn't have if it wasn't for the government. I wasn't accusing Google of anything, but rather the government.
Maybe it's not banning, but it doesn't feel right? Google shouldn't have been forced to do that and really what should've happened is that the people that spread genuine harmful disinformation, like injecting bleach, the ivermectin stuff or the anti-vax stuff, should've faced legal punishment.
It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.
I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.
We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.
Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, and all those who were banned off platforms (the actual experts) were right.
Misinformation, disinformation, terrorism, cancel culture, think of the children, fake news, national security, support our troops, and on and on. These will be used to justify censorship. Those who support it today may find out it's used against them tomorrow.
I'd like to think that if I were a YTer who got banned for saying something I believed in, I would at least have the dignity not to take my value back to the group that squelched me.
If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.
On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.
(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)
Flagging isn't the worst that can happen; you could also be rate-limited, which prevents you from answering in a discussion because "you are posting too fast".
Yeah, but in practice this isn't actually the case; people flag all the time simply because someone has a dissenting opinion, fitting none of the categories you mentioned.
They never forced them, and they certainly never said "that's a nice merger you got there, it would be awful if something were to happen to it" per the current policies of the US government.
They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.
Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.
The next Drain-o chug challenge "accident" is inevitable, at this rate.
Steelman argument is it's better to know what liars, bigots, and other naughty people are up to than push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.
IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a defacto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.
That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.
What is YouTube a 'near monopoly' in? Online video? Do you have any idea how much video there is online that's not on YouTube? They don't meet the legal definition of a monopoly.
Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.
Admittedly, Google was very heavy-handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actual qualified scientists engaging in scientific debate (say, arguing in favor of masks and airborne transmission in the early days), or even discussion that wasn't opposing the official stances.
Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring to by name a global multi-year situation that everyone who existed at the time went through. It's due to advertisers rather than government pressure, but still, insane.
Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places came to their senses a long time ago and walked back that heavy-handedness; I'm surprised this just happened.
Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.
Merriam Webster defines con man as "a person who tricks other people in order to get their money : con artist"
Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.
We both know that ads and sponsorships are a significant way influencers monetize their viewers.
All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.
First, let's dispense with the idea that anybody is a free speech absolutist. Nobody is. No site is. Not even 4chan is (ie CSAM is against 4chan ToS and is policed).
Second, some ideas just aren't worth distributing or debating. There's a refrain "there's no point debating a Nazi". What that means is there is a lot of lore involved with being a modern Nazi, a labyrinth of conspiracy theories. To effectively debate a Nazi means learning all that lore so you can dismantle it. There's no point. In reality, all you end up doing is platforming those ideas.
I'm actually shocked at how ostensibly educated people fall into the anti-vax conspiracy trap. Covid definitely made this worse but it existed well before then. Certain schools in San Francisco had some of the lowest child vaccination rates in the country.
As a reminder, the whole vaccine-autism "theory" originated with one person: Andrew Wakefield. He was a doctor in the UK who was trying to sell a vaccine, and the MMR vaccine was a direct competitor, so he simply made up the MMR link to autism. He lost his medical license because of it. But of course he found a receptive audience in the US. He is, and always was, a complete charlatan.
Likewise, the Covid anti-vax movement was based on believing random YouTube videos from laymen and, in many cases, an intentional ignorance in the most esteemed traditions of American anti-intellectualism. People who were confidently wrong about provably wrong things and had no interest in educating themselves. Some were grifters. Some were stupid. Many were both.
We had people who didn't understand what VAERS was (and is). We had more than 10 million people die of Covid, yet people considered the vaccine "dangerous" without any evidence of side effects, let alone death. As one example, you had people yelling "J'accuse!" at hints of myocardial inflammation from the vaccine. But you know what else causes myocardial inflammation? Getting Covid.
If you're excited by this move, it just further highlights that you have no idea what's going on and zero interest in the truth. What's happening here is big tech companies capitulating to the fringe political views of the administration, a clear First Amendment violation, to curry favor, get their mergers approved, get government contracts, and so on.
Regardless of your views on this or any other issue, you should care about social media sites capitulating in this way.
The comments on this post are just a graveyard of sadness.
The problem with those "ideas that just aren't worth distributing" is the usual one: who decides?
In my country of origin, you get called a Nazi simply for being opposed to the war of aggression that it is currently engaged in. In US, we have a long history of "terrorist" and "extremist" being similarly abused.
Do you think it's a good idea that this administration gets to decide what is and isn't acceptable speech? That's one of my points. So regardless of your positions on Covid and the 2020 you shouldn't celebrate this move because the government shouldn't have this kind of influence.
> Google's move to reinstate previously banned channels comes just over a year after Meta CEO Mark Zuckerberg said [...] that the Biden administration had repeatedly pressured Meta in 2021 to remove content related to COVID-19. "I believe the government pressure was wrong, and I regret that we were not more outspoken about it," Zuckerberg wrote in the August 2024 letter.
I'm sure Zuckerberg will say the same thing in 2029 too if the ruling party changes again. Until then, removing fact-checking and letting conspiracy theorists have their freedom of speech while suppressing voices critical of the current administration will make that change less likely...
Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.
I could go on. Feel free if you want to see more. :)
(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)
The reason you are asked for better sources is because, and let me say this slowly, anyone can post any crap on the internet without repercussions. Let's start with the one that references "Sasha Latypova". If I search her credentials, she earned a Master of Business Administration, which she used to co-found two companies, neither of them even adjacent to pharmacology; yet she is a "global PHARMA regulation expert". I'm sure the other people there won't have those issues, right?
I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism sceptical accounts?
> the government and/or a big tech company shouldn't decide what people are "allowed" to say.
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it so if we buy that, maybe it doesn't matter.
> What if they started banning tylenol-autism sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
> the government and/or a big tech company shouldn't decide what people are "allowed" to say
This throws out spam and fraud filters, both of which are content-based moderation.
"Nobody moderates anything" unfortunately isn't a functional option, particularly if the company has to sell ads.
As with others, I think your "and/or" between government and "big tech" is problematic.
I think government censorship should be strictly prohibited. I think "company" censorship is just the application of the first amendment.
Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.
I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.
We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.
Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."
My fear is that this is incredibly uncontroversial until it's not: when push comes to shove, we start having debates about what are "legitimate" concentrations of power (wealth), and how that legitimacy in itself lets us "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:
"all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."
This used to be, "social democracy" where these days the Democratic Party in the United States' motto is more, "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.
No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.
What you are arguing for is a dissolution of HN and sites like it.
Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?
> a big tech company shouldn't decide what people are "allowed" to say
On their platform, that’s exactly what they are entitled to do. When you type into the box in the Facebook app, that’s your speech. But unless the platform wants to add your contribution to their coherent speech product, they have every right to reject it.
Otherwise, the government is deciding what people can say, and you’d be against that, right?
Further, if I wanted to start a social media platform called thinkingtylenolcausesautismisstupid.com, wouldn’t restricting my right to craft my product defeat the whole point of my business?
Giving platforms the ability to moderate their output to craft a coherent speech product is the only reason we have multiple social networks with different rules, instead of one first-mover social network with no rules where everyone is locked in by network effects.
> There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think
But even that dumb stuff aside: there are two ways for a government to silence the truth: censorship and propaganda.
We've got LLMs now, letting interested parties (government or not) overwhelm everyone with an endless barrage of the worst, cheapest, lowest-quality AI slop, the kind that makes even AI proponents like me go "ah, I see what you mean about it being autocomplete." Even the worst of it by quality can still bury any bad news story just as effectively as censorship. Too much noise and not enough signal is already why I'm consuming far less YouTube these days, and why I gave up on Twitter when it was still called that.
And we have AI that's a lot better at holding a conversation than just the worst, cheapest, lowest quality AI slop. We've already seen LLMs are able to induce psychosis in some people just by talking to them, and that was, so far as we can tell, accidental. How long will it be before a developer chooses to do this on purpose, and towards a goal of their choice? Even if it's just those who are susceptible, there's a lot of people.
What's important is the freedom to share truth, no matter how uncomfortable, and especially when it's uncomfortable for those with power. Unfortunately, what we humans actually share the most is gossip, which is already a poor proxy for truth and is basically how all the witch hunts, genocides, and other moral-panic-induced horrors of history happened.
It is all a mess; it is all hard; don't mistake the proxy (free speech in general) for the territory (speak truth to power, I think?); censorship is simultaneously bad and the only word I know for any act which may block propaganda which is also bad.
My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.
Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.
It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?
3 replies →
Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.
In Bantam Books, Inc. v. Sullivan, SCOTUS held that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.
This was a publicly announced motivation for Elon Musk's purchase of Twitter, which is how we know the extent of this illegal behavior.
Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.
11 replies →
I think the feeling of silencing comes from it being a blacklist and not a whitelist.
If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.
If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.
1 reply →
I'd certainly consider an ISP refusing to route my packets as silencing. Is YouTube so different? Legally, sure, but practically?
3 replies →
At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. PHPbb) in preference to forums like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.
Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.
2 replies →
> My refusing to distribute your work is not "silencing."
That distinction is a relic of a world of truly public spaces used for communication— a literal town square. Then it became the malls and shopping centers, then the Internet— which runs on private pipes— and now it's technological walled gardens. Being excluded from a walled garden now is effectively being "silenced" the same way being excluded from the town square was when whatever case law you're thinking of was decided.
> No one owes you distribution unless you have a contract saying otherwise.
Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.
10 replies →
It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.
Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.
I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."
1 reply →
If you refuse to distribute some information, you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violation.
To me that sounds like a fair trade. You editorialize content; you are liable for all content, in every possible way.
Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s
You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.
7 replies →
[dead]
So you're saying that YouTube is a publisher and should not have Section 230 protections? They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.
4 replies →
I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.
It's often a lot better to just let kooks speak freely.
6 replies →
That didn't happen in a vacuum; there was also a _lot_ of money going into pushing anti vaccine propaganda, both for mundane scam reasons and for political reasons: https://x.com/robert_zubrin/status/1863572439084699918?lang=...
>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
10 replies →
> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Nah, the same grifters who stand to make a political profit of turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects, that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
2 replies →
It's more that people in general* connect to personal stories far more than impersonal factual data. It's easier to connect with people saying they had adverse reactions to a vaccine than with statistical data showing it's safer to get vaccinated than not. It's also easier to believe conspiracies: it's easier to think bad things happen due to the intent of bad people than that the world is a complex, hard-to-understand place with no intent behind what happens.
These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.
*And this skews more toward the less educated and less intelligent.
The issue is that we weren't/aren't even allowed to question the efficacy or long-term side effects of any vaccine.
> one of the only things that actually worked to stop people dying was the roll out of effective vaccines
"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)
"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)
"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)
"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)
[flagged]
20 replies →
Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.
I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas to spread unchecked.
If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.
Spot on. At least in the UK, anyone who thinks fake news will just "fizzle out" on social media hasn't been paying attention to the increasing frenzy being whipped up by the alt right on Twitter and Telegram, and consequences like the Southport riots.
It's poorly thought out logic. Everyone sees how messy the process of getting to a truth backed by data and science is, and how many mistakes can be made along the way, so somehow they conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.
Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.
4 replies →
I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.
2 replies →
Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.
4 replies →
Glad to see this, was going to make a similar comment.
People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".
1 reply →
Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.
The first amendment was written in the 1700s...
I feel that this is the right approach: the liability and toxicity of the platforms isn't due to them being communication platforms, because in most practical or technical ways they are not. They are deliberate behavior modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.
In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.
2 replies →
I think it made sense as a tactical choice in the moment, just like censorship during wartime. I don't think it should go on forever, because doing so is incompatible with a free society.
It didn't even make sense at the time. It cast everything under a cloud, suggesting the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew the vaccine had a negligible effect on spread. When well-intentioned platforms started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living with the consequences, where we have a worm-addled halfwit directing medicine for his child-rapist pal.
5 replies →
[dead]
IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale that Russian (or similar) troll factories can manufacture grassroots content or fund influencers is not something that can be combated at an individual level.
It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, often backing both sides of the same conflict.
The problem is the algorithm.
Content that makes people angry (extreme views) brings views.
Algorithms optimise for views -> people get recommended extreme views.
You can test this with a fresh account, it doesn't take many swipes on Youtube Shorts to get some pretty heinous shit if you pretend to be a young male to the algorithm.
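That feedback loop can be sketched as a toy ranker. This is a hypothetical illustration with made-up field names and weights, not a claim about how YouTube's actual system works; the "outrage bonus" stands in for the assumed correlation between anger and watch time:

```python
# Toy engagement-optimizing ranker (hypothetical; not any real platform's code).
# It applies no editorial judgment at all, yet extreme content still wins,
# because the objective (predicted engagement) rewards outrage.

def predicted_engagement(item):
    # Assumed model: baseline appeal plus an outrage bonus, standing in
    # for the observed link between anger-inducing content and watch time.
    return item["baseline_appeal"] + 2.0 * item["outrage"]

def rank_feed(items):
    # Pure engagement optimization: sort by predicted engagement, descending.
    return sorted(items, key=predicted_engagement, reverse=True)

feed = rank_feed([
    {"title": "calm explainer",   "baseline_appeal": 0.6, "outrage": 0.1},
    {"title": "nuanced debate",   "baseline_appeal": 0.7, "outrage": 0.2},
    {"title": "heinous hot take", "baseline_appeal": 0.3, "outrage": 0.9},
])
print([item["title"] for item in feed])
# -> ['heinous hot take', 'nuanced debate', 'calm explainer']
```

The point of the sketch: nothing here "censors" or "promotes" any viewpoint by hand; the ranking emerges entirely from the objective function, which is why changing the objective, rather than banning speech, is what this subthread is arguing about.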
The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other
>The best disinfectant is sunlight.
Is it? How does that work at scale?
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their "evidence" and truly believe they are right, no matter what the other person says.
3 replies →
> The best disinfectant is sunlight
Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.
The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.
Also, by the time you get your argument out many of the people who saw the piece you are refuting and believed it won't even see your argument. They've moved on to other topics and aren't going to revisit that old one unless it is a topic they are particularly interested in. A large number will have noted the original misinformation, such as some totally unsafe quack cure for some illness that they don't currently have, accepted it as true, and then if they ever find themselves with that illness apply the quack cure without any further thought.
The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.
Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.
And not letting the disease spread to begin with is better than any disinfectant.
>> The best disinfectant is sunlight.
Trump thought so too.
How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pluck their heads out of the sand and actually fight them.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
1 reply →
I agree, and I'm pro-vaccines but want the choice of if/when to vaccinate my kids. I believe there were election discrepancies, but I'm not sure the election was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.
You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.
1 reply →
I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. Only when people stop using tech they don't control, and stop supporting people or systems that do not have their best interests at heart, will we see real change.
There is no conspiracy. It’s all emergent behavior by large groups of uncoordinated dunces who can’t keep even the most basic of secrets.
1 reply →
I used to think like you, believing that, on average, society would expunge the craziness, but the last decade and the effect of social media and echo chambers made me see that I was completely wrong.
It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:
Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
3 replies →
> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.
1) They are public corporations and legal creations of the state, and they benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.
2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
3 replies →
Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...
In this case it wasn't a purely private decision.
"Where's the limiting principle here?"
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
Except many people don't roll their eyes at it, and that's exactly the problem. QAnon went from a meme on 4chan to a dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.
We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"
Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.
> But I think we have to realize silencing people doesn't work
It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.
- https://www.nature.com/articles/s41586-024-07524-8 - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1... - https://dl.acm.org/doi/abs/10.1145/3479525 - https://arxiv.org/pdf/2212.11864
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
> But I think we have to realize silencing people doesn't work.
Doesn't it though? I've seen this repeated like it's fact, but I don't think it's true. If you disallowed all mention of some randomly chosen conspiracy on YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.
Many of these things arrived out of nothing and can disappear just as easily.
It's basic human nature that simply hearing things repeated over and over embeds it into your consciousness. If you're not careful and aware of what you're consuming then that becomes a part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.
The issue for me is that kids are on YouTube, and I think there should be some degree of moderation.
> But I think we have to realize silencing people doesn't work.
We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.
For some reason, that didn't work either.
What is going to work? And what is your plan for getting us to that point?
Algorithmic Accountability.
People can post all sorts of crazy stuff, but the algorithms do not need to promote it.
Countries can require Algorithmic Impact Assessments and set standards of compliance with recommended guidelines.
1 reply →
These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...
I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.
To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
3 replies →
It works 99% of the time and you are overindexing on the 1% of the time it doesn’t to draw your conclusion.
Silencing people is the only thing that works is what I’ve learned on the internet.
Yes we should be allowed to bully idiots into the ground.
no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"
Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
I wish someone could have seen the eye roll I just performed reading this comment.
Silencing absolutely works! How do you think disinformation metastasized!?
Funny thing: several people who counter-responded and disagreed got grayed out (aka downvoted... as in censored).
Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.
The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time, people believe it.
That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The negative channels massively outweigh the positive / fact-based channels because the latter are "boring". It's the same reason your evening news is 80% deaths, corruption, thefts, politicians and taxes or other negative world news: it has been proven that people take in negative news much more. Clickbait titles that are negative draw people in.
There is a reason why holocaust denial is illegal in some countries: the longer some people can spew that, the more people actually start to believe it.
Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1-3 min clips. YouTube videos longer than 30 min are horrible for a youtuber's income because people simply do not have the attention span.
Why do we have seatbelt laws, speed limits, and other "control" over people? Because people left to their own devices can be extremely uncaring about their own family, others, even themselves.
Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family or others.
Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work... yeah.
We only need to look at platforms like X, when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. How a person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned so skeptical of everything vaccination. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is scientific study upon study of how red states in the US ended up with higher death rates over the same time periods. And yet, not a single person was ever charged for this... everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.
The fact that people have given up, and now accept letting those with often financial interests spew nonsense as much as they like. Well, it's "normal".
I weep for the human race because we are not going to make it.
> silencing people doesn't work
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
Slow down our algorithmic hell hole. Particularly around elections.
10 replies →
Have you heard about TikTok? And do you think governments' intelligence agencies are not inserting their agents in key positions at big tech companies?
Censorship is a tool to combat misinformation.
It's taking a sword to the surgery room where no scalpel has been invented yet.
We need better tools to combat dis/mis-information.
I wish I knew what that tool was.
Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?
4 replies →
You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.
If you were on the early Internet, you were self policing with the help of admins all the time. The difference was you had niche populations that had a stake in keeping the peace and culture of a given board
We broke those boundaries down though and now pit strangers versus strangers for clicks and views, resulting in daily stochastic terrorism.
Police the damn speech.
For inciting violence. Sure. Free speech isn’t absolute.
But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.
We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.
(And I believe those experts actually did about as best they could given the circumstances)
21 replies →
> Police the damn speech.
What happens when the “police” disagrees with and silences what you believe is true? Or when they allow the propagation of what you believe to be lies?
Who gets to decide what’s the truth vs. lies? The “police”?
1 reply →
Policing speech for civility or spam is very different than policing speech for content that you disagree with. I was on the early internet, and on the vast majority of forums policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.
Depends who is doing the policing. In this case, the White House was telling Google who to ban.
3 replies →
You've missed the point entirely.
It’s not if Google can decide what content they want on YouTube.
The issue here is that the Biden White House was pressuring private companies to remove speech that they otherwise would have hosted.
That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.
12 replies →
[flagged]
2 replies →
[dead]
[flagged]
[flagged]
[flagged]
No. This perspective is wrong in both directions: (1) it is bad medicine and (2) the medicine doesn't treat the disease.
If we could successfully ban bad ideas (assuming that "we" could agree on what they are), then perhaps we should. If the damage incurred by the banning of ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work, and it brings harm. Note that the keepers of "correct speech" doing the banning today (e.g. in Biden's day) can quickly become the ones being banned another day (e.g. Trump's).
It's true that drowning out the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of this same problem: you cannot really limit the supply, and to the extent you can, so can your opponents.
3 replies →
Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.
8 replies →
How can you say that banning Nazis has worked well considering everything so far this year?
8 replies →
[flagged]
2 replies →
The government created this problem when they enacted Section 230. This is at the root of the misinformation and disinformation... social media companies are not responsible for the harm.
The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for it. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.
The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.
1 reply →
It also turns into a talking point for them. A lot of these weird conspiracies would have died out naturally if some people hadn't tried so hard to shut them down.
For that matter why is it even such a crazy wild idea for anybody to dare to question medicines and motives from pharmaceutical companies? Or question elections?
Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.
The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably could be most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so they didn't get taken up so much among those groups. So there was really no need for this massive sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and be done with it.
With medicine there's pushback because the vast majority of the time someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G conspiracies (with actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.
With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate. This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result, without any substantiated evidence of fraud.
3 replies →
> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”
This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.
It's in their interests now to throw Biden under the bus. There may be truth to this, but I'm sure it's exaggerated for effect.
Worth noting that Trump directly threatened to put Zuckerberg in prison for life in relation to this: https://www.cnn.com/2024/08/31/politics/video/smr-trump-zuck...
I wouldn't trust any public statement from these companies once that kind of threat has been thrown around. People don't exactly want to go to prison forever.
It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.
For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.
To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.
16 replies →
They don't need a paper trail. Conservatives will believe anything damning they see about liberals. Just vague accusations or outright lies work plenty well to keep conservatives foaming at the mouth over imagined issues.
It's been known for years that the White House was pressuring Google on this. One court ordered them to cease temporarily. I wanted to link the article, but it's hard to find because of the breaking news.
At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.
[1] https://www.bbc.com/news/technology-52388586
> the WHO contradicted itself many times during the pandemic
Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.
As super low hanging fruit:
June 8, 2020: WHO: Data suggests it's "very rare" for coronavirus to spread through asymptomatics [0]
June 9, 2020: WHO expert backtracks after saying asymptomatic transmission 'very rare' [1]
0: https://www.axios.com/2020/06/08/who-coronavirus-asymptomati... 1: https://www.theguardian.com/world/2020/jun/09/who-expert-bac...
Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.
42 replies →
> Did they?
They said it was a fact that COVID is NOT airborne. (It is.)
Not they believed it wasn't airborne.
Not that data was early but indicated it wasn't airborne.
That it was fact.
In fact, they published fact checks on social media asserting that position. Here is one example on the official WHO Facebook page:
https://www.facebook.com/WHO/posts/3019704278074935/?locale=...
1 reply →
Some WHO reports were suggesting that lockdowns do more harm than good as early as late 2020.
Don't forget that they ban-hammered anyone who advanced the lab leak theory because a global entity was pulling the strings at the WHO. I first heard about Wuhan in January of 2020 from multiple Chinese nationals who were talking about the leak story they were seeing in uncensored Chinese media and adamant that the state media story was BS. As soon as it blew up by March, Western media was manipulated into playing the bigotry angle to suppress any discussion of what may have happened.
I believe having Trump as president exacerbated many, many things during that time, and this is one example. He was quick to start blaming the "Chinese", he tried to turn it into a reason to dislike China and Chinese people, because he doesn't like China, and he's always thinking in terms of who he likes and dislikes. This made it hard to talk about the lab leak hypothesis without sounding like you were following Trump in that. If we had had a more normal president, I don't think this and other issues would have been as polarized, and taking nuanced stances would have been more affordable.
My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.
Eventually I started seeing some serious discussion about how it might have been accidentally created through gain of function research.
3 replies →
I called this out in this thread and was immediately downvoted
> because a global entity was pulling the strings at the WHO'
excuse me I'm sorry what?
Because that is a bold claim to make. There is no proof of a lab leak, and the evidence points to the wet market as the source. There is a debate challenge out there, for $100k, to prove this. Check it out.
17 replies →
The united states also said not to buy masks and that they were ineffective during the pandemic.
Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance
Fauci was trying to prevent a run on masks, which he believed were needed by the health care workers. So he probably justified his lie to the US to himself because it was for the "greater good" (The ends justify the means is not my view BTW).
It turns out that masks ARE largely ineffective at preventing COVID infection. It's amazing how many studies have come up with vastly different results.
https://egc.yale.edu/research/largest-study-masks-and-covid-...
(Before you tell me that the story I cited above says the opposite, look at the effectiveness percentages they claim for each case.)
There's also this: https://x.com/RandPaul/status/1970565993169588579
7 replies →
Yeah they burned a lot of trust with that, for sure.
3 replies →
I think the problem is that some people discovered there is a profitable business model in spreading misinformation, so a trustworthy (even if not always right), non-malicious reference source of information might be needed.
But who watches the watchmen?
it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.
misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.
IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.
This just seems incredibly difficult. Even between people who are highly intelligent, educated, and consider themselves to be critical thinkers there can be a huge divergence of what "truth" is on many topics. Most people have no tools to evaluate various claims and it's not something you can just "teach kids". Not saying education can't move the needle but the forces we're fighting need a lot more than that.
I think some accountability for platforms is an important part of this. Platforms right now have the wrong incentives, we need to fix this. It's not just about "truth" but it's also about stealing our attention and time. It's a drug and we should regulate it like the drug it is.
As I recall from my school days, in Social Studies class there were a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.
11 replies →
Some of the worst examples of viral misinformation I've encountered were image posts on social media. They'll often include a graph, a bit of text and links to dense articles from medical journals. Most people will give up at that point and assume that it's legit because the citations point to BMJ et al. You actually need to type those URLs into a browser by hand, and assuming they go anywhere, leverage knowledge taught in university-level stats courses.
I spent several hours on one of these only to discover the author of the post had found a subtle way to misrepresent the findings and had done things to the graph to skew it further. You cannot expect a kid (let alone most adults) to come to the same conclusion through lessons on critical thinking.
> "IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons."
You just described a perfectly normal "Civics & Current Events" class in early grade-school back when / where I grew up. We were also taught how to "follow the facts back to the actual sources" and other such proper research skills. This was way back when you had to go to an actual library and look up archived newspapers on microfiche, and encyclopedias were large collections of paper books. Y'know... When dinosaurs still roamed the streets... ;)
> IMO we need to teach kids how to identify misinformation in school.
This is extremely difficult. Many of the people who thrive on disinformation are drawn to it because they are contrarian. They distrust anything from the establishment and automatically trust anything that appears anti-establishment. If you tell them not to trust certain sources that’s actually a cue to them to explore those sources more and assume they’re holding some valuable information that “they” don’t want you to know.
The dynamics of this are very strange. A cluster of younger guys I know can list a dozen different times medical guidance was wrong in history from memory (Thalidomide, etc), but when you fact check Joe Rogan they laugh at you because he’s a comedian so you can’t expect him to be right about everything. “Do your own research” is the key phrase, which is a dog whistle to mean find some info to discount the professionals but then take sources like Joe Rogan and his guests at face value because they’re not the establishment.
Two years is a pretty long ban for conduct that isn't even illegal.
Although if they got banned at the start of covid, during the Trump administration, then we're talking about five years.
No one owes them any distribution at all.
Absolutely. Especially when those election deniers become insurrectionists.
1 reply →
that is a two-way street
They went against a government narrative. This wasn't Google/Youtube banning so much as government ordering private companies to do so.
> wasn't Google/Youtube banning so much as government ordering private companies to do so
No, it was not. It’s particularly silly to suggest this when we have live example of such orders right now.
The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a "bully pulpit." But there were no orders, no credible threats, and plenty of companies didn't deplatform these folks.
17 replies →
And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.
[flagged]
1 reply →
[dead]
We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.
Silver lining in this is the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in crisis - and needing to pressure or ask more of private companies to do that. But also like that we can reflect back and go - maybe that didn’t work like what we wanted or maybe it was heavy-handed.
In many governments, the government can do no wrong. There are no checks and balances.
The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.
But hopefully we will still have a system that can have room for critique in the years to come.
It is scary how close we were to not being able to continue the conversation.
If anything, I think we're even closer. It feels like the current administration is stifling speech more than ever. It's open season on people who don't proudly wave the flag or correctly mourn Charlie Kirk. People who dare speak against Israel are being doxxed and in some cases hounded out of their jobs. Books are being taken off library shelves on the whim of a very few community members with objections. And all of it is getting a giant stamp of approval from the White House.
> Is our current White House administration a champion of free speech? Hardly.
So after January 22 2026, US leaves WHO and youtube users will be able to contradict WHO recommendations
It's odd. People on HN routinely complain about how Stripe or PayPal or some other entity banned them unfairly, and the overwhelming sentiment is that it was indeed unfair.
But when it comes to this thread, the sentiment mostly is banning is good and we should trust Google made the right choice.
Like the other commenter says, HN isn't a hive mind and doesn't always agree on things.
More than that... different situations usually require different conclusions.
[dead]
I think it would be wise to listen to Nobel Prize-winning journalist Maria Ressa of The Philippines, regarding unchecked social media.
"You and I, if we say a lie we are held responsible for it, so people can trust us. Well, Facebook made a system where the lies repeated so often that people can't tell."
"Both United Nations and Meta came to the same conclusion, which is that this platform Facebook actually enabled genocide that happened in Myanmar. Think about it as, when you say it a million times... it is not just the lie but also it is laced with fear, anger and hate. This is what was prioritized in the design and the distribution on Facebook. It keeps us scrolling, but in countries like Myanmar, in countries like Philippines, in countries where institutions are weak, you saw that online violence became real world violence."
"Fear, anger, hate, lies, salaciousness, this is the worst of human nature... and I think that's what Big Tech has been able to do through social media... the incentive structure is for the worst of who we are because you keep scrolling, and the longer you keep scrolling the more money the platform makes."
"Without a shared reality, without facts, how can you have a democracy that works?"
https://www.cnn.com/2025/01/12/us/video/gps0112-meta-scraps-...
"Beware of he who would deny you access to information for in his heart he dreams himself your master." - Commissioner Pravin Lal, U.N. Declaration of Rights
Full quote: "As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he deems himself your master."
(Alpha Centauri, 1999, https://civilization.fandom.com/wiki/The_Planetary_Datalinks... )
6 replies →
There is a difference between free flow of information and propaganda. Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.
I think you could make a reasonable argument that the algorithms that distort social media feeds actually impede the free flow of information.
34 replies →
That sounds great in the context of a game, but in the years since its release, we have also learned that those who style themselves as champions of free speech also dream themselves our master.
They are usually even more brazen in their ambitions than the censors, but somehow get a free pass because, hey, he's just fighting for the oppressed.
2 replies →
Not in the original statement, but as it is referenced here, the word 'information' is doing absolutely ludicrous amounts of lifting. Hopefully it bent at the knees, because in my book it broke.
You can't call the phrase "the sky is mint chocolate chip pink with pulsate alien clouds" information.
10 replies →
This is a fear of an earlier time.
We are not controlling people by reducing information.
We are controlling people by overwhelming them in it.
And when we think of a solution, our natural inclination to “do the opposite” smacks straight into our instinct against controlling or reducing access to information.
The closest I have come to any form of light at the end of the tunnel is Taiwan’s efforts to create digital consultations for policy, and the idea that facts may not compete on short time horizon, but they surely win on longer time horizons.
1 reply →
Beware of those who quote videogames and yet attribute them to "U.N. Declaration of Rights".
3 replies →
Beware he who would tell you that any effort at trying to clean up the post apocalyptic wasteland that is social media is automatically tyranny, for in his heart he is a pedophile murderer fraudster, and you can call him that without proof, and when the moderators say your unfounded claim shouldn't be on the platform you just say CENSORSHIP.
The thing is that burying information in a firehose of nonsense is just another way of denying access to it. A great way to hide a sharp needle is to dump a bunch of blunt ones on top of it.
Sure, great. Now suppose that a very effective campaign of social destabilisation propaganda exists that poses an existential risk to your society.
What do you do?
It's easy to rely on absolutes and pithy quotes that don't solve any actual problems. What would you, specifically, with all your wisdom do?
16 replies →
Is your point that any message is information?
Without truth there is no information.
That seems to be exactly her point, no?
Imagine an interface that reveals the engagement mechanism by, say, having an additional iframe. In this iframe an LLM clicks through its own set of recommendations picked to minimize negative emotions at the expense of engagement.
After a few days you're clearly going to notice the LLM spending less time than you clicking on and consuming content. At the same time, you'll also notice its choices are part of what seems to you a more pleasurable experience than you're having in your own iframe.
Social media companies deny you the ability to inspect, understand, and remix how their recommendation algos work. They deny you the ability to remix an interface that does what I describe.
In short, your quote surely applies to social media companies, but I don't know if this is what you originally meant.
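To make the thought experiment above concrete, here's a toy sketch of the two ranking objectives side by side. All of the scores and item names are invented for illustration; real recommenders are vastly more complex, and neither the scoring nor the field names reflect any actual platform's API:

```python
# Toy comparison of two feed-ranking objectives: the platform ranks by
# predicted engagement, while the hypothetical "LLM iframe" ranks by
# minimizing predicted negative emotion. Scores are made up.

items = [
    {"title": "rage bait", "engagement": 0.9, "neg_emotion": 0.8},
    {"title": "calm explainer", "engagement": 0.4, "neg_emotion": 0.1},
    {"title": "mild drama", "engagement": 0.6, "neg_emotion": 0.5},
]

# Engagement-first: highest predicted engagement at the top.
platform_feed = sorted(items, key=lambda x: x["engagement"], reverse=True)

# Wellbeing-first: lowest predicted negative emotion at the top.
llm_feed = sorted(items, key=lambda x: x["neg_emotion"])

print([x["title"] for x in platform_feed])
print([x["title"] for x in llm_feed])
```

Even this trivial sketch shows the point: the same inventory produces opposite orderings depending on the objective, and the user never gets to see, let alone choose, which objective is in force.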
Raising the noise floor of disinformation to drown out information is a way of denying access to information too.
Facebook speaks through what it chooses to promote or suppress and they are not liable for that speech because of Section 230.
2 replies →
We must dissent.
There's a special irony in this being the top comment on a site where everyone has a rightthink score and people routinely and flagrantly engage in "probably bad faith, but there's plausible deniability so you can't pin it on them" communication to crap on whatever the wrongthink on an issue is.
As bad as Facebook and its opaque algorithms that favor rage bait are, the kind of stuff you get by keeping score is worse.
>"You and I, if we say a lie we are held responsible for it, so people can trust us."
I don't know how it works in The Philippines, but in the USA the suggestion that media outlets are held responsible for the lies that they tell is one of the most absurd statements one could possibly make.
How about InfoWars?
5 replies →
Ever watched Fox News?
This is why China bans western social media.
Say what you will about the CCP, but it's naive to let a foreign nation have this much impact on your subjects. The amount of poison and political manipulation imported from these platforms is astronomical.
23 replies →
China reflexively bans anything that could potentially challenge Chairman Xi's unchecked authority and control over the information flow.
>unchecked social media
Passive voice. Who exactly is supposed to do the "checking" and why should we trust them?
Citizens. Through lawsuits. Currently we can't because of Section 230.
5 replies →
The problem is not the content, the problem is people believing things blindly.
The idea that we need to protect people from “bad information” is a dark path to go down.
I don't see it so much as protecting people from bad information as protecting people from bad actors, among whom entities like Facebook are prominent. If people want to disseminate quackery they can do it like in the old days by standing on a street corner and ranting. The point is that the mechanisms of content delivery amplify the bad stuff.
3 replies →
Censorship works both ways. When I tried speaking out against the violence and genocide perpetrated by Russia in Ukraine, I was shut down on LinkedIn.
Even here on HN, I was almost banned when I posted about the abduction of children by Russia (https://news.ycombinator.com/item?id=33005062) - the crime for which, half a year later, the ICC issued an arrest warrant against Putin.
You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then investigate and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper was standing behind the story, and citizens knew the standing of the newspaper in their community and how much credence to give the story based on that. Social media destroyed this process: now anyone can spread allegations at lightning speed, on a massive scale, without any evidence to back them up. This has to stop. We should return to the old way; it wasn't perfect, but it worked for hundreds of years. Repealing Section 230 would accomplish this.
17 replies →
[dead]
I can think of another hot-potato country that will get posts nerfed from HN and many others
That's the evil genius behind the general movement in the world to discredit democratic institutions and deflate the government.
Who would hold Meta accountable for the lies it helps spread and capitalizes upon, if not the government?
So by crippling democratic institutions and dwarfing the government to the point of virtual non-existence, all in the name of preserving freedom of speech and liberalism (and in the process subverting both concepts), elected leaders have managed to neutralize the only check standing between big corps and the misinformation machine that the social networks have become.
I think it would be even wiser to start by holding politicians, corporations, and government institutions to account for their unchecked lies, corruption, and fraud.
But no, yet again the blame is all piled on to the little people. Yes, it's us plebs lying on the internet who are the cause of all these problems and therefore we must be censored. For the greater good.
I have an alternative idea: let's first imprison or execute (with due process) the politicians, CEOs, generals, heads of intelligence and other agencies, and regulators found to have engaged in corrupt behavior, lied to the public, committed fraud or insider trading, fabricated evidence to support invading other countries, engaged in undeclared wars, ordered extrajudicial executions, colluded with foreign governments to hack elections, evaded taxes, etc. Then, after we try that out for a while, if it has not improved things, we could try ratcheting up the censorship of plebs. Now, one might argue that taking such measures would violate the rights of those people, but that is a sacrifice I'm willing to make. Since We Are All In This Together™, they would be willing to make that sacrifice too. And really, if they have nothing to hide then they have nothing to fear.
When you get people like Zuckerberg lying to congress, it's pretty difficult to swallow the propaganda claiming that it's Joe Smith the unemployed plumber from West Virginia sharing "dangerous memes" with his 12 friends on Facebook that is one of the most pressing concerns.
I don't think "breadwinner" is blaming the little people.
2 replies →
Exactly what are you trying to say about unbanning YouTubers here?
That it could be dangerous to readmit people who broadcast disinformation? The connection seemed pretty clear to me.
14 replies →
[dead]
Better article: https://www.businessinsider.com/youtube-reinstate-channels-b...
Actual letter: https://judiciary.house.gov/sites/evo-subsites/republicans-j...
Good editorial: https://www.businessinsider.com/google-meta-congress-letter-...
All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.
Yes, I know about the Charlie Kirk firings etc.
Ok, we've changed the URL above to that first link from https://www.offthepress.com/youtube-will-let-users-booted-fo.... Thanks!
Two articles that I found offered a well-rounded analysis:
- https://www.engadget.com/big-tech/youtube-may-reinstate-chan...
- https://arstechnica.com/gadgets/2025/09/youtube-will-restore...
The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people doing nothing wrong will be caught up in it.
This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.
What exactly constituted a violation of a COVID policy?
A lot of channels had to avoid even saying the word "Covid"; I only saw it return to use recently, around the end of last year. A variety of channels that shouldn't have been banned were, including some talking about Long Covid.
Now you see channels avoiding saying "Gaza" or "genocide". I haven't seen any proof that platforms are censoring content related to Israel, but I wouldn't be surprised.
Every opinion different from the opinion of "authorities". They documented it here:
https://blog.youtube/news-and-events/managing-harmful-vaccin...
From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.
[flagged]
28 replies →
Saying lab leak was true
According to Google's censorship algorithm: Michael Osterholm's podcast (Osterholm is a famous epidemiologist and was, at the time, a member of President Biden's own gold-star COVID-19 advisory panel).
https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))
Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.
My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.
8 replies →
[flagged]
[flagged]
Still curious if the White House made them pin those vaccine videos on the homepage, then disable dislikes.
It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.
The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
I've argued this before, but the algorithms are not the core problem here.
For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.
My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that; the issue is that most people don't. They want to hear what they already think.
So while I 100% support changing algorithms to encourage more diversity of views, I also think that as a society we need to question why people don't naturally want to listen to more perspectives. Personally, I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very small minority.
I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.
The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.
Algorithms that reverse the damage by providing opposing opinions could be implemented.
2 replies →
I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.
2 replies →
Yeah, there are two main things here that are being conflated.
First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.
Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.
Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?
These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm, is a sign of how far off course this entire discourse has moved.
The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever.
I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.
[dead]
The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.
Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!
2 replies →
"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?
1 reply →
Canada has a tyrannical-style government that has been censoring speech. I had a discussion recently with a liberal who was arguing that it's a good thing the government is censoring the speech of their political opponents: that free speech comes with consequences.
My argument: free speech is a limit on the government. Impose as many consequences as you please, but not with government power.
That's the problem here: Democrats were using government power to censor their political opponents, and they wouldn't have been able to do it without that power.
Without overdoing it, as a non-American not resident in the USA, it is so very tempting to say "a problem of your own making". But in truth, we all have a slice of this, because the tendency to conduct state policy via untruths in the media is all-pervasive.
So yes. This is a problem rooted in the USA. But it is still a problem, and it's a problem for everyone, everywhere, all the time.
I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.
I was banned because a moderator misunderstood my single word answer to another post.
Reddit bans aren't an indicator of anything.
Whenever someone says "i was banned from ..." take what they say with a huge grain of salt.
On Reddit, you can get banned from some subreddits simply because you have posted in another completely different sub (regardless of the content of the post).
It's not even always politics, although that's certainly a major driving force. But then you have really stupid fights, like two subs about the same topic banning each other's members.
1 reply →
Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.
2 replies →
The problem (?) with Reddit is that the users themselves have a lot more control over bans than on other social media, where it is the platform itself that does the banning. This makes bans much more arbitrary than even on Facebook et al.
1 reply →
Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.
That's the funny thing about Reddit: you can get banned trivially on the whim of a mod. I've been banned from multiple subreddits that I've never visited, simply because I posted on another subreddit that their mod found detestable.
My favourite: I'm trans/autistic. I was posting on r/autism, being helpful. OP never mentioned their pronouns, just that they have an OB-GYN and feminine problems. I replied helpfully, but I misgendered them and they flipped out. They permabanned me from r/asktransgender, even though I had never posted there, then left me a pretty hateful reply on r/autism. Reddit admins gave me a warning for hate toward trans people, despite me never doing any such thing and being trans myself.
Right around the same time, r/askreddit had a thread about it being hard not to misgender trans people. So I linked this story there, along with an imgur of the Reddit admin warning. It went to like 30,000 upvotes. The r/autism mods had to reply saying they didn't see any hate in my post and that people should stop reporting it.
I was banned simply because I was in a covid sub debating with the covid-deniers. The "powers-that-be" mods literally banned everyone in that particular sub from popular subs, some of which I had never even posted in. There was (is?) a cabal of mods there running the most popular subs, like pics/memes/etc., who are definitely power-hungry basement dwellers with no life.
Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics
1 reply →
Eh, I was banned from several major subreddits simply for posting in a conservative subreddit, even though my post went against the conservative sentiment.
2 replies →
I was banned from a subreddit and then Reddit itself for intentionally and egregiously violating several of the rules
Arguing online about the merits of free speech is as paradoxical as having discussions about free will.
I think you have a shallow understanding of both free speech and free will if you think this is the gotcha you seem to think it is. Why couldn't people have discussions about free will in a determinist universe? They could be weaved by the laws of physics into having them.
As for free speech online: do you think there should be no limit to what can be said or shared? What about pedophilia or cannibalism? Or, more relevantly, what about election denialism, insurrectionism, or dangerous health disinformation that is bound to make people act dangerously, for themselves and for society as a whole? The point is, free speech is never absolute, and where the line is drawn is an important conversation that must be had. There is no easy, objective solution to it.
There is an evolution from Luther to the Internet. But let's not pretend we would know a reversal when we see it.
I also cringed at your list.
"what about election-denialism"
I don't think I can help you.
3 replies →
Social media and a lack of scientific research literacy are eventually going to prove fatal for modern society. Even with this Tylenol thing: on one side I have people who believe the study blindly, without reading that it doesn't take several important variables into consideration and that more studies are needed; on the other, I have people who didn't read the study at all and say it's impossible Tylenol could be causing anything, because it's the only pain med pregnant women can take. A clear lack of understanding of how controlled trials work.
Same thing with the UFO "alien" video that was "shot down" by a Hellfire missile (most likely a balloon): people just automatically assume that because it was said in Congress it has to be true. Zero analysis of the footage, no interest in seeking analysis by an expert; nope, it must be an alien.
There is so much misinformation, so much lack of understanding, and so many people, on every side, with a complete and utter lack of understanding of how seemingly basic things work. I am afraid for the future.
But yeah, let's unban unscientific sources. Oh, and people who were okay with a literal coup against a democracy.
Prediction, nobody will be unbanned because they'll all be found to have committed other bannable offenses. Youtube gives Trump a fake win while actually doing nothing.
More speech! The signal-to-noise ratio shifts, so access to information will become more difficult. More disinformation and outright nonsense will make it harder to get to the valuable stuff. OK, let's see how that works!
YouTube is like old-school television at a different scale; they have to answer to politics and society. Our videos are their lineup.
They should bring back the content too. When history books are written the current state of things is misleading.
The world is going backwards rapidly. The worst people are once again welcomed into our now-crumbling society.
I'm shocked at how often people flip-flop their arguments when discussing private entities censoring speech. It's frustrating because it feels like the only speech allowed today is right-wing commentary. When Democrats were in power, it seemed like only left-wing commentary was permitted. It's baffling that, despite our education, we're missing the point and stuck in this polarized mess.
In other news (unrelated, I'm sure):
"DOJ aims to break up Google’s ad business as antitrust case resumes"
https://arstechnica.com/gadgets/2025/09/google-back-in-court...
They have a desperate need for false-victimhood.
Without their claim to victimization, they can't justify their hatred.
I'm not sure why they would, it's kind of a dumb move. They aren't violating anyone's freedom of speech by banning disinformation and lies. It's a public service, those people can head on over to one of the many outlets for that stuff. This is definitely a black mark on YouTube.
There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.
The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.
No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.
The article mentions that Google felt pressured by the government to take the content down, implying that they wouldn't have if it weren't for the government. I wasn't accusing Google of anything, but rather the government.
Maybe it's not banning, but it doesn't feel right. Google shouldn't have been forced to do that; really, what should've happened is that the people who spread genuinely harmful disinformation (the injecting-bleach stuff, the ivermectin stuff, the anti-vax stuff) should've faced legal punishment.
What if the government is the source of misinformation?
It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.
I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.
We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.
It certainly happens; we're currently flooded with it from the current regime:
- Tylenol causes autism
- Vaccines cause autism
- Vaccines explode kids hearts
- Climate change is a hoax by Big Green
- "Windmill Farms" are more dangerous for the environment than coal
- I could go on but I won't
Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, all those that we banned off platforms (the actual experts) were right
[flagged]
Even more misinformation, Russian propaganda and bots to sift through in the recommendations and comments, got it!
Misinformation, disinformation, terrorism, cancel culture, think of the children, fake news, national security, support our troops, and on and on. These will be used to justify censorship. Those who support it today may find out it's used against them tomorrow.
I'd like to think that if I were a YouTuber who got banned for saying something I believed in, I would at least have the dignity not to take my value back to the group that squelched me.
...but I'm not a YouTuber.
It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.
The number of flagged, hidden comments here from the supposedly anti-censorship side is almost funny.
If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.
On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.
(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)
https://news.ycombinator.com/newsguidelines.html
Flagging isn't the worst that can happen; you can also be rate-limited, which prevents you from answering in a discussion because "you are posting too fast".
I know what I'm talking about.
3 replies →
Yeah, but in practice this isn't actually the case; people flag all the time just for a dissenting opinion, fitting none of the categories you mentioned.
6 replies →
[dead]
tl;dr: The Biden administration has been caught using the government to force Twitter, YouTube, and Facebook to censor its political enemies.
They never forced them, and they certainly never said "that's a nice merger you got there, it would be awful if something were to happen to it" per the current policies of the US government.
Yes they did https://www.npr.org/2021/07/22/1019346177/democrats-want-to-...
leftism is truly an inversion of reality - the current govt is not outsourcing censorship to do an end run around the 1A; the Biden admin did.
[dead]
[dead]
So absolutely no one involved will face any repercussions. So they will all do it again at the next opportunity.
> they will all do it over again at the next opportunity
Future tense?
They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.
yeah, 2025 in a nutshell. The year of letting all the grifts thrive.
What should the punishment be for having opinions the govt disagrees with?
Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.
The next Drain-o chug challenge "accident" is inevitable, at this rate.
2 replies →
Notoriety
1 reply →
[dead]
[flagged]
[flagged]
Far too many people are free speech hypocrites.
who doesn't get free speech?
[flagged]
5 replies →
[flagged]
[flagged]
[flagged]
Steelman argument is it's better to know what liars, bigots, and other naughty people are up to than push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.
IMO we should not let private platforms become near monopolies, and certainly not without regulation, since they become a defacto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.
That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.
7 replies →
What is YouTube a "near monopoly" in? Online video? Do you have any idea how much video there is online that's not on YouTube? They don't meet the legal definition of a monopoly.
People change/make mistakes. Permanent bans are rarely a good idea.
Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.
1: https://www.fortnite.com/news/fortnite-anti-cheat-update-feb...
1 reply →
[dead]
Admittedly, Google was very heavy-handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actual qualified scientists engaged in scientific debate (say, arguing in favor of masks and the airborne-transmission theory in the early days), or even discussion that wasn't opposing the official stances.
Somewhat related: it's pretty insane how even to this day YouTubers have to avoid referring by name to a global multi-year situation that everyone who existed at the time went through. It's due to advertisers rather than government pressure, but still, insane.
Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places came to their senses long ago and walked back that heavy-handedness; I'm surprised this just happened.
Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.
Merriam Webster defines con man as "a person who tricks other people in order to get their money : con artist"
Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.
> trying to get viewers to send them money.
They were trying to get viewers to get money. It's an important distinction.
We both know that ads and sponsorships are a significant way influencers monetize their viewers.
All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.
First, let's dispense with the idea that anybody is a free speech absolutist. Nobody is. No site is. Not even 4chan is (ie CSAM is against 4chan ToS and is policed).
Second, some ideas just aren't worth distributing or debating. There's a refrain "there's no point debating a Nazi". What that means is there is a lot of lore involved with being a modern Nazi, a labyrinth of conspiracy theories. To effectively debate a Nazi means learning all that lore so you can dismantle it. There's no point. In reality, all you end up doing is platforming those ideas.
I'm actually shocked at how ostensibly educated people fall into the anti-vax conspiracy trap. Covid definitely made this worse but it existed well before then. Certain schools in San Francisco had some of the lowest child vaccination rates in the country.
As a reminder, the whole vaccine-autism "theory" originated from one person: Andrew Wakefield. He was a doctor in the UK who was trying to sell a vaccine of his own. The MMR vaccine was a direct competitor, so he just completely made up the MMR link to autism. He lost his medical license because of it. But of course he found a receptive audience in the US. He is and always was a complete charlatan.
Likewise, the Covid anti-vax movement was based on believing random YouTube videos from laymen and, in many cases, on intentional ignorance in the most esteemed traditions of American anti-intellectualism: people who were confidently wrong about provably wrong things and had no interest in educating themselves. Some were grifters. Some were stupid. Many were both.
We had people who didn't understand what VAERS was (and is). We had more than 10 million people die of Covid, yet people considered the vaccine "dangerous" without any evidence of side effects, let alone death. As one example, you had people yelling "J'accuse!" at hints of myocardial inflammation from the vaccine. But you know what else causes myocardial inflammation? Getting Covid.
If you're excited by this move, it just further highlights that you have no idea what's going on and zero interest in the truth. What's happening here is big tech companies capitulating to the fringe political views of the administration, a clear First Amendment violation, to curry favor, get their mergers approved, get government contracts, and so on.
Regardless of your views on this or any other issue, you should care about social media sites capitulating in this way.
The comments on this post are just a graveyard of sadness.
The problem with those "ideas that just aren't worth distributing" is the usual one: who decides?
In my country of origin, you get called a Nazi simply for opposing the war of aggression it is currently engaged in. In the US, we have a long history of "terrorist" and "extremist" being similarly abused.
Do you think it's a good idea that this administration gets to decide what is and isn't acceptable speech? That's one of my points. So regardless of your positions on Covid and the 2020 election, you shouldn't celebrate this move, because the government shouldn't have this kind of influence.
1 reply →
> Google's move to reinstate previously banned channels comes just over a year after Meta CEO Mark Zuckerberg said [...] that the Biden administration had repeatedly pressured Meta in 2021 to remove content related to COVID-19. "I believe the government pressure was wrong, and I regret that we were not more outspoken about it," Zuckerberg wrote in the August 2024 letter.
I'm sure Zuckerberg will say the same thing in 2029 too if the ruling party changes again. Until then, removing fact-checking and letting conspiracy theorists have their freedom of speech while suppressing voices critical of the current administration will make that change less likely...
Is there any political censorship scheme at this large a scale in modern US history?
Yes: the way the US government, big business, and the media, specifically Hollywood, colluded during the Cold War.
So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.
I think hardware- and IP-level bans should be banned.
I know that some services do this in addition to account bans.
Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
If you actually cared about CSAM, you would want those posting it to self-incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and gives them the opportunity to hide. Removing only the offending content and alerting the authorities is the appropriate thing for a website like YouTube to do.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.
11 replies →
So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source
First of all, you can't separate a thing's content from the platform it's hosted on? Really?
Second of all, this is why
I'll just go do this again, and if you flag me, it's on you; you have no standing to do it (the internet is supposed to be democratic, remember?)
https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...
https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...
https://rumble.com/vt62y6-covid-19-a-second-opinion.html
https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...
I could go on. Feel free if you want to see more. :)
(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)
The reason you are asked for a better source is because, and let me say this slowly, anyone can post any crap on the internet without repercussions. Let's start with the one that references "Sasha Latypova". If I search her credentials, she earned a Master of Business Administration, which she used to co-found two companies, neither of which is even adjacent to pharmacology, yet she is a "global PHARMA regulation expert". I'm sure the other people there won't have those issues, right?
"And let me say this slowly": no point in typing this out; it is condescending to the parent poster.
1 reply →
Boo