Comment by Canada
1 year ago
Let's stop repeating this word "moderate" when what we're talking about is censorship.
Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will.
Durov has been arrested for refusing to implement censorship, not for anything concerning moderation.
The only difference between "moderation" and "censorship" is whether you like the policy or not.
No, it's definitely not. Moderation means I can run my group how I want, you can run your group how you want, and others can decide if they want to participate in either of our groups or start their own groups.
Censorship is when someone else dictates how we can run our respective groups.
I fail to see the difference. I think the term "moderation" has become popular mainly because everyone is aware of the negative connotations of the term "censorship". Re-labelling acts of censorship as "moderation" allows you to frame it as a responsible and morally justified practice. Now you're no longer quelling dissenting voices like a fascist dictator, no, you're merely "protecting the community", "combating misinformation" or whatever they call it these days.
Basically, it's "moderation" when we do it but it's "censorship" when the Chinese communist party does it.
I don't know how much you have used Telegram, but it's riddled with absolutely vile stuff.
You open the "Telegram nearby" feature anywhere and it's full of people selling drugs and running scams. When I mistyped something in the search bar I ended up in some ISIS propaganda channel (which was straight up calling for violence/terrorism). All of this in unencrypted public groups/channels, of course (I'm pretty sure it's the same with CP, although I'm afraid to check for obvious reasons).
I think there is a line between "protecting free speech" and being complicit in crime. This line has been crossed by Telegram.
I use it a lot, and I run some large groups on it. I don't see any of that stuff, I've never gone looking for it, and I'm not even sure how to look for it. Can you tell me some examples of what to search for to see what you're talking about?
It's not specific to Telegram; it's humans, who are riddled with vile stuff.
Just turn off any discovery and suggestion features.
> Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will
What a weird hill to die on, given the whole context of this situation.
Do you see public recruitment of people into terrorist cells as free speech? Do you see publicly selling drugs as free speech? It isn't about censorship at all; it's about actual *illegal* activity.
Now it's up to Durov and his lawyers to prove that Telegram actually dealt with that. So far France doesn't seem convinced.
Terrorist recruitment and drug dealing are conduct, and anyone engaging in that illegal conduct can, and should, be prosecuted.
The problem I have is with requiring the chat service to police that or making its operators liable for the illegal conduct of its users.
It shouldn't be up to Durov to prove he did or didn't do anything; it's up to France to prove that he or his company actively participated in such conduct. And no, people using the service to engage in illegal acts isn't nearly enough, any more than Google's CEO should be liable for a drug dealer using Maps to navigate to the deal location, or Venmo should be liable for the buyer paying the seller with it.
The reason it's worth defending this "hill" is because allowing governments to use censorship as a convenient means of solving these problems always leads to more control and restrictions that infringe on the legitimate rights of everyone.
I understand the appeal of these tactics. Since we know that terrorist groups operating abroad will use chat services to incite locals to commit violence, it's tempting to search the chat service and stop that from happening by censoring the communication, preventing the radicalization. Since we know that drug sellers organize the sale of the contraband using the chat app, it's tempting to search the chat app and censor that speech, thus preventing the buyer from learning where to meet the seller. Or wait for enough speech to cross the line into conduct and then arrest them for it. Sounds great. If it would work, I'd support it.
The problem is that it won't work, and the only way to "fix it" will be to push more and more and more surveillance and control. It's already being pushed. Look at this Chat Control nonsense. Do you support that?
So what I'm saying is: let's just recognize that it's a basic human right for people to communicate freely, and that operators of communication services shouldn't be held liable for the actions of their users.
Yes, but let's also be clear that some forms of speech censorship are widely and broadly supported in public, 'town square', or broadcast media situations: things like child porn, personal threats, calling for or organizing violence, hate speech, etc. Laws and social acceptance of this kind of censorship, of course, differ in different regions.
Hacker News may 'moderate' illegal content on this website, but they don't have a choice in the matter; US or state authorities will shut them down if they do not, so it's technically censorship. Your view on whether this is good or bad will depend on many factors, one of which may be how you view the legal structure of your government, which is substantially different in France, the US, or Dubai (where Telegram is located).
As is mentioned in the article, Telegram is not simply a 'secure messaging app'. They also serve a role similar to Facebook, Twitter, Instagram, or TikTok: they host publicly accessible channels and public group chats with thousands of members, all of which are (apparently) unencrypted and accessible to the Telegram company. It may be reasonable (both legally and socially) to expect a company that has knowledge of public, illegal speech to take steps to remove that content from its platform.
And Durov, by choosing to be a media company and not to E2E encrypt all of his users' private communications, has walked right into a situation where he needs to abide by local laws moderating/censoring illegal content, everywhere.
> Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
What do you mean by users voluntarily submitting to these policies? This distinction seems key to your argument, but I don't see what alternative to submitting I have here, which would make it involuntary, right?
No, you miss the point.
If HN decided to ban all posts about Donald Trump that is moderation. Users voluntarily submit to this policy by participating in the site, and if they do not, they will be banned.
If the State of California required that all web sites run from the state ban all posts about Donald Trump, that is censorship.
Moderation is "your house, your rules" while censorship is someone else imposing their rules in your house.
Do you see what I'm saying? When France is talking about "moderation" of Telegram, what they actually mean is censorship.
A pedantic point, and one that typically argues around the real point. When somebody egregiously violates the norms of public discourse with rabble-rousing, slander, deliberate lies, and obfuscation, it's reasonable to limit their message's reach with some rules. When they continue despite warnings, then something more has to be done.
Call it what you like; this all had a history and a progression. Not arbitrary or unfair.
Thanks, I see and agree.
Is removal of CSAM moderation or censorship?
It depends on whether the parties to the communication want that or not.
So let's say a few child molesters create a chat service and use it to send the worst, most horrible child pornography amongst themselves. Removing it is censorship, not moderation.
Look, I'm not trying to argue for legalization of child pornography here. That is illegal contraband, full stop. The intent of my comment is to say, "Let's just call it what it is."
I think the overwhelming consensus is that child pornography is so horrible that mere possession of it must be CENSORED.
I'm not arguing that censorship is always wrong. For instance, I don't want to see public billboards of graphic sex or violence. I think it's good that we censor that, so that we aren't forced to look at things like that when we don't want to.
What is bothering me is that proponents of censorship, and especially certain proponents who want to use it as a tool to suppress ideas they don't like, have recently started using the word "moderation" in order to sneak their plans into policy without raising objections. The reason is that when we hear the word "censorship" we immediately think, "Whoa, hold on there, censorship is very harsh, let's take a hard look and make sure this is serious enough that resorting to censorship is justified and appropriate", whereas when we hear the word "moderation" we think, "Of course, we all appreciate someone deleting the spam and trolls who annoy us", and we're less likely to think critically about exactly what kind of expression is being legally prohibited.