
Comment by Canada

1 year ago

Terrorist recruitment and selling drugs are conduct, and anyone engaging in that illegal conduct can, and should, be prosecuted.

The problem I have is with requiring the chat service to police that, or with making its operators liable for the illegal conduct of its users.

It shouldn't be up to Durov to prove he did or didn't do anything; it's up to France to prove that he or his company actively participated in such conduct. And no, people using the service to engage in illegal acts isn't nearly enough, any more than Google's CEO should be liable for a drug dealer using Maps to navigate to the drug deal location, or Venmo should be liable for the buyer paying the seller with it.

The reason it's worth defending this "hill" is that allowing governments to use censorship as a convenient means of solving these problems always leads to more control and restrictions that infringe on the legitimate rights of everyone.

I understand the appeal of these tactics. Since we know that terrorist groups operating abroad will use chat services to incite locals to commit violence, it's tempting to search the chat service and stop that from happening by censoring the communication, preventing the radicalization. Since we know that drug sellers organize the sale of the contraband using the chat app, it's tempting to search the chat app and censor that speech, thus preventing the buyer from learning where to meet the seller. Or wait for enough speech to cross the line into conduct and then arrest them for it. Sounds great. If it would work, I'd support it.

The problem is that it won't work, and the only way to "fix" it will be to push more and more surveillance and control. It's already being pushed. Look at this Chat Control nonsense. Do you support that?

So what I'm saying is: let's just recognize that it's a basic human right for people to communicate freely, and that operators of communication services shouldn't be held liable for the actions of their users.