
Comment by LorenPechtel

4 years ago

The real problem here is companies are not cops and should quit acting like cops.

The instant a company has evidence of a possible crime, it should be required to hand that evidence over to the police and then take no further action beyond preventing its distribution or the like.

This is not just Google's AI goofing up on what constitutes CSAM (and given the witch hunt around such things, it sounds like Google was being reasonable in informing the police). It's also colleges expelling "rapists" without evidence, and so on. The accused never gets anything resembling a fair trial, but since it's not the government doing it, that apparently doesn't matter: there are no repercussions for ruining lives on the basis of completely incompetent investigations.

They may not be cops but they’ve created an enabling technology. They’re also the only ones who could access the data and recognize its potential for abuse. It’s not an easy situation.

But clearly if they’re referring out to law enforcement, they need to close the loop on that and take responsibility when they get it wrong.

  • This is about more than just Google and CSAM. We have a more general problem with companies playing cop, and generally doing a terrible job of it. This case is simply one example of the problem; we should be focusing on the bigger picture.