Comment by cameronh90
7 months ago
Zero chance it will be enforced like this.
The UK has lots of regulatory bodies, and they all work in broadly the same way. Provided you do the bare minimum to comply with the rules as defined in plain English by the regulator, you won't be fined or held personally liable. It's only companies that either repeatedly or maliciously fail to put basic measures in place that end up being prosecuted.
If someone starts maliciously uploading CSAM and reporting you, then provided you can demonstrate you're taking whatever measures Ofcom recommends for the risk level of your business (e.g. deleting reported threads and reporting them to the police), you'll be absolutely fine. If anything, the regulator will likely prove to be quite toothless.
Hopefully the new law is enforced sensibly, i.e., with considerable leniency towards smaller defendants, but hoping for that is a terrible strategy. The risk is certainly not zero as you claim -- all it takes is one high-profile case where leniency results in some terrible outcome (e.g., child abuse) making the news, and the government employees responsible for enforcement will snap to a policy of zero tolerance.
> provided you can demonstrate you're taking whatever measures are recommended by Ofcom
That level of moderation might not be remotely feasible for a sole operator. And yes, there's a legitimate social question here: should we as a society permit sites/forums that cannot be moderated to that extent? But the point I'm trying to make is not whether the answer to that question is yes or no; it's that the consequence of this Act is that no sensible individual or small group will now take on the risk of running such a site.
Yeah, we’ve seen how competently and fairly laws are enforced in the UK with the Post Office scandal.