
Comment by _fjg8

4 months ago

The opposite is true. The new law makes it considerably more risky for large companies because it is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk to small websites is unintended, and the requirements are very achievable for them. The law is intended for and will be used to eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are to see any small websites suffer.

A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

> A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

Facebook can actually train AI to detect CSAM, and is probably already doing so in cooperation with NCMEC and similar organisations/authorities across the world.

Your average small website? No chance. Actively obtaining training material is seriously illegal everywhere, and keeping material that others upload is just as bad in most jurisdictions.

The big guys get the toys, the small guys have to worry all the goddamn time if some pedos are going to use their forum or whatnot.

No, that is not how it works. Large companies can afford compliance costs. Smaller ones can't.

  • I believe file uploading services like Cloudinary have this capability already. It does have a cost, but it exists.

    • But you shouldn't need to use file uploading services! File upload doesn't require additional services; it has been a well-understood part of HTTP for decades. You can do file upload using normal web form submission in your web server/CMS/Rails/Laravel/CGI program without paying a monthly subscription to some service at an exorbitant markup (see the sketch below).

      Also, those filters are obviously imperfect. Remember the man who got his Google account terminated because he took a photo of his son's rash to send to his doctor? Pedo alert, pedo alert, a child is naked in a photo. My parents must be pedos too, they took a photo of me sitting in the bath when I was a toddler. Call the police.
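      Here is the sketch referred to above: a plain multipart/form-data upload handled directly by the application. Flask is used purely as an example of an ordinary web framework, and the upload directory and size cap are arbitrary choices; nothing in it depends on a paid upload service.

      ```python
      # Minimal sketch: a file upload handled directly by the application.
      # The "uploads/" directory and 5 MB cap are arbitrary example choices.
      from pathlib import Path

      from flask import Flask, request
      from werkzeug.utils import secure_filename

      app = Flask(__name__)
      UPLOAD_DIR = Path("uploads")
      UPLOAD_DIR.mkdir(exist_ok=True)
      app.config["MAX_CONTENT_LENGTH"] = 5 * 1024 * 1024  # reject oversized bodies

      @app.get("/")
      def form():
          # The client side is an ordinary HTML form: standard HTTP, nothing more.
          return (
              '<form method="post" action="/upload" enctype="multipart/form-data">'
              '<input type="file" name="upload">'
              '<button type="submit">Send</button>'
              '</form>'
          )

      @app.post("/upload")
      def upload():
          f = request.files.get("upload")
          if f is None or f.filename == "":
              return "no file supplied", 400
          name = secure_filename(f.filename)  # never trust the client's filename
          f.save(UPLOAD_DIR / name)
          return f"stored {name}", 201
      ```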

  • What are the compliance costs for this law that would apply to a small independent forum?

    • Have you run a forum in, say, the last decade? The number of spam bots constantly posting links to everything from scams to pints to guns is immense - and no, captchas don’t solve it.

    • Many of the provisions of the act apply to all user-to-user services, not just Schedule 1 and Schedule 2 services.

      For example, the site must have an "illegal content risk assessment" and a "children’s risk assessment". And the children's risk assessment is a four-dimensional matrix of age groups, types of content, ways of using the service and types of harm (a rough sketch of that bookkeeping is at the end of this comment). And it's got to be updated before making any "significant" change to any aspect of a service’s design or operation. It also makes it mandatory to have terms of service, and to apply them consistently. The site must have a content reporting procedure, a complaints procedure, and maintain written records.

      Now obviously the operator of a bicycling forum might say "eh, let's ignore all that, they probably don't mean us".

      But if you read the law and interpret its words literally, a bicycling forum is a user-to-user service, and a public forum is almost certain to be read by children from time to time.
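      For a sense of scale, here is the rough sketch mentioned above of what that four-dimensional matrix implies in practice. The axis values are illustrative placeholders, not the categories actually defined in the Act or in Ofcom's guidance:

      ```python
      # Illustrative only: enumerate the cells of a four-dimensional
      # children's risk assessment matrix. All axis values are placeholders.
      from itertools import product

      age_groups    = ["0-5", "6-9", "10-12", "13-15", "16-17"]
      content_types = ["primary priority", "priority", "non-designated"]
      ways_of_use   = ["posting", "commenting", "direct messages", "search"]
      harm_types    = ["physical", "psychological"]

      cells = list(product(age_groups, content_types, ways_of_use, harm_types))
      print(len(cells))  # 5 * 3 * 4 * 2 = 120 combinations to assess and document
      ```

      Even with made-up categories, every one of those cells is something the operator is expected to have considered and written down, and to revisit before any "significant" change to the service.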