
Comment by codazoda

7 months ago

I’m not so sure. It’s a layman’s interpretation, but I think any “forum” would be multi-risk.

That means you need to do CSAM scanning if you accept images, CSAM URL scanning if you accept links, and there’s a lot more than that to parse here.

I doubt it. While it's always a bit of a gray area, the example given for "medium risk" is a site with 8M monthly users sharing images, which doesn't do proactive scanning and has been warned by multiple major organisations that it has been used a few times to share CSAM.

The cases where they assume you should say "medium risk" even without evidence of it happening are when you've got several major risk factors:

> (a) child users; (b) social media services; (c) messaging services; (d) discussion forums and chat rooms; (e) user groups; (f) direct messaging; (g) encrypted messaging.

Also, before someone comes along with a specific subset and says those particular factors are benign:

> This is intended as an overall guide, but rather than focusing purely on the number of risk factors, you should consider the combined effect of the risk factors to make an overall judgement about the level of risk on your service

And frankly, if you have image sharing, groups, direct messaging, encrypted messaging, child users, a decent volume, and no automated processes for checking content, then you probably do have CSAM and grooming on your service, or there is clearly a risk of it happening.

  • The problem is the following: if you don't have basic moderation, your forum will be abused for those various illegal purposes.

    Having a modicum of rule enforcement and basic abuse protections (say, new users can't upload files) goes a long way; a rough sketch of such a gate is below.
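
    A minimal sketch of that kind of gate, assuming a hypothetical user record with a timezone-aware account-creation timestamp and a post count (the names and thresholds are illustrative, not taken from any particular forum software):

        from datetime import datetime, timedelta, timezone

        MIN_ACCOUNT_AGE = timedelta(days=7)  # illustrative thresholds
        MIN_POST_COUNT = 10

        def may_upload_files(user) -> bool:
            """Gate file uploads behind basic account history to deter drive-by abuse."""
            account_age = datetime.now(timezone.utc) - user.created_at
            return account_age >= MIN_ACCOUNT_AGE and user.post_count >= MIN_POST_COUNT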

That scanning requirement only applies if your site is:

• A "large service" (more than 7 million monthly active UK users) that is at a medium or high risk of image-based CSAM, or

• A service that is at a high risk of image-based CSAM and either has more than 700,000 monthly active UK users or is a file-storage and file-sharing service.
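
Put as a simple condition, using the figures above (my lay reading of when the measure applies, not legal advice; the names are just illustrative):

    def hash_matching_required(monthly_uk_users: int,
                               image_csam_risk: str,
                               is_file_storage_service: bool) -> bool:
        """Rough reading of when the CSAM hash-matching/URL-detection measure applies."""
        large_service = monthly_uk_users > 7_000_000
        medium_or_high_risk = image_csam_risk in ("medium", "high")
        high_risk = image_csam_risk == "high"
        return (large_service and medium_or_high_risk) or (
            high_risk and (monthly_uk_users > 700_000 or is_file_storage_service)
        )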

> do CSAM scanning if you accept images, CSAM URL scanning if you accept links

Which really should be happening anyway.

I would strongly prefer that forums I visit not expose me to child pornography.
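
In practice that means matching uploaded images against a list of known CSAM hashes and checking submitted links against a URL blocklist, both supplied by bodies such as the Internet Watch Foundation rather than assembled yourself. A very rough sketch of the shape of those checks, using plain SHA-256 and in-memory sets as stand-ins for the real hash lists and matching services:

    import hashlib
    from urllib.parse import urlparse

    # Stand-ins for externally supplied data; real deployments use perceptual
    # hashing and vetted hash/URL lists from third-party programmes.
    KNOWN_IMAGE_HASHES: set[str] = set()
    BLOCKED_URL_HOSTS: set[str] = set()

    def image_is_flagged(data: bytes) -> bool:
        """Check an upload against the known-hash list before publishing it."""
        return hashlib.sha256(data).hexdigest() in KNOWN_IMAGE_HASHES

    def url_is_flagged(url: str) -> bool:
        """Check a submitted link's host against the blocklist before publishing it."""
        return urlparse(url).hostname in BLOCKED_URL_HOSTS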

  • > I would strongly prefer that forums I visit not expose me to child pornography.

    While almost everybody, including me, shares this preference, maybe it should be something that browsers could do? After all, why put the burden on countless different websites if you can implement it in a single piece of software?

    This could also make it easier to go after the people who are the sources of such material, because it wouldn't immediately disappear from the network, often without a trace.

    • > While almost everybody, including me, shares this preference, maybe it should be something that browsers could do? After all, why put the burden on countless different websites if you can implement it in a single piece of software?

      If I recall correctly, Apple tried to do that and it (rightly) elicited howls of outrage. What you're asking for is for people's own computers to spy on them on behalf of the authorities. It's like having people install CCTV cameras in their own homes so the police can make sure they're not doing anything illegal. It's literally Big Brother stuff. Maybe it would only be used for sympathetic purposes at first, but once the infrastructure is built, it would be a tempting thing for the authorities to abuse (or just use for goals that are not universally accepted, like banning all pornography).


    • So basically you want your browser to be controlled by the government, and to remove people's ability to use their browser of choice?

      All this because a negligible number of web users upload CSAM?
