Comment by IanCal
4 months ago
Right or wrong, judging by people's reasoning I think many have either misread the legislation or read poor coverage of it.
Much of it boils down to doing a risk assessment and deciding on mitigations.
Unfortunately we live in a world where, if you allow users to upload and share images with zero checks, you are disturbingly likely to end up hosting CSAM.
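To be clear about what "checks" can mean at the most basic level, here's a minimal sketch in Python (the is_upload_allowed helper and its hash list are hypothetical; real services use perceptual hash matching, e.g. PhotoDNA against lists maintained by bodies like the IWF, rather than exact SHA-256 comparison):

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist of SHA-256 digests of known illegal images;
    # in practice this would come from a trusted provider, not sit empty.
    KNOWN_BAD_HASHES: set[str] = set()

    def is_upload_allowed(image_path: Path) -> bool:
        """Reject an upload whose exact bytes match a known-bad digest."""
        digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
        return digest not in KNOWN_BAD_HASHES

That's obviously not sufficient on its own, but it illustrates that "zero checks" is a choice, not a default you're stuck with.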
Ofcom have guides, risk assessment tools and more; if you think any of this is relevant to you, that's a good place to start.
https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
It's not that simple - illegal and harmful content can include things like hate speech. Worth a longer read: https://www.theregister.com/2025/01/14/online_safety_act/
If I ran a small forum in the UK I would shut it down - not worth the risk of jail time for getting it wrong.
The new rules cover any kind of illegal content that can appear online, but the Act includes a list of specific offences that you should consider. These are:
> hate
Is it really just listed as one word? What's the legal definition of hate?
> hate
which is an umbrella term for everything the government does not like right now and does not mind jailing you for. In other words, it's their way of killing freedom of expression.
From that list I don't see HN being affected, although I read somewhere that smaller sites were required to have a report button on user-generated content in order to comply.
I might be falling for something I've read second-hand, but isn't one of the issues that it doesn't matter where the forum is based? If you've got significant numbers of UK users, it can apply to your forum wherever it's hosted - you'd have to block UK users.
The good thing about forums is their moderation. It seems like most of what the law covers is already enforced by most forums anyway.
A forum that merely has good moderation is not automatically compliant with the Act. It requires not just doing things, but paperwork that shows you are doing things. The effort to do this well enough to be sure you will be in compliance is far beyond what is reasonable to ask of hobbyists.
[flagged]
There are openly transphobic MPs and as far as I know there aren't any laws criminalizing transphobic hate speech. What more do you want?
> Much of it boils down to doing a risk assessment and deciding on mitigations.
So... paperwork, with no real effect, use, or results. And you're trying to defend it?
I do agree that we need something, but this is most definitely not the solution.
Putting in mitigations relevant to your size, audience and risk factors is not "no real effect".
If you've never considered what the risks are to your users, you're doing them a disservice.
I've also not defended it; I've tried to correct misunderstandings about what it is and to point to a reliable primary source with helpful information.
> if you allow users to upload and share images
On my single-user Fedi server, the only person who can directly upload and share images is me. But because my profile is public, it's entirely possible that someone I'm following posts something objectionable (either intentionally or via exploitation) and it would be visible via my server (albeit fetched from the remote site). Does that come under "moderation"? Ofcom haven't been clear.

And if someone can post pornography, your site needs age verification. Does my single-user Fedi instance now need age verification because a random child might look at my profile and see a remotely-hosted pornographic image that someone (not on my instance) has posted? Ofcom, again, have not been clear.
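To make concrete why the "hosting" question is murky, here's a rough sketch of what an instance effectively does when displaying a remote post (stdlib Python; the URL is hypothetical, and this ignores that many servers require HTTP-signed fetches):

    import json
    import urllib.request

    def fetch_remote_note(note_url: str) -> dict:
        # Public ActivityPub objects are served as JSON when requested
        # with this Accept header.
        req = urllib.request.Request(
            note_url, headers={"Accept": "application/activity+json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    note = fetch_remote_note("https://remote.example/notes/123")  # hypothetical
    # "attachment" is a list of media descriptors on Mastodon-style servers.
    for media in note.get("attachment", []):
        # Each URL points at remote.example, not at my instance: the image
        # is displayed *via* my server but hosted elsewhere.
        print(media.get("url"))

The media never touches my disk, yet a visitor sees it on my domain. Which side of the line that falls on is exactly what hasn't been spelled out.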
It's a crapshoot with high stakes and only one side knows the rules.
> On my single-user Fedi server,
Then you're not running a user-to-user service, right?
> And if someone can post pornography, your site needs age verification.
That's an entirely separate law, isn't it?
> Then you're not running a user-to-user service, right?
"The Act’s duties apply to search services and services that allow users to post content online or to interact with each other."[0]
My instance does allow users (me) to post content online and, technically, depending on how you define "user", it does allow me to interact with other "users". The problem is that the Act and Ofcom haven't clearly defined what "other users of that service" means. A bare reading would interpret it as "users who have accounts/whatever on the same system", and that's what I'm going with, but it's a risk if they then say "actually, it means anyone who can interact with your content from other systems"[2]. (I believe they do have a carve-out for sites where "people can only interact with content posted by the service", e.g. news sites, which may also cover a small single-user Fedi instance. But who knows? I certainly can't afford a lawyer or solicitor to give me guidance for each of my servers that could fall under the OSA - that's into double digits right now.)
> That's an entirely separate law, isn't it?
No, the OSA covers that[1].
[0] https://www.gov.uk/government/publications/online-safety-act...
[1] https://www.ofcom.org.uk/online-safety/protecting-children/i...
[2] "To be considered a user of a user-to-user service for a month, a person doesn’t need to post anything. Just viewing content on a user-to-user service is enough to count as using that service." from https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
You're right. Plus, the overreactions have been walked back or resolved in some cases; e.g. LFGSS is going to continue on as a community-run effort which will comply with the risk assessment requirements. Most of the shutdowns are of long-dead forums that have been in need of an excuse to shutter. The number of active users impacted by these shutdowns probably doesn't break 100.