In memoriam

4 months ago (onlinesafetyact.co.uk)

Charlie Stross's blog is next.

Liability is unlimited and there's no provision in law for being a single person or a small group of volunteers. You'll be held to the same standards as a behemoth with full-time lawyers (the stated target of the law but the least likely to be affected by it).

http://www.antipope.org/charlie/blog-static/2024/12/storm-cl...

The entire law is weaponised unintended consequences.

  • > the stated target of the law but the least likely to be affected by it

    The least likely to be negatively affected. This will absolutely be good for them in that it just adds another item to the list of things that prevents new entrants from competing with them.

  • > The entire law is weaponised unintended consequences.

    That would assume no malice from the government? Isn't the default assumption at this stage that every government wants to exert control over its population, even in "democracies"? There's nothing unintended here.

  • I thought that posts with comments were an explicit exemption under the OSB.

    From Ofcom:

    > this exemption would cover online services where the only content users can upload or share is comments on media articles you have published

    • From the Ofcom regulations (https://www.ofcom.org.uk/siteassets/resources/documents/onli...):

      > 1.17 A U2U service is exempt if the only way users can communicate on it is by posting comments or reviews on the service provider’s own content (as distinct from another user’s content).

      A blog is only exempt if users communicate with the blogpost author, on the topic of the blogpost. If they reply to each other, or go off-topic, then the blog is not exempt.

      That's why that exemption is basically useless. Anyone can write "hey commenter number 3, I agree commenter number 1's behaviour is shocking" and your exemption is out the window.

      1 reply →

  • There has been new information since that blog post which has reaffirmed the "this is much ado about nothing" takes because Ofcom have said that they do not want to be a burden on smaller sites.

    https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

    "We’ve heard concerns from some smaller services that the new rules will be too burdensome for them. Some of them believe they don’t have the resources to dedicate to assessing risk on their platforms, and to making sure they have measures in place to help them comply with the rules. As a result, some smaller services feel they might need to shut down completely.

    So, we wanted to reassure those smaller services that this is unlikely to be the case."

    • Nothing more reassuring than a vague “we’re unlikely to go after you [if you stay on our good side.]”

      It’s clear the UK wants big monopolistic tech platforms to fully dominate their local market so they only have a few throats to choke when trying to control the narrative…just like “the good old days” of centralized media.

      Don’t stand in the way of authoritarians if you value your freedom (or the ability to have a bank account).

      The risk just isn't worth it. You write a blog post that rubs someone power-adjacent the wrong way and suddenly you're getting the classic "...nice little blog you have there...would be a shame to find something that could be interpreted as violating 1 of our 17 problem areas..."

      4 replies →

    • > So, we wanted to reassure those smaller services that this is unlikely to be the case

      This is the flimsiest paper thin reassurance. They've built a gun with which they can destroy the lives of individuals hosting user generated content, but they've said they're unlikely to use it.

    • "... unlikely ..."

      Political winds shift, and if someone is saying something the new government doesn't like, the legislation is there to utterly ruin someone's life.

    • You can try the digital toolkit and see for yourself if this is a realistic pathway for a small site (such as a blog with a comment function). Personally, I find it puzzling that Ofcom thinks what they provide is helpful to small sites. Furthermore, they make it pretty clear that they see no reason for a purely size-based exemption (“we also know that harm can exist on the smallest as well as the largest services”). They do not explore ways to reach their goals without ongoing collaboration from small site owners, either.

    • Ofcom need to change the law then.

      Unless Ofcom actively say "we will NOT enforce the Online Safety Act against small blogs", the chilling effect is still there. Ofcom need to own this. Either they enforce the bad law, or loudly reject their masters' bidding. None of this "oh, I don't want to, but I've had to prosecute this crippled blind orphan support forum because one of them insulted Islam, but my hands are tied..."

    • The Canadian government did the same thing when they accidentally outlawed certain shotguns by restricting bore diameter without specifying it was for rifles.

      A minister tweeted that it didn’t apply to shotguns, as if that’s legally binding as opposed to, you know, the law as written.

      8 replies →

    • The use of "unlikely" just screams that Ofcom will eventually pull a Vader..."We are altering the deal, pray we don't alter it any further".

    • "Unlikely," I suppose if you don't have any significant assets to be seized and don't care about ending up in prison, you may be willing to take the chance.

    • Nothing reassures one as much as a government enforcement entity essentially saying "we have full legal right to squash you like a bug, but for now we won't, because we just don't want to. For now".

  • It's very much intended. It's easier for the powers that be to deal with a few favored oligarchs. They're building a Great British Firewall, like China's.

  • What standards would you want individuals or small groups to be held to? In a context where it is illegal for a company to allow hate speech or CSAM on their website, should individuals be allowed to? Or do you just mean the punishment should be less?

    • The obvious solution is to have law enforcement enforce the law rather than private parties. If someone posts something bad to your site, the police try to find who posted it and arrest them, and the only obligation on the website is to remove the content in response to a valid court order.

      6 replies →

    • How about:

      Individuals and small groups not held directly liable for comments on their blog unless it's proven they're responsible for inculcating that environment.

      "Safe harbour" - if someone threatens legal action, the host can pass on liability to the poster of the comment. They can (temporarily) hide/remove the comment until a court decides on its legality.

    • How about have separate laws for CSAM and "hate speech". Because CSAM is most likely just a fig-leaf for the primary motivation of these laws.

  • This is an honest question. Why does a blog need to shut down? If they moderate every comment before it is published on the website, what's the problem? I ask because I've got a UK-based blog too, with a comments feature. Wouldn't enabling moderation for all comments, something like the sketch below, be enough?
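
    For concreteness, a minimal hold-for-review sketch in Python (assuming SQLite; the schema and names are hypothetical, and this says nothing about whether pre-moderation alone satisfies the Act):

        import sqlite3

        db = sqlite3.connect("blog.db")
        db.execute("""CREATE TABLE IF NOT EXISTS comments (
            id INTEGER PRIMARY KEY,
            post_id INTEGER,
            body TEXT,
            approved INTEGER DEFAULT 0)""")

        def submit_comment(post_id: int, body: str) -> None:
            # New comments default to approved = 0 and stay invisible
            # until a human has reviewed them.
            db.execute("INSERT INTO comments (post_id, body) VALUES (?, ?)",
                       (post_id, body))
            db.commit()

        def visible_comments(post_id: int) -> list[str]:
            # Public pages only ever query approved comments.
            rows = db.execute("SELECT body FROM comments "
                              "WHERE post_id = ? AND approved = 1", (post_id,))
            return [body for (body,) in rows]

        def approve(comment_id: int) -> None:
            # Called from a private moderation view after human review.
            db.execute("UPDATE comments SET approved = 1 WHERE id = ?",
                       (comment_id,))
            db.commit()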

    • No, you still need to do things like write an impact assessment etc., and you're still on the hook for "illegal" comments, where you aren't a judge yet have to arbitrarily decide what might be illegal when you have no legal expertise whatsoever.

      6 replies →

Doesn't this Act effectively create a new form of DDoS? A bad actor can flood a platform with enough hate content that the moderation team simply cannot keep up. Even if posts default to not showing, the backlog could be enough to harm a service.

And of course, it will turn into yet another game of cat and mouse, as bad actors find new creative ways to bypass automatic censors.
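
A per-IP token bucket slows such a flood, though it can't clear a human review backlog; a minimal sketch (the capacity and refill rate are illustrative, not recommendations):

    import time
    from collections import defaultdict

    CAPACITY = 5.0  # allow a burst of up to 5 comments per address
    RATE = 0.1      # then refill one token every 10 seconds

    buckets: dict[str, tuple[float, float]] = \
        defaultdict(lambda: (CAPACITY, time.monotonic()))

    def allow_post(ip: str) -> bool:
        tokens, last = buckets[ip]
        now = time.monotonic()
        tokens = min(CAPACITY, tokens + (now - last) * RATE)
        if tokens < 1.0:
            buckets[ip] = (tokens, now)
            return False  # over the limit: reject or defer the submission
        buckets[ip] = (tokens - 1.0, now)
        return True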

This list should be ordered by number of users affected rather than alphabetically, IMO. The 275K-monthly-user platform is almost hidden next to the 49- and 300-user examples.

You know what's really rich about the OSA?

One of the exemptions is for "Services provided by persons providing education or childcare."

I will host a public proxy for these websites, open to UK people, just to troll them :D

  • Whilst I don't condone being unlawful (are you sure you want to run that risk?), that's the hacker spirit one needs these days.

    Being silly to ridicule overreaching laws is top-trolling! Love it.

    • The trouble here is that the law is so crazy that third parties allowing users in the relevant jurisdiction to access the site could result in the site still being liable, so the site would then have the same reason to block your proxy service if a non-trivial number of people were using it.

      To do any good you don't want to cause grief for the victims of the crazy law, you want to cause grief to its perpetrators.

      6 replies →

    • Being unlawful is a vital tool for people to keep tyranny in check; I would hope that most people are incredibly strong supporters of lawlessness when the laws are wrong. To give an extreme example, I imagine you would have supported the hiding of Jewish people during Nazi Germany's reign, which means you support unlawful activity as long as it's against laws that are against the people.

    • If GP is not a UK citizen and does not live in the UK, how would that be unlawful? They're not beholden to or subject to UK law. The UK's belief that they can enforce this law on non-UK entities is ridiculous.

Is Hacker News also affected by this act?

  • International law limits state jurisdiction to territorial boundaries (Art. 2(1) UN Charter). Hacker News is a US web site and Y Combinator LLC is a US company. The OSA, which is a UK law, cannot mandate physical enforcement (e.g., server seizures) on foreign soil. If they really didn't like HN, the UK government could try to suppress HN access for their citizens by local means. If HN had a branch in the UK, the UK government could take action against that branch. As far as I know, that's not the case.

  • Yes, but I don't really understand how the UK can expect to enforce this law against non-UK entities that don't have any employees or physical presence in the UK.

    HN/YC could just tell them to go pound sand, no? (Assuming YC doesn't have any operations in the UK; I have no idea.)

"Furry.energy"? With a total of 49 members? My World of Warcraft guild has more active players...

Right or wrong, judging by people's reasoning I think many have misread the legislation or read poor coverage of it.

Much of this boils down to doing a risk assessment and deciding on mitigations.

Unfortunately we live in a world where if you allow users to upload and share images, with zero checks, you are disturbingly likely to end up hosting CSAM.
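
Even the crudest check shows the gap between "zero checks" and doing something; a sketch that matches uploads against a hash blocklist (the blocklist source is hypothetical, and real systems use perceptual hashes such as PhotoDNA that survive re-encoding, not exact digests):

    import hashlib

    # Hypothetical blocklist of SHA-256 hex digests of known-bad files;
    # in practice such lists come from curated sources, and exact digests
    # are the bare minimum rather than a real defence.
    KNOWN_BAD_SHA256: set[str] = set()

    def reject_upload(data: bytes) -> bool:
        """Return True if the uploaded file matches the blocklist."""
        return hashlib.sha256(data).hexdigest() in KNOWN_BAD_SHA256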

Ofcom have guides, risk assessment tools and more; if you think any of this is relevant to you, that's a good place to start.

https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

  • It's not that simple - illegal and harmful content can include things like hate speech - worth a longer read... https://www.theregister.com/2025/01/14/online_safety_act/

    If I ran a small forum in the UK I would shut it down - not worth the risk of jail time for getting it wrong.

    • The new rules cover any kind of illegal content that can appear online, but the Act includes a list of specific offences that you should consider. These are:

          terrorism
          child sexual exploitation and abuse (CSEA) offences, including
              grooming
              image-based child sexual abuse material (CSAM)
              CSAM URLs
          hate
          harassment, stalking, threats and abuse
          controlling or coercive behaviour
          intimate image abuse
          extreme pornography
          sexual exploitation of adults
          human trafficking
          unlawful immigration
          fraud and financial offences
          proceeds of crime
          drugs and psychoactive substances
          firearms, knives and other weapons
          encouraging or assisting suicide
          foreign interference
          animal cruelty

      11 replies →

    • I might be falling for what I've read second-hand, but isn't one of the issues that it doesn't matter where the forum is based? If you've got significant UK users, it can apply to your forum wherever it's hosted. You've got to block UK users.

    • The good thing about forums is their moderation. It seems like most of what the law covers is already enforced by most forums anyway.

      1 reply →

  • > Much of things boils down to doing a risk assessment and deciding on mitigations.

    So... paperwork, with no real effect, use, or results. And you're trying to defend it?

    I do agree we need something, but this is most definitely not the solution.

    • Putting in mitigations relevant to your size, audience and risk factors is not "no real effect".

      If you've never considered what the risks are to your users, you're doing them a disservice.

      I've also not defended it, I've tried to correct misunderstandings about what it is and point to a reliable primary source with helpful information.

  • > if you allow users to upload and share images

    On my single-user Fedi server, the only person who can directly upload and share images is me. But because my profile is public, it's entirely possible that someone I'm following posts something objectionable (either intentionally or via exploitation) and it would be visible via my server (albeit fetched from the remote site.) Does that come under "moderation"? Ofcom haven't been clear. And if someone can post pornography, your site needs age verification. Does my single-user Fedi instance now need age verification because a random child might look at my profile and see a remotely-hosted pornographic image that someone (not on my instance) has posted? Ofcom, again, have not been clear.

    It's a crapshoot with high stakes and only one side knows the rules.

    • > On my single-user Fedi server,

      Then it's not a user-to-user service you're running, right?

      > And if someone can post pornography, your site needs age verification.

      That's an entirely separate law, isn't it?

      1 reply →

  • You're right. Plus, the overreactions have been walked back or solved in some cases, e.g. LFGSS is going to continue on as a community-run effort which will comply with the risk assessment requirements. Most of the shutdowns are of long-dead forums that have been in need of an excuse to shutter. The number of active users impacted by these shutdowns probably doesn't break 100.

So, how does all this apply to community discords, slacks, Matrix rooms, IRC chats, etc?

Is it discord's responsibility to comply, the admin/moderators, or all of the above?

  • Yes, at least for platforms like Discord, they bear the responsibility based on my non-lawyer reading of the plain English. YMMV, IANAL.

  • The hosting platform is responsible for compliance. For Discord or Slack it's easy, but for Matrix, it might be more fuzzy. Certainly the homeserver that is hosting a room would be responsible, but would other homeservers that have users who are members of the room also be responsible?

It seems like governments around the world are shifting their priorities away from their domestic economies.

  • Shifting their priorities towards stifling the speech of anyone who tries to complain about the domestic conditions.

Safety from dissent, for an authoritarian government. This is just weaponized "empathy".

I am part of a small specialist online technical community. We just moved it over to a Hetzner box in Germany and someone there is paying for it instead of it being hosted in the UK.

What are you going to do Ofcom?

  • If you live in the UK and can still be linked as an operator/organizer of the site (or if it's not you, other UK residents), can't they still come after you directly? I don't know about you, but I don't think running an online community would be worth huge fines to me.

    • There are no UK residents involved in the organisation or operation of it now even though we came up with it.

Apparently the law is dreadfully written. I was reading the lobste.rs thread and wow, it’s like they took a programming course in goto and if statements and applied it to the law…

  • I had the complete opposite impression from that thread. It seemed like people were politically motivated to interpret the law in a certain way, so they could act like they were being coerced.

    These closures are acts of protest, essentially.

    I agree with @teymour's description of the law. It is totally normal legislation.

    • Not only is this law terrible, there are several other laws like this that have existed for years.

      People saying criticism is politically motivated (this law was drafted by the Tories and passed by Labour... so I am not exactly clear what the imagined motivation might be) ignore the fact that the UK has had this trend in law for a long time and the outcome has generally been negative (or, at best, a massive waste of resources).

      Legislation has a context: if we lived in a country where police behaved sensibly, I could reasonably see how someone could believe this was sensible... that isn't reality though. Police have a maximalist interpretation of their powers (for example, non-crime hate incidents... there is no legislation governing their use, they are used regularly to "question the thinking" of people who write critical things about politicians, usually local, or the police... no appointed authority gave them this power, their usage has been questioned by ministers... they still register hundreds of thousands a year).

      Btw, if you want to know how the sausage is made: security services/police want these laws, some event happens, and then there is a coordinated campaign with the media (the favour is usually swapped for leaks later) to build up "public support" (not actual support, just the appearance of support), meetings with ministers are arranged "look at the headlines"...this Act wasn't some organic act of legislative genius, it was the outcome of a targeted media campaign from an incident that, in factual terms, is unrelated with what the Act eventually became (if this sounds implausible, remember that May gave Nissan £30m on the back of SMMT organising about a week's worth of negative headlines, remember that Johnson brought in about 4m migrants off the back of about two days of briefing against him by a six-month old lobbying group from hotels and poultry slaughterhouses...this is actually how the govt works...no-one reads papers apart from politicians).

      Giving Ofcom this power, if you are familiar with their operations, is an act of literal insanity. Their budget has exploded (I believe to near a quarter of a billion now). If you think tech companies are actually going to enforce our laws for us, you are wrong. But suggesting that Ofcom, with their new legions of civil servants, is supposed to be the watchdog of online content... it makes no sense, and it cannot be described as "totally normal" in any country other than China.

My unpopular opinion is that this is a long time coming. Smaller sites that do not have strong technical knowledge of security or IT maintenance are targets for botnets and scammers to host on. It's the same as arguing that health and safety regulation harms small businesses because they have to ensure the safety of their employees and customers, or that it's not conducive to small businesses to have employment law. Companies have had decades to switch to safe and secure online businesses, but the self-regulation never materialised.

Site owners are not going to be arrested willy-nilly. All that is needed is to show that you have the administration in place to deal with complaints from the public and enquiries from OFCOM. If you host illegal content, you have to have someone around to deal with it. If you host hate speech, then you have to deal with the consequences. Nobody has been prosecuted for saying the government is rubbish, but they will be for advocating physical harm on people in our society.

This will create digital management agencies that act as proxies to OFCOM, it may even create a cottage industry of remote working digital administrators. These changes should be embraced as opportunity, and fought when needed, but this isn't anything but law enforcement. The conspiracy bunnies are hopping mad, citing some sort of tyrannical destruction of liberty, but it's not.

  • > This will create digital management agencies that act as proxies to OFCOM, it may even create a cottage industry of remote working digital administrators.

    So bullshit jobs that do nothing productive but are there for "compliance". I think we have enough of that, thanks.

What concept allows the UK to (attempt to) enforce this against non-citizens whose business or life has no ties to their country? Plenty of small countries have odd censorship laws but have escaped similar legal hand-wringing.

Seems like an overreaction in some of these cases. Perhaps the people running them were close to the edge and the extra mental burden just pushed them over it.

It's like local US news websites blocking European users over GDPR concerns.

  • Feel free to put a stop to it by buying liability insurance for all of these service providers, which you may have to persuade the underwriter should be free. ;-)

  • > It's like local US news websites blocking European users over GDPR concerns.

    I don't know if you said this sarcastically, but I have a friend in Switzerland who reads U.S. news websites via Web Archive or Archive IS exactly because of that.

    Accessing some of these news sites returns Cloudflare's "not available in your region" message or similar.

    • It's not just the EU; I'm in a poorer region outside the EU and seeing "not available in your region" is quickly becoming the norm. Site administrators try to cut down on bot traffic (scraping, vulnerability scanners, denial of service, etc) and block whole regions they're not interested in.

      Hell, we do that ourselves, but only for our own infrastructure that isn't expected to be used outside the country. Whitelisting your own country and blocking everything else cuts out >99% of scrapers and script kiddies.
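
      A minimal country-allowlist sketch in Python, assuming MaxMind's free GeoLite2 country database (the database path and allowed set are illustrative):

          import geoip2.database
          import geoip2.errors

          # Assumes a local copy of the free GeoLite2-Country database.
          reader = geoip2.database.Reader(
              "/var/lib/GeoIP/GeoLite2-Country.mmdb")

          ALLOWED = {"US"}  # illustrative: allowlist your own country

          def is_allowed(ip: str) -> bool:
              try:
                  country = reader.country(ip).country.iso_code
              except geoip2.errors.AddressNotFoundError:
                  # Unknown addresses get blocked with everything else.
                  return False
              return country in ALLOWED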

    • No sarcasm. I totally understand why a local news website in the US would just block, since traffic from outside the country is irrelevant to them and they have little in the way of resources. I don't judge them for blocking.

      Fact is, it's very unlikely they would ever face any issues for not blocking.

      1 reply →