Forum with 2.6M posts being deleted due to UK Online Safety Act

4 months ago (forums.hexus.net)

What is the meaning of "illegal content" given in the OSA? What will social media platforms be forced to censor (remove, restrict, ...)? Let's take a look:

Table 1.1: Priority offences by category ( https://www.ofcom.org.uk/siteassets/resources/documents/onli... )

Discussion of offences related to: prostitution, drugs, abuse & insults, suicide, "stirring up of racial/religious hatred", fraud, and "foreign interference".

So one imagines a university student discussing, say: earning money as a prostitute. Events/memories related to drug taking. Insulting their coursemates. Ridiculing the iconography of a religion. And, the worst crime of all, "repeating Russian propaganda" (e.g., the terms of a peace deal) -- which Russians said it, and whether it is true, being -- of course -- questions never asked nor answered.

This free-thinking university student's entire online life seems to have been criminalised, in mere discussion, by the OSA; there may have been zero actual actions involved (consider, though, that a majority of UK students at most prominent universities have taken class-A drugs).

This seems as draconian, censorious, illiberal, repressive, and "moral panic"-y as the highs of repressive Christian moralism in the mid 20th C.

  • We grew up with the internet being a fun place where fun things happen and you don't need to take it so seriously. It was the symbol of freedom. Then the internet evolved into a business center, where everything is taken extremely seriously; don't you dare break the etiquette. It's a sad change to witness, but it is what it is.

    • It was once in a lifetime. Some things are best when not everybody (but especially lawyers) is aware of them.

      Maybe the future will be places guarded by real life trust.

  • I'm no fan of this act but your characterisation is highly misleading.

    To pick two examples from the document you linked:

    Discussion of being a sex worker would not be covered. The only illegal content relating to sex work would be if you were actively soliciting or pimping. From the document:

    * Causing or inciting prostitution for gain offence

    * Controlling a prostitute for gain offence

    Similarly, discussion of drug use wouldn't be illegal either per se, only using the forum to buy or sell drugs or to actively encourage others to use drugs:

    * The unlawful supply, offer to supply, of controlled drugs

    * The unlawful supply, or offer to supply, of articles for administering or preparing controlled drugs

    * The supply, or offer to supply, of psychoactive substances

    * Inciting any offence under the Misuse of Drugs Act 1971

    That's very different to criminalising content where you talk about being (or visiting) a prostitute, or mention past or current drug use. Those things would all still be legal content.

    • Those are indeed against the law. The issue is what these platforms are required to censor on behalf of these other laws.

      Recall that we just spent several years in which discussion of major political issues of concern to society was censored across social media platforms. Taking an extremely charitable interpretation of what government demands will be made here isn't merely naïve but empirically false.

      And the reason I chose those kinds of illegal activities was to show that these very laws themselves are plausibly oppressive as-is, plausibly lacking in "deep democratic" support (i.e., perhaps surviving on very thin majorities) -- and so on.

      And yet it is these laws for which mass interactive media will be censored.

      This is hardly a list with murder at the top.

  • > [..] as the highs of repressive Christian moralism in the mid 20th C.

    What makes you pick the mid-20th century as the high point of repressive Christian moralism? That doesn't seem even close to the high point if you look back further in history.

  • It is odd that stirring up hatred is fine, as long as it does not pertain to religion, race, or sexual orientation.

    • This is because use of this data could create significant risks to the individual’s fundamental rights and freedoms. For example, the various categories are closely linked with:

      - freedom of thought, conscience and religion;
      - freedom of expression;
      - freedom of assembly and association;
      - the right to bodily integrity;
      - the right to respect for private and family life; or
      - freedom from discrimination.

      1 reply →

  • Where does it say discussion of those offences is illegal content? It says "content that amounts to a relevant offence". Frustratingly that is nonsensical: content surely cannot "amount to an offence" in and of itself. Offences have elements, which fall into two categories: actus reus and mens rea. And "content" cannot be either. Perhaps posting some content or possessing some content is the actus reus of an offence but the content itself does not seem to me to sensibly be able to be regarded as "amounting to an offence" any more than a knife "amounts to an offence". A knife might be used in a violent offence or might be possessed as a weapons possession offence but it makes no sense to me to say that the knife "amounts to an offence".

    Either way, the point of that document in aggregate seems to be that "illegal content" is content that falls afoul of existing criminal law already: (possession and distribution of) terrorist training material is already illegal and so it is illegal content. But saying that you committed an offence is not, in and of itself, an offence, so saying you took drugs at university doesn't seem to me like it could be illegal content. Encouraging people to do so might be, but it already is.

    Maybe I missed the bit where it says discussing things is illegal, so correct me if I am wrong.

    Not your lawyer not legal advice etc etc

  • > This free-thinking university student's entire online life seems to have been criminalised in mere discussion by the OSA

    There's nothing illegal about hosting a forum. The problem is that you, as the site operator, are legally required to take down certain kinds of content if and when they appear. Small sites with no money or staff don't have the resources to pay for a full-time moderator. That cost scales with the number of users. And who knows what's in those 2.6M historical posts.

    From TFA:

    > The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it

    Maybe an LLM can carry some of the load here for free forums like this to keep operating?

    • > Maybe an LLM can carry some of the load here for free forums like this to keep operating?

      It can't give you any guarantees, and it can't be held liable for those mistakes.

      2 replies →

  • All you need to do is have a think about what reasonable steps you can take to protect your users from those risks, and write that down. It's not the end of the world.

    • 1.36 Table 1.2 summarises the safety duties for providers of U2U services in relation to different types of illegal content. The duties are different for priority illegal content and relevant non-priority illegal content. Broadly they include:

      a) Duties to take or use proportionate measures relating to the design or operation of the service to prevent individuals from encountering priority illegal content and minimising the length of time that such content is present on the service;

      b) Duties to take or use proportionate measures relating to the design or operation of the service to design and operate systems in a way which mitigates and manages the risks identified in the service provider’s risk assessment;

      c) A duty to operate the service using proportionate systems and processes designed to swiftly take down (priority or non-priority) illegal content when they become aware of it (the ‘takedown duty’); and

      d) A duty to take or use proportionate measures relating to the design and operation of the service to mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence

      ---

      That's a bit more than "have a think"

    • That is false. The post you replied to virtuously linked directly to the UK government's own overview of this law. Just writing down "reasonable steps" [1] is insufficient - you also have the following duties (quoting from the document):

      - Duties to take or use proportionate measures relating to the design or operation of the service to prevent individuals from encountering priority illegal content and minimising the length of time that such content is present on the service;

      - Duties to take or use proportionate measures relating to the design or operation of the service to design and operate systems in a way which mitigates and manages the risks identified in the service provider’s risk assessment;

      - A duty to operate the service using proportionate systems and processes designed to swiftly take down (priority or non-priority) illegal content when they become aware of it

      - A duty to take or use proportionate measures relating to the design and operation of the service to mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence.

      - The safety duty also requires providers to include provisions in their terms of service specifying how individuals are to be protected from illegal content, and to apply these provisions consistently.

      Even if the language of this law were specific, it requires steps so numerous, so invasive, and so difficult that no hobbyist, or even a small company, could reasonably meet them. But it's anything but specific - it's full of vague, subjective language like "reasonable" and "proportionate" that would be ruinous to argue in court for anyone but billion-dollar companies. And even for them, the end result will be that they are forced to accede to whatever demands some government-sanctioned online safety NGO sets, establishing a never-ending treadmill of keeping up with what will become "industry standard" censorship. Because it's either that, or open yourself to the huge legal risk that, in rejecting "industry standard" and "broadly recognized" censorship guidance to try to uphold some semblance of free discussion, you have failed to be "reasonable" and "proportionate" - you will be found to have "disregarded best practices and recognized experts in the field".

      But, short of such an obvious breach, the rules regarding what can and can't be said, broadcast, forwarded, analysed are thought to be kept deliberately vague. In this way, everyone is on their toes and the authorities can shut down what they like at any time without having to give a reason. [2]

      [1] Good luck arguing over what is "reasonable" in court if the government ever wants to shut you down.

      [2] https://www.bbc.com/news/world-asia-china-41523073

  • > "foreign interference"

    That is a very tricky one to manage on an online forum. If an American expresses an opinion about UK policy, that is, in a literal sense, foreign interference. There isn't a technical way to tell propagandists from opinionated people. And the most effective propaganda, by far, is that which uses the truth to make reasonable and persuasive points - if it is possible to make a point that way, then that is how it will be done.

    The only way this works is to have a list of banned talking points from a government agency. I'd predict that effective criticism of [insert current government] is discovered to be driven mainly by foreign interference campaigns trying to promote division in the UK.

    This runs into the same problem as all disinformation-suppression campaigns: governments have no interest in removing the stuff everyone agrees is untrue - what would be the point? The flat-earthers are never going to gain traction, and it doesn't matter if they do. The only topics worth suppressing are things that are plausible and persuasive - the topics most likely to turn out to be true in hindsight.

    • > The only way this works is to have a list of banned talking points from a government agency.

      How so? The "obvious" solution to me, from the perspective of a politician, would be to 1. require online identity verification for signup to any forum hosted in your country, and then 2. use that information to allow only citizens of your country to register.

      (You know, like in China.)

      2 replies →

    • The British legal system is a common-law one like the U.S., I believe, so it would be up to court interpretation.

      Foreign interference would probably be interpreted as an organized campaign of interference being launched by a foreign power.

      >This runs into the same problem as all disinformation suppression campaigns - governments have no interest in removing the stuff everyone agrees is untrue

      At one time everyone agreed anti-vax claims were untrue; now it's American government policy, but still just as untrue.

  • The legislation follows the general structure of the health and safety act a couple of decades ago. That also caused a big right wing press crisis, and then we all sort of moved on, did a bit more paperwork, and now fewer people die in factory accidents. It's really quite helpful to start practically implementing this stuff rather than philosophising about it.

    • Yeah it's all a series of no biggies. But one day citizens in your sinking ship of a country will be looking overseas at countries like Afghanistan in longing as they flip ends of the leaderboard with you.

  • > This seems as draconian, censorious, illiberal, repressive and "moral panic"y as the highs of repressive Christian moralism in the mid 20th C.

    Given what has happened to the US as a result of unbridled free broadcast of misinformation and disinformation, we definitely need more "draconian, censorious, illiberal, repressive" rules around the propagation of such media.

    Moral panic is EXACTLY what's called for!

    You have captains of industry and thought leaders of the governing party throwing fucking nazi salutes, and this is broadcast to the masses! Insanity to defend free speech after the country is circling a drain as a result of said free speech.

  • [flagged]

    • The point the person you are responding to is making is that, under his interpretation, even discussing this is "illegal content" - which would presumably make your comment illegal content. I am not sure I agree, but if that is what people are taking away from it, then either this law has been very poorly communicated to the public or it is batshit insane authoritarian nonsense. Either way, it is a major fuckup.

      1 reply →

Related post with a large discussion from someone who said:

"Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)

[...] I run just over 300 forums, for a monthly audience of 275k active users. most of this is on Linode instances and Hetzner instances, a couple of the larger fora go via Cloudflare, but the rest just hits the server.

and it's all being shut down [...]"

For the same reasons.

https://news.ycombinator.com/item?id=42433044

Anyone* would be crazy to run a UK-based or somewhat UK-centric forum today. Whether it be for a hobby, profession, or just social interaction. The government doesn’t perceive these sites as having any value (they don't employ people or generate corporation tax).

[*] Unless you are a multibillion $ company with an army of moderators, compliance people, lawyers.

  • Well I'm on a forum run by a UK company, hosted in the UK, and we've talked about this, but they're staying online. And, no, they're not a multibillion dollar company.

    I don't see our moderators needing to do any more work than they're already doing, and have been doing for years, to be honest.

    So we'll see how the dice land.

    • As long as they don't upset anyone with influence (government, media, etc.), they'll probably be fine. Otherwise, at best they'll be looking at a ruinously expensive legal battle to justify if what they did was "reasonable" or "proportionate" - the vague terms used by the law.

      For my friends, everything; for my enemies, the law.

    • At least they're a UK company though so presumably they've at least got some money to support this. If you're an individual running a hobby forum then you're SOL

  • More than just forums - it's basically a failed state now. I knew when I left (I was the last of my school year to do so) that it was going to get bad once Elizabeth died, and that that would be soon, but I never imagined it would get this bad.

    The plan for April is to remove the need for police to obtain a warrant to search people's homes - it's that bad.

    I'd say "there will be blood on the streets", but there already is...

    This video pretty much sums up what the UK is now. https://m.youtube.com/watch?v=zzstEpSeuwU

    • No, the proposal is that there is a power of entry where the police have reasonable grounds to believe stolen property is on the premises, that this is supported by tracking data, and that authority to enter is provided and recorded by a police inspector.

      This is analogous to s18 PACE post-arrest powers, grafted onto s17 PACE.

      The alternative is that we continue to require police to try to get a fast-time warrant while plotted up outside a premises; this is not a quick process - I've done it, and it took nearly two hours.

      >there will be blood on the streets

      Oh, dry up.

      5 replies →

    • >I knew when I left (I was the last of my school year to do so) it was going to get bad once Elizabeth died

      How small was your school year?! What does Elizabeth (presumably the 2nd) dying have to do with anything?

      12 replies →

  • The opposite is true. The new law makes it considerably more risky for large companies because the law is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk for small websites is unintended and the requirements are very achievable for small websites. The law is intended for and will be used to eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are see any small websites suffer.

    A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

    • > A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

      Facebook can actually train AI to detect CSAM, and is probably already doing so in cooperation with NCMEC and similar organisations/authorities across the world.

      Your average small website? No chance. Obtaining training material actively is seriously illegal everywhere, and keeping material that others upload is just as bad in most jurisdictions.

      The big guys get the toys, the small guys have to worry all the goddamn time if some pedos are going to use their forum or whatnot.

HEXUS stopped publishing in 2021, and the company no longer exists. The forums were kept because they didn't take much work to keep online. Now there's a lot of work to do, like reading hundreds of pages of documents and submitting risk assessments. There's nobody left to do that work, so the idea was to put the forums into read-only mode. The problem with that was that some users might want their data deleted once the site became read-only. Therefore, the only option is to delete it.

  • Why don't they just anonymize the users? Discourse does this, and it's apparently GDPR compliant.

    • GDPR compliance depends a lot on who you ask, and only a court can make the final decision.

      Stripping all usernames out of a forum certainly makes it safer, but I don't think anyone can say there won't still be a few pissed-off users who wrote things they now regret, and whose posts can be tracked back to individuals from context or writing style alone.
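As an aside on the anonymisation route: a minimal sketch of what a Discourse-style anonymisation pass typically does (the table layout and field names here are hypothetical, not Discourse's actual schema) is to replace identifying fields with opaque placeholders while keeping each user's posts attributable to a stable pseudonym:

```python
import sqlite3
import uuid

def anonymize_users(conn: sqlite3.Connection) -> None:
    """Replace identifying fields with opaque placeholders, per user."""
    for (user_id,) in conn.execute("SELECT id FROM users").fetchall():
        conn.execute(
            "UPDATE users SET username = ?, email = NULL, ip_address = NULL "
            "WHERE id = ?",
            (f"anon_{uuid.uuid4().hex[:12]}", user_id),
        )
    conn.commit()

# Example with an in-memory database standing in for the forum DB:
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE users "
    "(id INTEGER PRIMARY KEY, username TEXT, email TEXT, ip_address TEXT)"
)
conn.execute("INSERT INTO users VALUES (1, 'alice', 'a@example.com', '203.0.113.7')")
anonymize_users(conn)
print(conn.execute("SELECT username, email FROM users").fetchone())
```

Even with something like this applied, the caveat above stands: writing style and quoted context can still re-identify people, so anonymisation alone may not satisfy every GDPR erasure request.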

Summary: The UK has the Online Safety Act; any website that lets users interact with other users has to police illegal content on its site and must implement strong age-verification checks. The law applies to any site that targets UK citizens or has a substantial number of UK users, where "substantial number" is not defined.

I'm going to guess this forum is UK-based just from all the blimeys. Also, the forum seems to have been locked to new users for some time, so it was already in its sunset era.

The admin could just make it read only except to users who manually reach out somehow to verify their age, but at the same time, what an oppressive law for small UK forums. Maybe that's the point.

  • IANAL

    > any websites that let users interact with other users has to police illegal content on its site and must implement strong age verification checks.

    But I believe you only need age verification if pornography is posted. There are also a bunch of caveats about the size of the user base - Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified whether it applies to, or will be policed for, e.g., single-user self-hosted Fediverse instances or small forums.

    I don't blame people for not wanting to take the risk. Personally I'm just putting up a page with answers to their self-assessment risk questionnaire for each of my hosted services (I have a surprising number that could technically come under OSA) and hoping that is good enough.

      > I believe you only need age verification if pornography is posted

      But if you let users interact with other users, you're not in control of whether pornographic material is posted, so it's safer to comply beforehand.

      I commend you for keeping your site up and hoping for the best. I don't envy your position.

    • > Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified [...]

      This has echoes of the Snooper's Charter and Apple's decision to withdraw ADP from all of UK.

      It is not enough for regulators to say they don't anticipate enforcing the law against smaller operators. As long as the law is on the books, it can (and will) be applied to a suitable target regardless of their size.

      I saw this same bullshit play out in Finland. "No, you are all wrong, we will never apply this to anything outside of this narrow band" -- only to come down with the large hammer less than two years later because the target was politically inconvenient.

  • It's for 7 million active UK users per month. https://www.ofcom.org.uk/siteassets/resources/documents/onli... - definition on page 64.

    That's quite sizeable. How many sites can you name that have 7 million monthly active UK users? That's over one in ten of every man, woman, and child in the UK using your site every month.

    • Yes, the actual draft doesn't really add many requirements for non-"large" services: pretty much have some kind of moderation system, some way of reporting complaints to it, and a filed "contact" individual. I note it doesn't require the proactive internal detection of "harmful" content that many people here seem to assume - just acting on what they already have 'reason to believe' is illegal content. Even hash-based CSAM detection and URL blacklisting aren't required until you're a larger provider or a file-sharing product.

      It just seems like an overly formalized way of saying "all forums should have a 'report' button that actually goes somewhere". I'd expect that to already be there on pretty much every forum that ever existed. Even 4chan has moderators.
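For what it's worth, the "report button that actually goes somewhere" obligation can be sketched in a few lines. This is a hypothetical illustration of the takedown-duty flow described above (report filed, operator made aware, content hidden), not any real forum's implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Post:
    id: int
    body: str
    hidden: bool = False

@dataclass
class Report:
    post_id: int
    reason: str
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    resolved: bool = False

class ReportQueue:
    """A report queue whose entries a moderator must action."""

    def __init__(self, posts: dict[int, Post]):
        self.posts = posts
        self.reports: list[Report] = []

    def file(self, post_id: int, reason: str) -> Report:
        # A user's report is what makes the operator "aware" of the content.
        report = Report(post_id, reason)
        self.reports.append(report)
        return report

    def take_down(self, report: Report) -> None:
        # The 'takedown duty': hide content once the service is aware of it.
        self.posts[report.post_id].hidden = True
        report.resolved = True

posts = {1: Post(1, "some reported content")}
queue = ReportQueue(posts)
r = queue.file(1, "priority illegal content")
queue.take_down(r)
print(posts[1].hidden)  # → True
```

The substance of the formal duty is essentially that the take-down step must actually happen, swiftly, once a report makes the operator aware of the content.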

Rather than shut it down, would it be possible to sell the forum to someone in the US for a little bit of money, like $20 or something?

Idea being the US-based owner migrates the DB with posts and user logins to servers hosted on US soil, then if the UK government comes knocking the former owners in the UK can say "Sorry it doesn't belong to us anymore, we sold it, here's the Paypal receipt." (Ideally they'd sell the domain too, but as long as you still have the DB you could always host the forum at a different domain.)

Any forum admins here willing to add another forum to their portfolio?

It's awkward.

It's clear this law terribly affects bona fide grassroots online communities. I hope HN doesn't start geoblocking the UK away!

But then online hate and radicalization really is a thing. What do you do about it? Facebook seems overflowing with it, and their moderators can't keep up with the flow, nor can their mental health keep up. So it's real and it's going to surface somewhere.

At some level, I think it's reasonable that online spaces take some responsibility for staying clear of eg hate speech. But I'm not sure how you match that with the fundamental freedom of the Internet.

  • You don't. "Hate speech" is code for "the government knows better and controls what you say."

    Yes, racism exists and people say hateful things.

    Hate speech is in the interpretation. The US has it right with the first amendment - you have to be egregiously over the line for speech to be illegal, and in all sorts of cases there are exceptions and it's almost always a case-by-case determination.

    Hateful things said by people being hateful is a culture problem, not a government problem. Locking people up because other people are offended by memes or shitposts is draconian, authoritarian, dystopian nonsense and makes a mockery of any claims about democracy or freedom. Europe and the UK seem hell-bent on silencing the people they should be talking with and to. The inevitable blowback will only get worse if stifling, suppressing, and prosecuting is your answer to frustrations and legitimate issues felt deeply but badly articulated.

    • I see no reason why hate speech should be given the benefit of the doubt. And no, it's not because my government told me so, I have my own opinion, which is that freedom of speech ends where threats of violence appear.

      If you don't want it tolerated online, which I don't, you need some kind of legal statement saying so. Like a law that says, you can't do it, and websites can't just shrug their shoulders and say it's not their problem.

      I don't like this legislation, as it seems to be excessive, but I disagree that the root issue it tries to address is a made-up problem.

      EDIT: it just struck me that, in speech and otherwise, the US has a far higher tolerance for violence - and yes, I do mean violence. Free speech is taken much further in the US, almost to the point of inciting violence. Liberal gun laws mean lots of people have guns, logically leading to more people being shot. School shootings are so much more common, and there appears to be no widespread consensus to restrict gun ownership as a result.

      Maybe that's a core difference. Europeans genuinely value lower violence environments. We believe all reasonable things can be said without it. That doesn't make this legislation good. But at least it makes sense in my head why some people glorify extreme free speech (bit of a tired expression in this age).

      23 replies →

    • But this is a very US-centric view. The rest of the world doesn't tolerate people going around being violent because of the constitution.

    • How would you feel about receiving daily credible death threats to you and your family? Should that be tolerated too in the name of the first amendment?

      Point is, we must draw the line somewhere. It's never "everything goes". Tolerating intolerance always ends up reducing freedom of expression.

      Look at the US, the government is doing everything it can to shove trans people back in the closet, their voices are silenced and government websites are rewritten to remove the T in LGBT. By the very same people who abused "the first amendment" to push their hateful rhetoric further and further until it's become basically fine to do nazi salutes on live TV.

      "Free speech absolutism" is a mirage, only useful to hateful people who don't even believe in it.

      7 replies →

    • Hate speech is what played on the radio stations that directly fuelled the mass graves of the Rwandan genocide. The physical call to violence is just the very last step in a long chain of escalating hate speech, but it is no more culpable than the preceding hate speech that created the environment in which that call to violence is acted on.

      2 replies →

  • > But then online hate and radicalization really is a thing.

    I'm not trying to be edgy, but genuinely why do you care if someone says or believes something you feel is hateful? Personally I'm not convinced this is even a problem. I'd argue this is something that the government has been radicalising people in the UK to believe is a problem by constantly telling us how bad people hating things is. Hate doesn't cause any real world harm – violence does. And if you're concerned about violence then there's better ways to address that than cracking down on online communities.

    In regards to radicalisation, this is a problem imo. I think it's clear there is some link between terrorism and online radicalisation, but again, I'd question how big a problem this is and whether this is even right way to combat these issues... If you're concerned about things like terrorism or people with sexist views, then presumably you'd be more concerned about the tens of thousands of unvetted people coming into the country from extremist places like Afghanistan every year? It's not like online radicalisation is causing white Brits to commit terror attacks against Brits... This is obviously far more an issue of culture than online radicalisation.

    So I guess what I'm asking is what radicalisation are you concerned with exactly and what do you believe the real world consequences of this radicalisation are? Do you believe the best way to stop Islamic terrorism in the UK is to crack down on content on the internet? Do we actually think this will make any difference? I don't really see the logic in it personally even if I do agree that some people do hold strange views these days because of the internet.

  • Hate and radicalization are products of existential purposelessness. You can’t make them go away by preventing existentially purposeless people from talking to each other.

    • > You can’t make them go away by preventing existentially purposeless people from talking to each other.

      At least you can limit the speed of radicalization. Every village used to have its village loon; he was known and ignored or ridiculed. But now all the loons talk to each other and constantly reinforce their bullshit, and on top of that they begin to draw in the normies.

    • No, you can't, but there is also no reason why the law should allow these to stay up. Plenty of people have racist thoughts, and that's not illegal (thoughts in general aren't), but go print a bunch of leaflets inciting racist violence and that is illegal.

      I see this as an internet analogy.

      3 replies →

  • Governmental attempts to reduce "online hate" (however defined, as it is entirely subjective) are just going to make our problems worse.

  • > online hate and radicalization really is a thing

    People have always had opinions. Some people think other people's opinions are poor. Talking online was already covered by the law (eg laws re slander).

    Creating the new category of 'hate speech' is more about ensuring legal control of messages on a more open platform (the internet) in a way that wasn't required when newspapers and TV could be managed covertly. It is about ensuring that the existing control structures are able to keep broad control of the messaging.

  • Is it a thing?

    I mean we had the holocaust, Rwandan genocide and the transatlantic slave trade without the internet.

    The discovery, by the governing classes, that people are often less-than-moral is just as absurd as it sounds. More malign and insidious is that these governors think it is their job to manage and reform the people -- that people, oppressed in their thinking and association enough -- will be easy to govern.

    A riot, from time to time -- a mob -- a bully -- are far less dangerous than a government which thinks it can perfect its people and eliminate these.

    It is hard to say that this has ever ended well. It is certainly a very stupid thing in a democracy, when all the people you're censoring will unite, vote you out, and take revenge.

    • It is a thing for sure. How often it happens, I don't know.

      I read a number of stories about school children being cyber-bullied on some kind of semi-closed forum. Some of these ended in suicide. Hell, it used to happen a lot on Facebook in the early days.

      I totally understand a desire to make it illegal, past a certain threshold. I can see how you start off legislating with this in mind, then 20 committees later you end up with some kind of death star legislation requiring every online participant to have a public key and court-attested age certificate, renewed annually. Clearly that's nonsense, but I do understand the underlying desire.

      Because without it, you have no recourse if you find something like this online. For action to be even available, there has to be a law that says it's illegal.

      3 replies →

    • I mean, is it impossible that the commodified web is a sufficient but not necessary condition for atrocities? "But we had the Holocaust without it!" Okay, nobody said the internet was THE cause of ALL atrocities, just that it's actively contributing to today's atrocities. I think your logic is a bit... wrong.

      1 reply →

  • Online hate is skyrocketing in large part because billionaires and authoritarian regimes are pumping in millions of dollars to uplift it. Let’s address this issue at its source.

  • The UK is sensitive about verbal manners; politeness is 'of utmost importance' (among all the other things that are), to use one of the most popular phrases here. If you suffer some outrageous impact on your life and complain in a bad manner, you may be punished further in some way, socially or even contractually. One example is the T&Cs of NatWest: they close your account immediately if your conduct is offensive or discriminatory towards the staff. What counts as offensive? That detail is not spelled out. It cannot be. This is a bit worrisome even for those paying attention to being nice to others. How do you do that, exactly? Where is the limit, nowadays or in that situation? People these days often take offence simply at seeing upsetting things, or feel discriminated against. bbc.co.uk overflows with articles about people who felt very intensely about something unpleasant. Be very careful about your conduct or your bank will kick you out. And we are not even talking about hatefulness or radicalization.

    • I once saw someone propose a national service where you are required to work in a customer-facing job for a year, and I think about that a lot.

Feels more and more like we're at the end of an era when it comes to the internet.

  • How so? This is just the UK. While the UK really does want to enforce this globally, they really have no enforcement power against non-UK citizens who do not reside in the UK.

    Certainly it's possible (and perhaps likely!) that the EU and US will want to copycat this kind of law, but until that happens, I think your alarm is a bit of an overreaction.

    • A lot of people who travel internationally occasionally transit through UK jurisdiction, such as a connection at LHR. This potentially places forum operators in personal legal jeopardy. Would the UK authorities really go after some random citizen of another country for this? Probably not, but the risk isn't zero.

    • The USA has backdoor laws afaik. Sweden is targeting Signal to force them to create a backdoor. And this is only from regular news; I'm not even reading infosec industry updates. All govts are targeting privacy tools and the clock is ticking for them. I'm only hoping that one day these fuckers will be targeted themselves via exploits they have forced on us.

    • First they came for the British And I did not speak out Because I was not British...

These UK laws might boost Tor usage.. let's hope something good will come from the full censorship political tyranny in Europe.

  • If enough people switch to Tor, then Tor will get banned. Technical solutions don’t fix bad policies.

    • If you're in a struggle against a hostile regime, you don't refuse to use the weapons available to you because they're not what will bring you final victory. You use whatever you can.

      1 reply →

    • Tor is pretty hard to block. I think some sort of mixnet is pretty much the solution to all ISP/government spying and censorship on the web, as it makes the law de facto unenforceable.

      1 reply →

  • I doubt it. I think these laws were made to herd users towards big tech's established platforms that are 'policed' by community guidelines deemed 'appropriate' and where the content is never more than a takedown request away.

    Welcome to the new internet.

    (and it's funny how everyone's yelling 'fascist' at whatever happens in the US instead)

  • The UK is not in Europe, which would otherwise impose human rights legal constraints on UK government legislation.

    • The UK is in Europe; it didn't suddenly break off and float away. It's just not part of the EU. There are a bunch of European countries that aren't in the EU.

    • The UK is in Europe. What other continent would it be in?

      It isn't in the EU, but it is a member of the Council of Europe, which is why it is still a party to the European Convention on Human Rights and the European Court of Human Rights still hears appeals from the UK.

      No international agreement can ever or has ever been capable of imposing legal constraints on the British Parliament because it is absolutely sovereign.

    • The UK is a signatory to the European Convention on Human Rights (hell, it wrote it); despite what Farage and the Mail convinced you of in 2016, this was unrelated to the EU.

    • You know perfectly well that the UK is in Europe. Not necessarily part of the EU, but Europe as a continent, yes.

I sympathize with the operators of these forums of course -- the UK Online Safety Act is poorly conceived.

HOWEVER.

Deleting their forums? "The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it." [1]

This is a false dichotomy. Put Cloudflare in front of the site, block UK traffic [2], and you're done. 5 minute job.

[1] https://forums.hexus.net/hexus-news/426608-looks-like-end-he...

[2] https://developers.cloudflare.com/waf/custom-rules/use-cases...
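
A minimal sketch of what that rule looks like, assuming Cloudflare's custom-rules expression language (field names and UI paths may vary by plan; verify against the linked docs):

```
# Cloudflare dashboard: Security → WAF → Custom rules → Create rule
# Expression — match requests whose source IP geolocates to the United Kingdom:
(ip.src.country eq "GB")
# Action: Block
# Optionally return a custom response body explaining the OSA geoblock.
```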

  • I don't know the detail here, but in many of the discussions I've seen the operators themselves are based in the UK, and that changes the calculus.

    • Yeah, GP is, to put it charitably, not understanding the situation.

      > About Us

      > HEXUS.net is the UK’s number one independent technology news and reviews website.

So sites will geoblock the uk and users will use VPN software. Ugh. More software layers, more waste. Also a problem that is solved by a layer of indirection.

Fear/risk is at work here. Government by clear guidance, not guesswork, is needed. The word "unlikely" is doing too much lifting in the guidance. OFCOM needs to offer hard clarity, with the kind of detail that satisfies lawyers. The OSB is sound in its aims, a fumbled hot potato in its long, long discussion, a hash of an implementation, and the explication/communication is a regurgitated dog's dinner. Normally our gov communication is very good. Why can't OFCOM write? I guess we all know any forum with more than a few members likely already has software and some basic policy settings to do this. Unclear guidance is making operators jumpy and afraid.

An opportunity for anyone with a transformer from "UK.GOV Hand-waving" -> forum_settings.json

So, what makes the UK Online Safety Act close the forum?

  • This list of requirements is excessive and nobody wants to read through endless documents and do endless risk assessments. https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

    Children's access assessments - 32 pages

    Guidance on highly effective age assurance and other Part 5 duties - 50 pages

    Protecting people from illegal harms online - 84 pages

    Illegal content Codes of Practice for user-to-user services - 84 pages

  • Because the UK refuses to elaborate on who qualifies under the act, and the only "safe" way to operate a website that might hypothetically be used by someone in the UK is to simply not.

    The costs required to operate any website covered by this act (which is effectively all websites) are grossly excessive, and there are either NO exceptions, or the UK has refused to explain who is excepted.

    • Couldn't they wait for some kind of inquiry from UK Gov and then close the forum reactively if it proved an unreasonable financial burden?

    • > The costs required to operate any website covered by this act (which is effectively all websites) is grossly excessive

      That depends what you count as the costs. If you're a small site[0] and go through the risk assessment[1], that's the only costs you have (unless pornography is involved in which case yes, you'll need the age verification bits.)

      [0] ie. you don't have millions of users

      [1] Assuming Ofcom aren't being deliberately misleading here.

Headline: 2.6M posts

Reality: the forum had a net loss of 358 posts in the last month, and a net loss of ~2k posts over the last 12 months. The forum is so inactive that posts are being deleted faster than they're created. 8 people have created accounts in the last year.

The forum has been long dead.

  • Apparently any piece of information older than a year has no value to you?

    Thankfully you aren't writing the laws in my country.

    Creating a law that makes internet creators want to delete all historical record for fear of potential prosecution under extremely broad terms -- doesn't seem like it's in the interest of the greater good.

    • The law has absolutely nothing to do with historic content, it has no provisions for or relevance to content published decades ago. Even in the most cautious response to this law, there is no reason to take content offline.

      3 replies →

  • I'm not sure why you're comparing total posts to monthly new posts. The tragedy here is that 2.6 million posts, potentially full of great content, are being deleted.

    >The forum is so inactive that they’re deleting posts faster than creating them.

    They've been in read-only mode, more or less, for a while. Primarily, again, due to the (at the time proposed, now passed) law.

    Not to mention, this comment is missing the forest for the trees. This is not the only forum or website to shutter operations in the wake of the UK Online Safety Act.

    • The forum has had less than 100k posts in the last 10 years.

      Forums and small websites have been killed off by changing consumer behaviour, the shift to big social media platforms. Using big numbers to suggest that the UK Online Safety Act is responsible for killing off these smaller independent websites is disingenuous.

      If you do the same exercise for the other forums, you’ll find they’re all long dead too.

      6 replies →

I've been working with OFCOM on implementing the requirements of this act. They seem reasonable, and what they are looking for is mostly table stakes. That said, I wouldn't want to live in or run a UGC business in the UK right now.

Could someone please shed any light on why simply geoblocking the UK in its entirety would not be sufficient for an average forum to avoid having to deal with the Act?

A lot of US websites initially geoblocked EU to avoid dealing with GDPR, for example.

  • In this particular case, the forum is UK-based ("HEXUS is a UK-based technology reporting and reviews website founded by David Ross in 2000")

    In other non-UK-based cases, geo-blocking is the answer being used by some people.

    Per https://geoblockthe.uk/, they state:

    "Luckily OFCOM (the UK Government department responsible for 'enforcement' of these new rules) have confirmed that blocking people in the UK from accessing your website is a perfectly legal and acceptable way to comply with the law.".

    • Would be great if services like e.g. Wikipedia would do exactly that.

      "This website is not available in the UK. Ask your representative about the UK Online Safety Act for more information".

      1 reply →

  • Other comments here have suspected its audience might be primarily UK-based, so geoblocking might not be the best option.

    I'm also not familiar with UK law, which may or may not deem that be a sufficient counter-measure against VPNs. Also, if the forum's operator is based in the UK this also might not be an option.

  • That doesn’t help a UK-based forum. But otherwise, the law doesn’t limit itself to the UK, so there is concern about what happens if you don’t comply with it and ever intend to visit the UK.

This is a major blow to non-profit communities. It also means that only for-profit operators will find it worthwhile to maintain such platforms, which contradicts the stated purpose of the act.

I would ordinarily be upset about this but in this case it is probably for the best as there is unfortunately a lot of islamophobic content on the site

I wonder if closed forum would also fall under these laws. By closed forum I mean a forum where you can see posts only after signing in.

What features are lacking from vBulletin that prevent compliance? I suspect some details are missing.

  • It's not necessarily a technological problem that software has to solve. There's a bunch of processes around reporting and age verification that have to be in place.

    • Ah, I see. I guess that's where the current direction of the law and I have parted ways. Had they focused on making laws requiring better parental controls on all devices, then server operators could add a single RTA header and be done with it. It's not a perfect technical solution, but I believe it places the legal liability on the parents, where it belongs. Parents could then simply consent to their kids viewing whatever they want if they feel they are psychologically ready.
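
      For reference, the RTA label the commenter mentions is a fixed string that client-side parental-control software can filter on. A sketch of both common placements, assuming the standard label value published at rtalabel.org (verify there before deploying):

      ```
      <!-- As an HTML meta tag on every page: -->
      <meta name="rating" content="RTA-5042-1996-1400-1577-RTA" />

      # Or as a single response header, e.g. in an nginx server block:
      add_header Rating "RTA-5042-1996-1400-1577-RTA";
      ```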

actual question, why bother? if they are domiciled in the UK, sell it to someone outside it or move the company elsewhere. let the britons kick and scream; the fun thing about the internet is they can't really do anything about it.

  • The company no longer exists and it doesn't make any money so it isn't worth anything.

    • someone owns the forum and it's still a big archive of stuff. backlinks pointing there, old info, can run ads so it should cashflow somehow.

Forgive me if I’m being dense…

I just read through the entire HN discussion about lobste.rs and continued down that rabbit hole to other discussions of forum, deletions, and the safety act, etc.

The part I don’t understand is: Why aren’t these operators placing the forum into a corporate or partnership entity, without personal liability, that would be the target of some eventual enforcement?

These very small forums are almost certainly not going to be targeted for enforcement… The issue is simply the risk…

… So why not just incorporate, go on your merry way, and if enforcement goes very differently than we all assume, then you walk away from a corporate entity and continue to vacation in London without fear of arrest.

What am I missing here?

  • The corporate veil can be pierced. https://en.wikipedia.org/wiki/Piercing_the_corporate_veil. From the Wikipedia page, it sounds like this is most common in the United Kingdom when a corporation "is established to avoid an existing obligation" - which is exactly what you're suggesting. I am not licensed to practice law anywhere, but particularly not in the United Kingdom, where I have never been.

    To my understanding working at American non-profits, however, corporations are most helpful as a liability shield when they are clearly distinct entities, with distinct goals, and distinct decision making. In practice, that means having multiple people, writing some sort of charter / statement of purpose, and having quarterly meetings of a board of directors with quorums where notes are written and votes are taken. This can all be a fair bit of work, where before nothing was required.

  • In addition to the "you're not fooling anyone" bit about piercing the corporate veil that's already well-addressed, you missed provisions of the OSA.

    There's a requirement to name a senior manager: https://www.legislation.gov.uk/ukpga/2023/50/section/103

    There's personal liability attached to being the named senior manager: https://www.legislation.gov.uk/ukpga/2023/50/section/110

    Other nearby sections have additional personal liabilities. Like Sec 109 (5) (a) probably criminalizes your exact suggested response of walking away from Ofcom's inquiries: https://www.legislation.gov.uk/ukpga/2023/50/section/109 It depends on the legal definition of "permits the suppression of... any information required". We'd have to hire a UK lawyer for a confident answer.

    All this is a couple of sentences in the law. The law is 250 pages long. Ofcom's guidance was rounding 3,000 pages the last time I counted.

    If you want to understand the OSA, I think the most accessible and valid writing available is by Neil Brown: https://onlinesafetyact.co.uk/ There's a lot developing as Ofcom continues to publish new rules and ignore questions, so I suggest reading the 'Replies' tab of his fediverse account.

    • "There's personal liability attached to being the named senior manager ..."

      Thank you - that's a missing piece that helps.

      "In addition to the "you're not fooling anyone" bit ..."

      I don't suggest a corporate veil as a ruse - it's a tool that has a function and I think this is certainly it.

      My sense is that enforcement for small operators is unlikely but the potential liabilities skew the risk dramatically. Pointing the initial enforcement at a corporate entity could change that risk assessment.

Some kinda online safety law is probably needed (download button is just up there if you must :/) but there should be a carve out for small operations. Set a revenue minimum or something.

  • The Act does in fact scale the obligations according to the size of the community/service.

    • But size and revenue/budget are different things entirely. Does it scale down to pocket change for large community forums with no commercial backing?

i can feel them coming for porn and as someone who is ugly poor and has no interest in interacting with real humans me so sad

  • I think you're right. The elites want infinite population growth and we're not doing our part, so they're slowly turning the screws.

Couldn't it be held in trust in the US or something?

"Just shut it down" is the lazy thing to do. Should take tips from dissidents in other totalitarian shitholes - they just move it abroad to relatively free countries.

Heavily editorialized headline here. Just as accurate: "Forum with 2.6M posts being deleted due to insufficient moderation"

  • It already has moderators. But they'd need to know the details about the 17 types of illegal harm too. And someone would have to submit a yearly risk assessment and contact information to the regulator. And there's a children's access assessment. And there'd need to be a complaints procedure. Oh and a children's risk assessment. Plus whatever else is contained within the hundreds or thousands of pages of guidance.