Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)

6 months ago (lfgss.com)

figured this might be interesting... I run just over 300 forums, for a monthly audience of 275k active users. most of this runs on Linode and Hetzner instances; a couple of the larger fora go via Cloudflare, but the rest just hit the server.

and it's all being shut down.

the UK Online Safety Act creates a massive liability, and whilst at first glance the risk seems low, the reality is that moderating people usually provokes their ire; if we had to moderate someone because they were a threat to the community, they are usually exactly the kind of person who gets angry.

in 28 years of running forums, as a result of moderation I've had people try to get the domain revoked, file fake copyright notices, send death threats, and stalk me (IRL and online)... as a forum moderator you are known, and you are a target, and the Online Safety Act creates a weapon that can be used against you. the risk is no longer hypothetical, so even if I got lawyers involved to be compliant I'd still have the liability and risk.

in over 28 years I've run close to 500 fora in total, and they've changed so many lives.

I created them to provide a way for those without families to build families, to catch the waifs and strays, and to try to hold back loneliness, depression, and the risk of isolation and suicide... and it worked, it still works.

but on 17th March 2025 it will become too much, no longer tenable, the personal liability and risks too significant.

I guess I'm just the first to name a date, and now we'll watch many small communities slowly shutter.

the Online Safety Act was supposed to hold big tech to account, but in fact they're the only ones who will be able to comply... it consolidates even more power on those platforms.

Is there some generalized law (yet) about unintended consequences? For example:

Increase fuel economy -> Introduce fuel economy standards -> Economical cars practically phased out in favour of guzzling "trucks" that are exempt from fuel economy standards -> Worse fuel economy.

or

Protect the children -> Criminalize activities that might in any way cause an increase in risk to children -> Best to just keep them indoors playing with electronic gadgets -> Increased rates of obesity/depression etc -> Children worse off.

As the article itself says: Hold big tech accountable -> Introduce rules so hard to comply with that only big tech will be able to comply -> Big tech goes on, but indie tech forced offline.

  • > Introduce rules so hard to comply with that only big tech will be able to comply

    When intentional, this is Regulatory Capture. Per https://www.investopedia.com/terms/r/regulatory-capture.asp :

    > Regulation inherently tends to raise the cost of entry into a regulated market because new entrants have to bear not just the costs of entering the market but also of complying with the regulations. Oftentimes regulations explicitly impose barriers to entry, such as licenses, permits, and certificates of need, without which one may not legally operate in a market or industry. Incumbent firms may even receive legacy consideration by regulators, meaning that only new entrants are subject to certain regulations.

    A system with no regulation can be equally bad for consumers, though; there's a fine line between too little and too much regulation. The devil, as always, is in the details.

  • It's called "Perverse incentive" and Wikipedia runs an illustrative set of examples:

    https://en.wikipedia.org/wiki/Perverse_incentive

    • This one is marvelous: In 2021, the US Congress enacted stringent requirements to prevent sesame, a potential allergen, from cross-contaminating other foods. Many companies found it simpler and less expensive to instead add sesame directly to their product as an ingredient, exempting them from complying with the law.

  • There's the Cobra Effect popularized by Freakonomics

    Too many cobras > bounty for slain cobras > people start breeding them for the bounty > law is revoked > people release their cobras > even more cobras around

  • Why do people foolishly claim these are unintended consequences?

    This is a way to regulate political speech and create a weapon to silence free speech online. It's what opponents of these measures have been saying forever. Why do we have to pretend those enacting them didn't listen, are naive, or are innocent, well-intentioned actors? They know what this is and what it does. The purpose of a system is what it does.

    Related to this, one label for this type of silencing, particularly as weaponized by arbitrary people rather than just politicians, is the heckler's veto. Just stir up a storm and cite this convenient regulation to shut down a site you don't like. It's useful to those enacting these laws that they don't even have to point the finger themselves; disgruntled users or whoever will do it for them.

  • Politicians should take a mandatory one-week training in:

    - very basic macroeconomics

    - very basic game theory

    - very basic statistics

    Come to think of it, kids should learn this in high school.

    • I think you’re being overly charitable in thinking this happens because they don’t understand these things. The main thing is that they don’t care. The purpose of passing legislation to protect the children isn’t to protect the children, it’s to get reelected.

      If we can get the voters to understand the things you mention, then maybe we’d have a chance.


    • Politicians forced to learn statistics -> Politicians better prepared to understand consequences of their actions -> Politicians exploit economy better -> Everyone worse off -> Law to educate politicians is abolished -> Politicians exploit economy nevertheless

      Seriously, the problem is not politicians being clueless about all the above, but having too much power which makes them think they need to solve everything.


    • It is difficult to get a man to understand something when his re-election depends on him not understanding it.

    • You are assuming they work for the good of the country, but in reality they work for big corporations. These regulations are designed to weed out small players that are a nuisance for the rich.

    • You have it backwards.

      Politicians can be very very good at those things, when they have a reason to be.

  • There is also a very simple, uncontrived effect: you put pressure on a thing, and the thing is quashed and ceases to exist.

    Many things in a society exist on thin margins, not only monetary, but also of attention, free time, care and interest, etc. You impose a burden, such as a regulation, saying that people have to either comply or cease the activity, and people just cease it, as in the post. What used to be a piece of flourishing (or festering, depending on your POV) complexity gets reduced to a plain, compliant nothing.

    Maybe that was the plan all along.

  • > Is there some generalized law (yet) about unintended consequences?

    These are not unintended consequences. All media legislation of late has been to eliminate all but the companies that are largest and closest to government. Clegg works at Facebook now, they'd all be happy to keep government offices on the premises to ensure compliance; they'd even pay for them.

    Western governments are encouraging monopolies in media (through legal pressure) in order to suppress speech through the voluntary cooperation of the companies who don't want to be destroyed. Those companies are not only threatened with the stick, but are given the carrots of becoming government contractors. There's a revolving door between their c-suites and government agencies. Their kids go to the same schools and sleep with each other.

  • Sociologist Robert K. Merton coined the term "unintended consequences" (amongst numerous others), and developed an existing notion of manifest vs. latent functions and dysfunctions.

    In particular, Merton notes:

    Discovery of latent functions represents significant increments in sociological knowledge .... It is precisely the latent functions of a practice or belief which are not common knowledge, for these are unintended and generally unrecognized social and psychological consequences.

    Robert K. Merton, "Manifest and Latent Functions", in Wesley Longhofer, Daniel Winchester (eds) Social Theory Re-Wired, Routledge (2016).

    <https://www.worldcat.org/title/social-theory-re-wired-new-co...>

    More on Merton:

    <https://en.wikipedia.org/wiki/Robert_K._Merton#Unanticipated...>

    Unintended consequences:

    <https://en.wikipedia.org/wiki/Unintended_consequences#Robert...>

    Manifest and latent functions:

    <https://en.wikipedia.org/wiki/Manifest_and_latent_functions_...>

  • This is what Javier Milei means when he says that everything politicians touch turns to shit and therefore government should be minimal.

    • Isn’t that a case of throwing the baby out with the bathwater? Many regulations serve to protect individuals and the environment, both of which might otherwise be overlooked in favor of corporate profits fighting in the free market. I'm afraid that when advocates of minimal government push their agenda, they often envision a level of reduction far beyond what most people would find acceptable. In situations like the one under discussion, I believe improving the regulation would be a better approach than eliminating it entirely.

    • While I agree with your general sentiment I think that there is a possible type of government where we are no-longer forced to vote for individual humans (or indeed groups of humans: political parties) but can instead vote on the actual ideas/policies.

      It might even be possible now to combine nuanced perspectives/responses to proposed policies from millions of people!? I think it's not that unreasonable to suggest that kind of thing nowadays, and I think there's precedent for it too, even though stuff like how-Wikipedia-works isn't really ideal (though it's somewhat an example of the main idea!).

      This way, the public servants (including politicians) can mainly just take care of making sure the ideas that the people vote for get implemented! (like all the lower tiers of government currently do - just extend it to the top level too!) I don't think we should give individuals that power any more!


    • Cynical viewpoint, downvote if you must: It is the dream of right wing populists everywhere to demolish government bloat, leaving just the bits that are actually useful.

      But: https://www.inf.ed.ac.uk/teaching/courses/seoc2/1996_1997/ad...

      Any bureaucracy evolves, ultimately, to serve and protect itself. So the populist boss snips at the easy but actually useful parts: social safety nets, environmental regulations, etc. Whereas the core bureaucracy, the one that should really be snipped, has gotten so good at protecting itself that it remains untouchable. So in the end the percentage of useless administratium actually goes up, and the government as a whole is still bloated but even less functional. Just another "unintended consequences" example.

      We'll see if Argentina can do better than this.


  • The concept of Rule Beating from Systems Thinking seems apt. You have some goal so you introduce a rule, but if you choose a bad rule, it ends up making things worse. The solution is to recognize that it was a bad rule, repeal it, and find a better one.

  • It is also that big business can influence legislators, and small business cannot, so big business can influence regulation to their own advantage.

  • I mean, that’s what I call “rules lawyering” in game parlance. When someone utilizes the rules in such a way as to cause legal harm in service of their own interests, regardless of the intent of said rules in preventing harm.

    It’s why when a law/rule/standard has a carveout for its first edge case, it quickly becomes nothing but edge cases all the way down. And because language is ever-changing, rules lawyering is always possible - and governments must be ever-resistant to attempts to rules lawyer by bad actors.

    Modern regulations are sorely needed, but we’ve gone so long without meaningful reform that the powers that be have captured any potential regulation before it’s ever begun. I would think most common-sense reforms would say that these rules should be more specific in intent and targeting only those institutions clearing a specific revenue threshold or user count, but even that could be exploited by companies with vast legal teams creating new LLCs for every thin sliver of services offered to wiggle around such guardrails, or scriptkiddies creating millions of bot accounts with a zero-day to trigger compliance requirements.

    Regulation is a never-ending game. The only reason we “lost” is because our opponent convinced us that any regulation is bad. This law is awful and nakedly assaults indietech while protecting big tech, but we shouldn’t give up trying to untangle this mess and regulate it properly.

    • > I would think most common-sense reforms would say that these rules should be more specific in intent and targeting only those institutions clearing a specific revenue threshold or user count, but even that could be exploited by companies with vast legal teams creating new LLCs for every thin sliver of services offered to wiggle around such guardrails, or scriptkiddies creating millions of bot accounts with a zero-day to trigger compliance requirements.

      This is what judges are for. A human judge can understand that the threshold is intended to apply across the parent company when there is shared ownership, and that bot accounts aren't real users. You only have to go back and fix it if they get it wrong.

      > The only reason we “lost” is because our opponent convinced us that any regulation is bad. This law is awful and nakedly assaults indietech while protecting big tech, but we shouldn’t give up trying to untangle this mess and regulate it properly.

      The people who passed this law didn't do so by arguing that any regulation is bad. The reason you lost is that your regulators are captured by the incumbents, and when that's the case any regulation is bad, because any regulation that passes under that circumstance will be the one that benefits the incumbents.


  • >Protect the children -> Criminalize activites that might in any way cause an increase in risk to children -> Best to just keep them indoors playing with electronic gadgets -> Increased rates of obesity/depression etc -> Children worse off.

    Not sure how keeping kids off the internet keeps them indoors? Surely the opposite is true?

  • Your first example is a case of lawmakers not being willing to finish the job, more so than of regulation being bad.

    That is like saying "when we write software there are bugs, so rather than fix them, we should never write software again".

    Your second example is ascribing to regulation something that goes way beyond regulation.

  • I don't think anyone believes that the "think of the children" argument leads to "unintended" consequences. They are thoroughly intended. It doesn't look like that, but policy makers do analyze potential impact, and this is a problem you understand if you are more than 5 minutes into the topic.

    I do think they overlook that their legislation is restricted to their domestic market, though, so any potential positive effect is more or less immediately negated. That is especially true for English-speaking countries.

  • These cannot be unintended consequences. Obviously the UK government is aware of what it is doing and is using whatever language it can.

  • Why are gas-guzzling trucks exempt from fuel standards? (Genuine question)

    • They were supposedly commercial vehicles with real need for size and towing capacity.

      Because no one would fork over stupid amounts of money for a f*k off big truck if they didn't have a real need. Right?

    • Because they were a large fraction of cars manufactured by US companies, so not excluding them would have put the entire US auto industry out of business.

    • The original idea was that they needed big engines and bad aerodynamics to be able to perform their functions of hauling bulky loads and towing heavy trailers. Few people who didn't actually have those needs would want to drive one because they were unwieldy to drive and uncomfortable to be in relative to cars, so such an exemption surely wouldn't be widely exploited.

  • >Increase fuel economy -> Introduce fuel economy standards -> Economic cars practically phased out in favour of guzzling "trucks" that are exempt from fuel economy standards -> Worse fuel economy.

    tl;dr: This is a myth.

    There is no incentive to the consumer to purchase a vehicle with worse fuel economy.

    There USED to be an incentive, 30-40 years ago.

    It is not 1985 anymore.

    The gas guzzler tax covers a range of fuel economies from 12.5 to 22.5 mpg.

    It is practically impossible to design a car that gets less than 22.5 mpg.

    The Dodge Challenger SRT Demon 170, with a 6.2 L 8-cylinder engine making ONE THOUSAND AND TWENTY FIVE horsepower, is officially rated for 13 mpg, but that's bullshit: it's Dodge juicing the numbers just so buyers can say "I paid fifty-four hundred bucks gas guzzler tax BAYBEE", and in real-world usage the Demon 170 is getting 25 mpg. Other examples of cars that cannot achieve 22.5 mpg are the BMW M2/M3/M4/M8 and the Cadillac CT5, high-performance sports sedans for which the gas guzzler tax is a <5% price increase. ($5400 is 5% of the Demon 170 price, but 2-3% of what dealers are actually charging for it.)

    The three most popular vehicles by sales volume in the United States are: 1. The Ford F-150, 2. The Chevy Silverado, and 3. The Dodge Ram 1500.

    The most popular engine configuration for these vehicles is the ~3L V6. Not a V8. A V6.

    Less than 1/4th of all pickup trucks are sold equipped with a V8.

    According to fueleconomy.gov every single Ford, Chevrolet, and Ram full-size pickup with a V6 would pay no gas guzzler tax.

    Most V8s would be close, perhaps an ECU flash away, to paying no gas guzzler tax. The only pickups that would qualify for a gas guzzler tax are the high-performance models-- single-digit percentages of the overall sales volume and at those prices the gas guzzler tax would not even factor into a buyer's decision.

    People buy trucks, SUVs, and compact SUVs because they want them and can afford them.

    Not because auto manufacturers phased out cars due to fuel economy standards. Not because consumers were "tricked" or "coerced". And certainly not because "the gubmint" messed things up.

    They buy them because they WANT them.

    The Toyota RAV4 is the 4th most popular car in the US. The Corolla is the 13th most popular. They are built on the same platform, and dimensionally the Corolla is actually very slightly larger except for height. They both come with the same general ballpark choices in engines. The gas guzzler tax only applies to the Corolla, but that doesn't matter because they both would be exempt. People don't freely choose the RAV4 over the Corolla because of fuel economy; they buy it because the Corolla has 13 cubic feet of cargo capacity and the RAV4 has 70 cubic feet.

    And before anyone says that the gas guzzler tax made passenger cars more expensive: passenger cars can be purchased for the same inflation-adjusted price as 50 years ago, but people don't want a Mitsubishi Mirage, which is the same price as a vintage VW Beetle (perennially the cheapest new car of the 1960s) and better in every quantifiable metric; they want an SUV.

    What may be true is that there is a national policy to keep fuel prices as low as possible, for a myriad of reasons, with one side effect of that policy being that it has enabled people to buy larger less fuel-efficient cars.

    I do not believe it is auto manufacturers who are pushing for this policy. I believe it is the freight and logistic market. The auto market is valued at $4 billion, the freight and logistics market is $1,300 billion. GM and Ford are insignificant specks compared to the diesel and gasoline consumers of the freight and logistics firms (who have several powerful lobbies).

    https://www.thetruthaboutcars.com/2017/08/v8-market-share-ju...

    https://www.fueleconomy.gov

    https://www.irs.gov/pub/irs-pdf/f6197.pdf (gas guzzler worksheet)

    • Per https://assets.publishing.service.gov.uk/media/61b7e040e90e0... the average UK car MPG is ~50mpg, so even allowing for the difference in US and UK gallons a 22.5mpg vehicle is colloquially a "gas guzzler" by our standards.

      > What may be true is that there is a national policy to keep fuel prices as low as possible, for a myriad of reasons, with one side effect of that policy being that it has enabled people to buy larger less fuel-efficient cars.

      Yes. Americans have always had cheap fuel and it's shaped the entire society around it.
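To make the gallon arithmetic above concrete: an imperial (UK) gallon is 4.54609 litres versus 3.78541 litres for a US gallon, so a US mpg figure converts with one multiplication. A trivial sketch:

```python
US_GALLON_LITRES = 3.78541   # litres in one US gallon
UK_GALLON_LITRES = 4.54609   # litres in one imperial (UK) gallon

def us_mpg_to_uk_mpg(us_mpg: float) -> float:
    # Same distance per litre, re-expressed per (larger) UK gallon.
    return us_mpg * UK_GALLON_LITRES / US_GALLON_LITRES
```

So the 22.5 US mpg gas-guzzler threshold is only about 27 mpg in UK terms, roughly half the ~50 mpg UK average cited above.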


    • People love to blame government regulations for consumer preferences that go against their own.

      Consumers want larger vehicles, and manufacturers bend the rules to allow such vehicles to be more easily built. Manufacturers write the laws, after all. CAFE allows SUVs and other "light trucks" to get worse fuel economy than a car, since fuel economy allowances are based on vehicle footprint and it's easier to make a car larger than it is to improve fuel economy.

    • But ... Why do they want to? I'm genuinely curious. Did this desire for larger vehicles exist latent in the human psyche? Is it an emergent property of a race to the bottom as everyone tries to have the safest car? Or to secure prestige via a positional good, leaving everyone worse off? Do you think marketing choices played a role in shaping our collective desires?


    • > There is no incentive to the consumer to purchase a vehicle with worse fuel economy.

      Not true: Section 179 [0]. Luxury auto manufacturers are well-aware of this [1] and advertise it as a benefit. YouTube et al. are also littered with videos of people discussing how they're saving $X on some luxury vehicle.

      > Not because consumers were "tricked" or "coerced". ... They buy them because they WANT them.

      To be fair, they only want them because they've been made into extremely comfortable daily drivers. Anyone who's driven a truck from the 90s or earlier can attest that they were not designed with comfort in mind. They were utilitarian, with minimal passenger seating even with Crew Cab configurations. At some point – and I have no idea if this was driven by demand or not – trucks became, well, nice. I had a 2010 Honda Ridgeline until a few weeks ago, which is among the un-truck-iest of trucks, since it's unibody. That also means it's extremely comfortable, seats 5 with ease, and can still do what most people need a truck to do: carry bulky items home from Lowe's / Home Depot. Even in the 2010 model, it had niceties like heated seats. I just replaced it last week with a 2025 Ridgeline, and the new one is astonishingly nicer. Heated and ventilated seats, seat position memory, Android Auto / Apple CarPlay, adaptive cruise control, etc.

      That's also not to say that modern trucks haven't progressed in their utility. A Ford F-350 from my youth could pull 20,000 lbs. on a gooseneck in the right configuration. The 2025 model can pull 40,000 lbs., and will do it in quiet luxury, getting better fuel economy.

      [0]: https://www.irs.gov/publications/p946#idm140048254261728

      [1]: https://www.landroveroflivermore.com/section-179.htm

We have something similar in Australia with the Online Safety Act 2021. I think this highlights a critical misunderstanding at the heart of the legislation: it imagines the internet as a handful of giant platforms rather than a rich tapestry of independent, community-driven spaces. The Online Safety Act’s broad, vague requirements and potential penalties are trivial hurdles for billion-dollar companies with in-house legal teams, compliance departments, and automatic moderation tooling. But for a single individual running a forum as a labour of love—or a small collective operating on volunteer time—this creates a legal minefield where any disgruntled user can threaten real financial and personal harm.

In practice, this means the local cycling forum that fostered trust, friendship, and even mental health support is at risk of vanishing, while the megacorps sail on without a scratch. Ironically, a measure allegedly designed to rein in “Big Tech” ends up discouraging small, independent communities and pushing users toward the same large platforms the legislation was supposedly targeting.

It’s discouraging to watch governments double down on complex, top-down solutions that ignore the cultural and social value of these smaller spaces. We need policy that recognises genuine community-led forums as a public good, encourages sustainable moderation practices, and holds bad actors accountable without strangling the grassroots projects that make the internet more human. Instead, this act risks hollowing out our online diversity, leaving behind a more homogenised, corporate-dominated landscape.

  • > We have something similar in Australia with the Online Safety Act 2021.

    That wasn't the one I was thinking of, to be honest.

    I'd have thought you would be mentioning the latest ball of WTF: "Online Safety Amendment (Social Media Minimum Age) Bill 2024".

    According to the bill, HN needs to identify all Australian users to prevent under-16s from using it.

    https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...

    • That bill is an amendment to the aforementioned act.

      But yes, I'm confused as to whether it applies to online gaming, or to sites such as Wikipedia as well.


  • > it imagines the internet as a handful of giant platforms rather than a rich tapestry of independent, community-driven spaces.

    As sad as it may be, their imagination is correct. The small spaces, summed up all together, are lost in the rounding errors.

    • Also, nobody is going after the small spaces, because they don't even know they exist. And when they do, the spaces can be shut down then, I guess, if there really is a misunderstanding. I don't get preemptively doing it, other than giving up after a long duty of almost 30 years and using this as an excuse. At least pass them to someone else who won't care about the liability.


  • You're assuming the point of these laws is what they say on the tin and the people writing these laws are ignorant. A huge amount of legislation is written by think tanks and lobbyists.

    Authoritarians don't want people to be able to talk (and organize) in private. What better way to discourage them than some "think of the children" nonsense? That's how they attacked (repeatedly) encryption.

    Google, Facebook, and Twitter all could have lobbied against this stuff and shut it down, hard. They didn't.

    That speaks volumes, and my theory is that they feel shutting down these forums will push people onto their centralized platforms, increasing ad revenues - and the government is happy because it's much easier to find out all the things someone is discussing online.

    • Google, Facebook, Twitter, etc. have really done as much as they can. Whoever is pushing this in the Australian Government has a super weird kind of personal vendetta against 'Big Tech' - many speculate it's about how chummy our political class are with the media-owning billionaires here in Australia, and how the shakedown they devised to wring money out of tech companies to subsidise the local media (the 'Media Bargaining Code') failed to really work.

      It's honestly super weird. Now of course they are just proposing to tax the tech companies if they don't pay money to our local media orgs for something the tech companies neither want nor care about.


The actual OfCom code of practice is here: https://www.ofcom.org.uk/siteassets/resources/documents/onli...

A cycling site with 275k MAU would be in the very lowest category, where compliance means things like 'having a content moderation function to review and assess suspected illegal content'. So, having a report button.
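At that lowest tier, the 'content moderation function' really can be a report queue plus someone who reviews it. A minimal sketch of that shape (all names here are hypothetical and not taken from the Ofcom code):

```python
from dataclasses import dataclass

@dataclass
class Report:
    """One user complaint against one post."""
    post_id: int
    reporter: str
    reason: str
    resolved: bool = False

class ModerationQueue:
    """Minimal report queue: users flag posts, a moderator reviews them."""

    def __init__(self) -> None:
        self.reports: list[Report] = []

    def report(self, post_id: int, reporter: str, reason: str) -> Report:
        # The "report button" handler: record the complaint for review.
        r = Report(post_id, reporter, reason)
        self.reports.append(r)
        return r

    def pending(self) -> list[Report]:
        # Reports still awaiting a moderator's decision.
        return [r for r in self.reports if not r.resolved]

    def resolve(self, report: Report, take_down: bool) -> str:
        # Moderator decision; take_down=True stands in for removing the post.
        report.resolved = True
        return "removed" if take_down else "kept"
```

A real forum would persist reports and tie resolution to actual post removal; the point is only how small the lowest-tier obligation is mechanically.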

  • This isn't how laws work. If you hand a layperson a large law and tell him that, if he is in violation, he has to pay millions, then it pretty much doesn't matter that there is some way in which, with some effort, he can comply. Most people aren't lawyers, and figuring out how to actually comply with this is incredibly tedious and risky, as the individual is personally liable for any mistakes made in interpreting those laws.

    Companies have legal departments, which exist to figure out answers to questions like that. This is because these questions are extremely tricky and the answers might even change as case law trickles in or rules get revised.

    Expecting individuals to interpret complex rulesets under threat of legal liability is a very good way to make sure these people stop what they are doing.

    • >This isn't how laws work.

      The law worked the same way yesterday as it does today. It's not like a website run in Britain operated under some state of anarchy before and suddenly doesn't in a few months. There are already laws a site has to comply with, and already a risk that someone sues you; but if you were okay with running a site for 20 years, adding a report button isn't going to drastically change the nature of your business.


  • This: OP seems to be throwing the baby out with the bathwater.

    I'm surprised they don't already have some form of report/flag button.

    • I’m not so sure. It’s a layman’s interpretation, but I think any “forum” would be multi-risk.

      That means you need to do CSAM scanning if you accept images, CSAM URL scanning if you accept links, and there’s a lot more than that to parse here.
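Mechanically, the 'CSAM URL scanning' duty mentioned above reduces to checking links in posts against a supplied URL list. A sketch of the shape of it, using a hypothetical hostname blocklist (a real deployment would use a vetted list or a service such as Google Safe Browsing, not a hand-rolled set):

```python
import re
from urllib.parse import urlsplit

# Hypothetical blocklist of hostnames, standing in for a vetted URL list.
BLOCKED_HOSTS = {"bad.example.com"}

# Rough URL matcher: http(s) scheme up to the next whitespace or delimiter.
URL_RE = re.compile(r'https?://[^\s<>"]+')

def find_blocked_urls(post_text: str) -> list[str]:
    """Return URLs in a post whose hostname is on the blocklist."""
    hits = []
    for url in URL_RE.findall(post_text):
        host = urlsplit(url).hostname or ""
        if host in BLOCKED_HOSTS:
            hits.append(url)
    return hits
```

Even this toy version shows why it's a burden for a volunteer: link extraction, normalisation, and list updates all have to keep working forever.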


    • OP isn't throwing the baby out with the bathwater, and he explains it very well in his post: the risk of being sued is too great in itself, even if you end up winning the lawsuit.


    • From how I understood the post, the forums were never self-sustaining financially and always required a considerable amount of time, so the new legislation was probably just the final straw that broke the camel's back?


    • Yes, they do, but you need to do more than that.

      They do not have the resources to find out exactly what they need to do so that there is no risk of them being made totally bankrupt.

      If that is all - please point to the guidance or law that says just having a report button is sufficient in all cases.


    • I get the same feeling, as the repercussions for bad actors are fines relative to revenue - 10%, if I read correctly. Given that the OP has stated that they work at a deficit most of the time, I can't see this being an issue.

      Also, if it is well monitored and seems to have a positive community, I don't see the major risk in shutting down. It seems more like shutting down out of frustration against a law that, while silly on its face, doesn't really impact this provider.


    • I am the OP, and if you read the guidance published yesterday: https://www.ofcom.org.uk/siteassets/resources/documents/onli...

      Then you will see that a forum that allows user generated content, and isn't proactively moderated (approval prior to publishing, which would never work for even a small moderately busy forum of 50 people chatting)... will fall under "All Services" and "Multi-Risk Services".

      This means I would be required to do all the following:

      1. Individual accountable for illegal content safety duties and reporting and complaints duties

      2. Written statements of responsibilities

      3. Internal monitoring and assurance

      4. Tracking evidence of new and increasing illegal harm

      5. Code of conduct regarding protection of users from illegal harm

      6. Compliance training

      7. Having a content moderation function to review and assess suspected illegal content

      8. Having a content moderation function that allows for the swift take down of illegal content

      9. Setting internal content policies

      10. Provision of materials to volunteers

      11. (Probably this because of file attachments) Using hash matching to detect and remove CSAM

      12. (Probably this, but could implement Google Safe Browser) Detecting and removing content matching listed CSAM URLs

      ...

      the list goes on.

      It is technical work, extra time, the inability to not constantly be on-call when I'm on vacation, the need for extra volunteers, training materials for volunteers, appeals processes for moderation (in addition to the flak one already receives for moderating), somehow removing accounts of proscribed organisations (who has this list, and how would I know if an account is affiliated?), etc, etc.

      Bear in mind I am a sole volunteer, and that I have a challenging and very enjoyable day job that is actually my primary focus.

      Running the forums is an extra-curricular volunteer thing, it's a thing that I do for the good it does... I don't do it for the "fun" of learning how to become a compliance officer, and to spend my evenings implementing what I know will be technically flawed efforts to scan for CSAM, and then involve time correcting those mistakes.

      I really do not think I am throwing the baby out with the bathwater, but I did stay awake last night dwelling on that very question, as the decision wasn't easily taken and I'm not at ease with it, it was a hard choice, but I believe it's the right one for what I can give to it... I've given over 28 years, there's a time to say that it's enough, the chilling effect of this legislation has changed the nature of what I was working on, and I don't accept these new conditions.

      The vast majority of the risk can be realised by a single disgruntled user on a VPN from who knows where posting a lot of abuse material when I happen not to be paying attention (travelling for work and focusing on IRL things)... and then the consequences and liability come. This isn't a risk I'm in control of or one that can be easily mitigated, the effort required is high, and everyone here knows you cannot solve social issues with technical solutions.
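      To give a sense of what even the "easy" parts of that list mean in practice: the baseline form of item 11 is matching uploads against a list of known-bad hashes. A minimal sketch, assuming a flat set of exact SHA-256 digests (real deployments use perceptual hashes such as PhotoDNA, distributed under agreement by bodies like the IWF, so even this toy version understates the work):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal files.
# Real lists come from organisations like the IWF/NCMEC and use
# perceptual hashes, not plain SHA-256; this digest (of the empty
# file) is only here so the sketch is self-contained.
BLOCKED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(file_bytes: bytes) -> bool:
    """Return True if the upload's digest appears on the blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() in BLOCKED_HASHES
```

      Exact hashing is defeated by a single re-encode of the image, which is why the guidance points towards perceptual hashing; that, plus getting access to the lists, appeals handling, and record-keeping, is where the volunteer hours go.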

      3 replies →

    • That's my feeling too. There will always be people who take laws and legal things overly seriously. For example, WordPress.org added a checkbox to the login saying that pineapple on pizza is delicious, and there are literal posts on Twitter asking "I don't like pineapple on pizza, does this mean I can't contribute?" It doesn't matter that the risk isn't even there; who is going to be able to sue over whether pineapple on pizza is delicious or not? Yet there will be people who say "Sorry, I can't log in, I don't like pineapple on pizza".

      In this case, it's "I'm shutting down my hobby that I've had for years because I have to add a report button".

  • > having a content moderation function to review and assess suspected illegal content

    That costs money. The average person can't know every law. You have to hire lawyers to adjudicate every report, or otherwise assess each reported post for illegality yourself. No one is going to do that for free if the penalty for being wrong is being thrown in prison.

    A fair system would be to send every report of illegal content to a judge to check if it's illegal or not. If it is the post is taken down and the prosecution starts.

    But that would cost the country an enormous amount of money. So instead the cost is passed to the operators. Which in effect means only the richest or riskiest sites can afford to continue to operate.

  • If only more people read the actual documents in context (same with GDPR), but the tech world has low legal literacy.

    • Expecting people to read and correctly interpret complex legal documents is absurd. Obviously any lay person is heavily dissuaded by that.

      I would never accept personal liability for my own interpretation of the GDPR. I would be extremely dumb if I did.

The UK has just given up on being in any way internationally relevant. If the City of London financial district disappeared, within 10 years we'd all forget that it's still a country.

  • This feels relevant to your comment: https://archive.is/9V2Bf

    Orgs are already fleeing LSEG for deeper capital markets in the US.

    • The LSE itself isn't really _that_ important; London remains huge for financial services in general (though this _may_ be somewhat in decline for various reasons; it lost a certain amount of importance as the de facto gateway to Europe after Brexit, say).

  • As an aside, the UK is a great tourist destination, especially if you leave London right after landing.

    Beautiful landscape, the best breakfast around, really nice people, tons of sights to see.

  • How much damage can they withstand before they figure out how to stop hurting themselves? I wouldn't touch UK investment with a ten foot pole.

    • A lot more; the Online Safety Act is just a symptom of the structural problems (a lack of de facto governance, a hopelessly out-of-touch political class, voting systems that intentionally don't represent the voting results, etc.).

      Argentina has had nearly 100 years of decline, and Japan is onto its third lost decade. The only other party in the UK that has a chance of being elected (because of the voting system) is led by someone who thinks sandwiches are not real [1]. It's entirely possible the UK doesn't become a serious country in our lifetimes.

      [1] https://www.politico.eu/article/uk-tory-leader-sandwiches-no...

      18 replies →

    • > How much damage can they withstand before they figure out how to stop hurting themselves?

      Funnily enough we wonder this about the USA and its drain-circling obsession with giving power -- and now grotesque, performative preemptive obeisance -- to Donald Trump.

>the Online Safety Act was supposed to hold big tech to account, but in fact they're the only ones who will be able to comply... it consolidates more on those platforms.

This says it so well, acknowledging the work of a misguided bureaucracy.

Looks like it now requires an online community to have its own bureaucracy in place, to preemptively stand by ready to effectively interact in new ways with a powerful, growing, long-established authoritarian government bureaucracy of overwhelming size and increasing overreach.

Measures like this are promulgated in such a way that only large highly prosperous outfits beyond a certain size can justify maintaining readiness for their own bureaucracies to spring into action on a full-time basis with as much staff as necessary to compare to the scale of the government bureaucracy concerned, and as concerns may arise that mattered naught before. Especially when there are new open-ended provisions for unpredictable show-stoppers, now fiercely codified to the distinct disadvantage of so many non-bureaucrats just because they are online.

If you think you are going to be able to rise to the occasion and dutifully establish your own embryonic bureaucracy for the first time to cope with this type of unstable landscape, you are mistaken.

It was already bad enough before without a newly imposed, bigger moving target than everything else combined :\

Nope, these types of regulations only allow firms that already have a prominent, well-funded bureaucracy of their own, on a full-time basis, long established after growing in response to less-onerous mandates of the past. Anyone else, who cannot just take this in stride without batting an eye, need not apply.

  • > Looks like it now requires an online community to have its own bureaucracy in place

    What do you mean by bureaucracy in this case? Doing the risk assessment?

    • Good question.

      I would say it's more that the prohibitive cost of compliance comes from the non-productive (or even anti-productive) nature of the activities needed to comply, on an ongoing basis.

      An initial risk assessment is a lot more of a fixed target, with a goal that is in sight if not well within reach. Once it's behind you, it's possible to get back to putting more effort into productive actions. Assessments are often sprinted through so things can get "back to normal" ASAP, which can be worth it sometimes. Other times it's a world of hurt if you don't pay attention to whether it's a moving goalpost and the "sprint" might need to last forever.

      Which can also be coped with successfully, like dealing with large bureaucratic institutions as customers, since that's another time when you've got to have your own little bureaucracy. To be fully dedicated to the interaction and well-staffed enough for continuous 24/7 problem-solving operation at a moment's notice. If it's just a skeleton crew at a minimum they will have a stunted ability for teamwork since the most effective deployment can be more like a relay race, where each member must pull the complete weight, go the distance, not drop the baton, and pass it with finesse.

      While outrunning a pursuing horde and their support vehicles ;)

    • OP mentions this ( https://news.ycombinator.com/item?id=42440887 ):

      > 1. Individual accountable for illegal content safety duties and reporting and complaints duties

      > 2. Written statements of responsibilities

      > 3. Internal monitoring and assurance

      > 4. Tracking evidence of new and increasing illegal harm

      > 5. Code of conduct regarding protection of users from illegal harm

      > 6. Compliance training

      > 7. Having a content moderation function to review and assess suspected illegal content

      > 8. Having a content moderation function that allows for the swift take down of illegal content

      > 9. Setting internal content policies

      > 10. Provision of materials to volunteers

      > 11. (Probably this because of file attachments) Using hash matching to detect and remove CSAM

      > 12. (Probably this, but could implement Google Safe Browser) Detecting and removing content matching listed CSAM URLs

      > ...

      > the list goes on.

      2 replies →

I did a double take when I saw this here. I’ve lurked on LFGSS, posted from time to time and bought things through it. Genuinely one of the best online communities I’ve been in, and the best cycling adjacent one by far.

Having said all that, I can’t criticise the decision. It makes me sad to see it and it feels like the end of an era online

  • Just like our knees… we might have started as fg, but now ss, and soon our Phils will not be laced for the track but for our chairs. The fight is fading.

Have you considered handing off the forums to someone based outside of the UK? I'm sure you might be able to find a reasonable steward and divest without leaving your users stranded. You've worked very hard and have something to be proud of, I would hate to see it go.

  • There are several stories of open source projects being handed off to someone who seemed helpful, only to have them turn around and add malware months later.

    I completely understand a desire to shut things down cleanly, rather than risk something you watched over for years become something terrible.

  • This is my question as well. It seems like there would be someone willing to do this, especially outside the UK.

    Finding someone trustworthy is hard, but I know buro9 knows tons of people.

  • "The Act applies to services even if the companies providing them are outside the UK should they have links to the UK. This includes if the service has a significant number of UK users"

    • This is gonna sound crazy, but you can potentially just ignore unjust laws in countries like the UK if you don't live there. At your own risk of course, but that is the nature of protest. If OP divests completely then it should be out of his hands.

      2 replies →

    • >"...This includes if the service has a significant number of UK users".

      "[A] significant number"? How Britishly vague.

      There was one person involved in the doxxing of that CEO guy....

      I would say that a significant-sized football crowd would be over 75,000.

      That's a lot of numbers that 'significant' has to lean on.

      1 reply →

Please allow us to gift you free-forever space at rsync.net to hold/stage this data - possibly in encrypted form - such that you can preserve what you have built.

Just email us.

None of this seems to describe exactly what the problem with this new act is. Can someone ELI5 what this new law does that means it's no longer safe to run your own forum?

  • I think that the fact that no one is fully sure is part of the problem.

    The act is intentionally very vague and broad.

    Generally, the gist is that it's up to the platforms themselves to assess and identify risks of "harm", implement safety measures, keep records and run audits. The guidance on what that means is very loose, but some examples might mean stringent age verifications, proactive and effective moderation and thorough assessment of all algorithms.

    If you were to ever be investigated, it will be up to someone to decide if your measures were good or you have been found lacking.

    This means you might need to spend significant time making sure that your platform can't allow "harm" to happen, and maybe you'll need to spend money on lawyers to review your "audits".

    The repercussions of being found wanting can be harsh, and so, one has to ask if it's still worth it to risk it all to run that online community?

    • > If you were to ever be investigated, it will be up to someone to decide if your measures were good or you have been found lacking.

      This is the problem with many European (and I guess also UK) laws.

      GDPR is one notable example. Very few people actually comply with it properly. Hidden "disagree" options in cookie pop-ups and unauthorized data transfers to the US are almost everywhere, not to mention the "see personalized ads or pay" business model.

      Unlike with most American laws, GDPR investigations happen through a regulator, not a privately-initiated discovery process where the suing party has an incentive to dig up as much dirt as possible, so in effect, you only get punished if you either really go overboard or are a company that the EU dislikes (which is honestly mostly just Meta at this point).

      1 reply →

    • > The act is intentionally very vague and broad

      Exactly the complaint that everyone on here made about GDPR, saying the sky would fall in. If you read UK law like an American lawyer you will find it very scary.

      But we don't have political prosecutors out to make a name for themselves, so it works OK for us.

  • From Wikipedia:

    > The act creates a new duty of care of online platforms, requiring them to take action against illegal, or legal but "harmful", content from their users. Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher.

  • It is essentially requiring tech companies to work for the UK government as part of law enforcement. They are required to monitor and censor users or face fines, and Ofcom has the ability to shut down things it doesn't like.

    This basically ensures that the only people allowed to host online services for other people in the UK will be large corporations. As they are the only ones that can afford the automation and moderation requirements imposed by this bill.

    You should be able to self-host content, but you can't do something like operate a forums website or other smaller social media platform unless you can afford to hire lawyers and spend thousands of dollars a month hiring moderators and/or implementing a bullet proof moderation system.

    Otherwise you risk simply getting shut down by Ofcom. Or you can do everything you are supposed to do and get shut down anyway. Good luck navigating their appeals processes.

    • I don't mind getting shut down so much as I mind getting a fine for millions when my small little website doesn't make any money.

      But surely no right minded judge would do such a thing, right?

      1 reply →

  • https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

    You need to do a risk assessment and keep a copy. Depending on how risky things are, you need to put more mitigations in place.

    If you have a neighbourhood events thing that people can post to, and you haven't had complaints and generally keep an eye out for misuse, that's it.

    If you run a large scale chat room for kids with suicidal thoughts where unvetted adults can talk to them in DMs you're going to have a higher set of mitigations and things in place.

    Scale is important, but it's not the only determining factor. An example of low risk for suicide harm is

    > A large vertical search service specialised in travel searches, including for flights and hotels. It has around 10 million monthly UK users. It uses recommender systems, including for suggesting destinations. It has a basic user reporting system. There has never been any evidence or suggestion of illegal suicide content appearing in search results, and the provider can see no way in which this could ever happen. Even though it is a large service, the provider concludes it has negligible or no risk for the encouraging or assisting suicide offence

    An example for high risk of grooming is

    > A social media site has over 10 million monthly UK users. It allows direct messaging and has network expansion prompts. The terms of service say the service is only for people aged 16 and over. As well as a content reporting system, the service allows users to report and block other users. While in theory only those aged 16 and over are allowed to use the service, it does not use highly effective age assurance and it is known to be used by younger children. While the service has received few reports from users of grooming, external expert organisations have highlighted that it is known to be used for grooming. It has been named in various police cases and in a prominent newspaper investigation about grooming. The provider concludes the service is high risk for grooming

> this is not a venture that can afford compliance costs... and if we did, what remains is a disproportionately high personal liability for me, and one that could easily be weaponised by disgruntled people who are banned for their egregious behaviour

I'm a little confused about this part. Does the Online Safety Act create personal liabilities for site operators (EDIT: to clarify: would a corporation not be sufficient protection)? Or are they referring to harassment they'd receive from disgruntled users?

Also, this is the first I've heard of Microcosm. It looks like some nice forum software and one I maybe would've considered for future projects. Shame to see it go.

  • The linked page has this phrasing, which I’m not entirely sure what it means, but could be understood as personal liability?

    > Senior accountability for safety. To ensure strict accountability, each provider should name a senior person accountable to their most senior governance body for compliance with their illegal content, reporting and complaints duties.

    • I don't see how that makes the person personally liable at all. It just gives them a direct line to the compliance board.

  • I think OP feels it indirectly creates massive personal liabilities for site operators, in that a user can deliberately upload illegal material and then report the site under the Act, opening the site operator up to £18M in fines.

    This seems very plausible to me, given what they and other moderators have said about the lengths some people will go to online when they feel antagonised.

    • Zero chance it will be enforced like this.

      The UK has lots of regulatory bodies and they all work in broadly the same way. Provided you do the bare minimum to comply with the rules as defined in plain English by the regulator, you won't either be fined or personally liable. It's only companies that either repeatedly or maliciously fail to put basic measures in place that end up being prosecuted.

      If someone starts maliciously uploading CSAM and reporting you, provided you can demonstrate you're taking whatever measures are recommended by Ofcom for the risk level of your business (e.g. deleting reported threads and reporting to police), you'll be absolutely fine. If anything, the regulators will likely prove to be quite toothless.

      3 replies →

    • The rules aren't "never have bad things on your site" though.

      The example of "medium risk" for CSAM urls is a site with 8M users that has actively had CSAM shared on it before multiple times, been told this by multiple international organisations and has no checking on the content. It's a medium risk of it happening again.

    • Again, sounds like the nonsense spoken on here when GDPR came out. Everyone was going to get fined millions. Except people with violations actually got compliance advice from the ICO. They only got fined (a small amount of money) when they totally ignored the ICO

      2 replies →

    • While they could, I'm pretty sure that's already illegal, probably in multiple ways.

      In the same way that you could be sued for anything, I'm sure you could also be dragged to court for things like that under this law... And probably under existing laws, too.

      That doesn't mean you'll lose, though. It just means you're out some time and money and stress.

      2 replies →

Might the author be overreacting a bit to this new law? As I understand it, it doesn't put that much of an onerous demand on forum operators.

Then again, maybe he's just burnt out from running these sites and this was the final straw. I can understand if he wants to pack it in after so long, and this is as good reason as any to call it a day.

Though, has no-one in that community offered to take over? Forums do change hands now and then.

  • > Might the author be overreacting a bit to this new law

    As i have read it, no, it's worth a read to see for yourself though.

    > it doesn't put that much of an onerous demand on forum operators.

    It doesn't until it does; the issue is the massive amount of work needed to cover the "what if?".

    It's not clear that it doesn't apply, and so it will be abused; that's how the internet works: DMCA, YouTube strikes, domain strikes, etc.

    > Then again, maybe he's just burnt out from running these sites and this was the final straw. I can understand if he wants to pack it in after so long, and this is as good reason as any to call it a day.

    Possibly, worth asking.

    > Though, has no-one in that community offered to take over? Forums do change hands now and then.

    Someone else taking over doesn't remove the problem, though there might be someone willing to assume the risk.

  • > Might the author be overreacting a bit to this new law? As I understand it, it doesn't put that much of an onerous demand on forum operators.

    As the manager of a community where people meet in person, I understand where he is coming from. Acting like law enforcement puts one in a position to confront dangerous individuals without authority or weapons. It is literally life-endangering.

The whole government page at https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c... has an off-putting and threatening tone, celebrating how wonderful it is that online spaces will be tied in bureaucratic knots. Disgraceful.

  • I host a Mastodon server in the US. I almost wish I could get a legal threat from the UK so that I could print it and hang it on my wall as a conversation piece.

    I have zero legal connection to the UK and their law doesn't mean jack to me. I look forward to thoroughly ignoring it, in the same way that I thoroughly ignore other dumb laws in other distant jurisdictions.

    UK, look back on this as the day -- well, another day -- when you destroyed your local tech in favor of the rest of the world.

  • The hacker in me is real grumpy about all this, and believes that they’re nannying a whole lot of the dumb superficial stuff while pushing serious malfeasance underground.

    But they make a good point: if you exclude the smaller providers, that’s where the drugs and CSAM and the freewheeling dialog go. Assuming it’s their policy goal to deter these categories of speech, I’m not sure how you do that without a net fine enough to scoop up the 4chans of the world too.

    It’s not the behavior of a confident, open, healthy society, though…

    • > if you exclude the smaller providers, that’s where the drugs and CSAM and the freewheeling dialog go.

      - Bad actors go everywhere now.

      - £18 million fines seem like a fairly unhinged cannon to aim at small websites.

      - A baseless accusation is enough to trigger a risk of life-changing fines. Bad actors don't just sell drugs and freewheel; they also falsely accuse.

      1 reply →

    • As a quick cheatsheet, laws targeting CSAM are always just tools to go after other things.

      CSAM is absolutely horrible.. but CSAM laws don't stop CSAM (primarily this happens from group defections).

      Instead it's just a form of tarring, in this case unliked speech, by associating it with the most horrible thing anyone can think of.

It is pretty clear that many European countries, EU or not, do not want individuals hosting websites. Germany has quite strict rules regarding hosting, the EU has again and again proposed legislation that makes individuals hosting sites very hard and the UK doing similar things is no surprise.

These governments only want institutions to host web services. Their rules are openly hostile to individuals. One obvious benefit is much tighter control: having a few companies with large, registered sites gives the government control.

It is also pretty clear that the public at large does not care. Most people are completely unaffected and rarely venture outside of the large, regulated platforms.

  • > do not want individuals hosting websites

    It's more about "accepting and publishing arbitrary content".

    But, in practice, how hard is it to host a website anonymously? Or off-shore?

    • >But, in practice, how hard is it to host a website anonymously? Or off-shore?

      Obviously it is trivial, but so is shoplifting.

      Both are illegal and telling people to commit crimes is not helpful.

      2 replies →

Please consider working with Archive Team and/or the Internet Archive to preserve the content of the site.

    • you don't need archiveteam to save databases you already have. just drop the PII columns, dump the rest, and put 'em up for torrent.
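      A hypothetical sketch of that with SQLite (the table and column names here are invented; the real schema will differ):

```python
import sqlite3

# Hypothetical PII columns to exclude from the public export.
PII_COLUMNS = {"email", "ip"}

def export_without_pii(src: sqlite3.Connection,
                       dst: sqlite3.Connection, table: str) -> None:
    """Copy `table` from src to dst, omitting the PII columns."""
    cols = [row[1] for row in src.execute(f"PRAGMA table_info({table})")]
    keep = [c for c in cols if c not in PII_COLUMNS]
    dst.execute(f"CREATE TABLE {table} ({', '.join(keep)})")
    placeholders = ", ".join("?" * len(keep))
    rows = src.execute(f"SELECT {', '.join(keep)} FROM {table}")
    dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dst.commit()
```

      (Bear in mind PII can also hide in free-text columns, so a column-level strip like this is a starting point, not a guarantee.)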

    • I'd argue that despite technically being higher fidelity, the SQL dump is less useful than the ArchiveTeam-style HTTP request dump, because the latter fits into the Wayback Machine while the former would require extra steps to meaningfully access the data.

      (Unless of course someone is resurrecting the site)

      1 reply →

  • FTR, the admin is now talking to ArchiveTeam and is willing to help reduce barriers to archiving the site to the IA, e.g. removing rate limits, allow-listing AT user agents, etc.
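    If the site sits behind nginx, the rate-limit exemption could be as small as the fragment below (hypothetical: the zone name and upstream are invented, and ArchiveTeam's actual user-agent string should be confirmed before allow-listing it):

```nginx
# Map the client user agent to a rate-limit key; nginx skips
# limit_req accounting entirely when the key is an empty string.
map $http_user_agent $limit_key {
    default          $binary_remote_addr;  # everyone else: limit per IP
    "~*archiveteam"  "";                   # assumed AT UA substring: no limit
}

limit_req_zone $limit_key zone=perip:10m rate=5r/s;

server {
    listen 80;
    location / {
        limit_req zone=perip burst=20;
        proxy_pass http://forum_backend;  # invented upstream name
    }
}
```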

An insightful comment on this from an American context, but about basically the same problem [0]

> Read the regs and you can absolutely see how complying with them to allow for banana peeling could become prohibitively costly. But the debate of whether they are pro-fruit or anti-fruit misses the point. If daycares end up serving bags of chips instead of bananas, that’s the impact they’ve had. Maybe you could blame all sorts of folks for misinterpreting the regs, or applying them too strictly, or maybe you couldn’t. It doesn’t matter. This happens all the time in government, where policy makers and policy enforcers insist that the negative effects of the words they write don’t matter because that’s not how they intended them.

> I’m sorry, but they do matter. In fact, the impact – separate from the intent – is all that really matters.

[0] https://www.eatingpolicy.com/p/stop-telling-constituents-the...

  • That's an excellent article. Another quote I found especially relevant:

    >Every step that law takes down the enormous hierarchy of bureaucracy, the incentives for the public servants who operationalize it is to take a more literal, less flexible interpretation. By the time the daycare worker interacts with it, the effect of the law is often at odds with lawmakers’ intent.

    Put another way, everyone in the chain is incentivized to be very risk averse when faced with a vague regulation, and this risk aversion can compound to reach absurd places.

> as a forum moderator you are known, and you are a target

I want to emphasize just how true this is, in case anyone thinks this is hyperbole.

I managed a pissant VBulletin forum, and moderated a pretty small subreddit. The former got me woken up at 2, 3, 4am with phone calls because someone got banned and was upset about it. The latter got me death threats from someone who lived in my neighborhood, knew approximately where I lived, and knew my full name. (Would they have gone beyond the tough-guy-words-online stage? Who knows. I didn't bother waiting to find out, and resigned as moderator immediately and publicly.)

  • 100%

    I used to run a moderately sized forum for a few years. Death threats, legal threats, faeces mailed to my house, someone finding out where I worked and making harassing calls/turning up at the office.

    I don't run a forum any more, for what I feel are obvious reasons.

    • I, too, can confirm all this. Way back in the day, I hosted a largish forum and moderated it. It grew to be a pain to moderate, and it became costly to run as well, in short, it was no longer fun like it was initially, and I walked away from it.

  • > I managed a pissant VBulletin forum, and ... got me woken up at 2, 3, 4am with phone calls because someone got banned and was upset about it.

    I home-hosted a minecraft server and was repeatedly DDoS'd. Don't underestimate disgruntled 10yo's.

    • Facts: my friend ran a Minecraft server, and his hosting provider once told him it was the most DDoS'd server in the entire datacenter.

  • I used to be a moderator of the French forum the OP mentions in his post, and I helped maintain the previous website. I threw in the towel long before they migrated to Microcosm. I didn't even ask to be a moderator, but ended up one somehow when the other "hired members" tried to distance themselves from moderation. It is so thankless. Some trolls constantly derailed threads, were repeatedly banned but kept creating new accounts, and joined other forums where I was a simple member to stalk me and gather more information in order to publish stuff about me or my family online.

People seem to forget that the more legislation there is around something, the more it becomes feasible only for a corporate person. Human persons just don't have the same rights or protections from liability.

  • It also raises the barrier to entry for newcomers, ensuring established large players continue to consolidate power, since they have the means to deflect and defend themselves from these regulations (unless there are specific carve-outs in place, of course).

    • This effectively makes censorship much simpler for the government: no need to chase down a million little sites, just casually lean on the few big ones remaining.

      1 reply →

  • This is something the EU got right for once in the DMA/DSA: It only applies starting from a certain, large size - if you're that big, you can afford the overhead.

  • It's not like there are laws that are more lenient with non-profits or with tiny companies, right?

    • The EU's digital markets act is one that got that right and I love it. But it's the exception to the rule. The vast majority of such laws are for the benefit of the corporations themselves, despite any ostensible purposes. And this is definitely in that latter category.

      8 replies →

  • If you read the guidance:

    https://www.ofcom.org.uk/siteassets/resources/documents/onli...

    It amounts to your basic terms of service. It means that you'll need to moderate your forums and prove that you have a moderation policy (basically what all decent forums do anyway). The crucial thing is that you need to record that you've done it and reassessed it, and prove that "you understand the 17 priority areas".

    It's similar to what a trustee of a small charity is supposed to do each year for due diligence.

    • Yep super simple. You just have to make individual value judgements every day on thousands of pieces of content for SEVENTEEN highly specific priority areas. Then keep detailed records on each value judgement such that it can hold up to legal scrutiny from an activist court official. Easy peasy.

      9 replies →

    • Most people don't have time to wade through that amount of bureaucratic legalese, much less put it into practice.

I hope you've at least spoken briefly to a good lawyer to understand how realistic your legal fears are. Understanding the legal system involves far more than just literally reading the text.

  • Maybe he did. Or maybe that is the first step down the very path that the OP doesn't want to walk.

    • Doing enough research to write this post, though, is already more work than quickly calling a lawyer.

> I run just over 300 forums, for a monthly audience of 275k active users

I can't imagine one person running over 300 forums with 275,000 active users. That gives you an average of eight minutes a week to tend to the needs of each one.

I used to run a single forum with 50,000 active users, and even putting 20 hours a week into it, I still didn't give it everything it needed.

I know someone currently running a forum with about 20,000 active users and it's a full-time job for him.

I don't understand how it's possible for one person to run 300 forums well.

  • He runs the infra and maintains the code. Each of those forums has its own moderators. They are basically his customers.

    I think what he fears is that he has no control over how these individual forums moderate their content, and how liable he would be as the hosting admin.

Remember when Omegle shut down recently? https://web.archive.org/web/20231109003559/https://omegle.co...

It seems that some people are convinced that the benefits of having strangers interact with each other are not worth the costs. I certainly disagree.

  • Omegle was widely known to be full of underage children and overage child sexual predators.

    If I designed a site for 14-year-old girls to sext with 30-year-old men it would be rightfully shut down.

    If I designed a site as a fun chat site, but in actual reality it became a sexting site between 14-year-old girls and adult men, should it be shut down?

I ran hosted forums for almost two decades, 4mm monthly users, etc., and can attest to the death threats and DDoS attempts (I was a very early customer of Cloudflare, which basically saved us. Thanks, Matthew!)

The stories… people get really personally invested in their online arguments, and all sorts of bad behavior stems from it.

It's insane that they never carved out any provisions for "non big-tech".

I feel like the whole time this was being argued and passed, everyone in power just considered the internet to be the major social media sites, and never considered that a single person or smaller group might run a site.

IMO you're going to get two groups of people emerging from this. One group will just shut down their sites to avoid running afoul of the rules, and the other will go the "go fuck yourself" route and continue to host anonymously.

  • > I feel like the whole time this was being argued and passed, everyone in power just considered the internet to be the major social media sites and never considered that a single person or smaller group will run a site.

    Does this shock you? I can't recall a politician discussing technology who wasn't, at best, cringe and, at worst, completely incompetent and factually wrong.

  • > It's insane that they never carved out any provisions for "non big-tech".

    That would be insane, and it's not true. You have to consider the risks and impacts of your service, and scale is a key part of that.

    I think it's really important to actually talk about what's in the requirements, and if you think something that has gone through this much scrutiny is truly insane (rather than just a set of tradeoffs you're on the other side of), then it's worth asking whether you have understood it. Maybe you have and lots of other people are extremely stupid, or maybe your understanding of it is off. If it's important to you in any way, it probably makes sense to check, right?

  • > It's insane that they never carved out any provisions for "non big-tech".

    There are only 13 provisions that apply to sites with fewer than 7 million users (10% of the UK population).

    7 of those are basically having an inbox where people can make a complaint and there is a process to deal with complaints.

    1 is having a 'report' button for users.

    2 say you will provide a "terms of service".

    1 says you will remove accounts if you think they're run by terrorists.

    The OP is blowing this out of proportion.

    • You are obviously glossing over a lot of the law, and ignoring that the penalty still appears to be "up to 18 million pounds". So no, there is a deliberate bias against smaller sites.

      1 reply →

  • > It's insane that they never carved out any provisions for "non big-tech".

    Very little legislation does.

    Two things my clients have dealt with: VATMOSS and GDPR. The former was fixed with a much higher ceiling for compliance, but not before causing a lot of costs and lost revenue to small businesses. Under GDPR, a small business or non-profit that just keeps simple lists of people (customers, donors, members, parishioners, etc.) has to put effort into complying, even though it holds a relatively small amount of people's data and does not use it outside the organisation. The rules are the same as for a huge social network that buys and sells information about hundreds of millions of people.

  • How is it insane? The target is non big-tech. Do you think Facebook cares they have to hire a couple of people to do compliance?

I'm ignoring the comments; they seem to be all about the posters themselves.

I have no knowledge of your site, but I'm still sad to see it having to shut down.

What we need is some entity set up in the United Arab Emirates, Ukraine, the Democratic Republic of the Congo, anywhere outside of where this law will matter, with sites turned over to locals, legally and in other ways.

The thing though is how to finance it and how to provide stewardship for the sites going forward.

Running sites like the one this post is about is not profitable, but nor is it too resource-intensive.

Are you Velocio? Thanks for all the hard work!

Sad that lfgss will probably become just another channel on one of the big platforms. RIP.

  • yup, and thank you for the kind words

    • I wonder how much of the risk could be mitigated by simply disabling the private message function of Microcosm? Surely having a report button helps with moderating the "public-facing part" of a forum?

      Having said that, thanks for all the work you have done. I was (and maybe still am) a member of lfgss, although I mostly lurked once in a long while without logging in and barely commented over the years.

      It is sad to see all online communities slowly migrate to discord, reddit and other walled gardens.

That's terrible. Hopefully the users can make backups (of at least what has already been posted, if not of their ongoing social connections) before the shutdown. It's good that you can provide such notice. Are you providing tarballs?

Okay, I'm putting up a new BBS in the US that is only going to be accessible via SSH from a modern terminal... UK users will be more than welcome.

I've been wanting to play with remote modern terminals and Ratatui anyway.

  • Or just use a news server and create a tilde with gopher/irc and so on, then federate it with the tildeverse.

It's much more responsible to put this whole thing into some nonprofit trust format and hand it over to someone with the time and energy to handle it. This also would not exclude you from volunteering.

  • I can understand them not wanting the due diligence burden (even if only self-imposed rather than required for any official reason) of picking a successor in this way, or the admin burden of setting up “some non-profit trust format”.

    Also depending on the terms agreed to when people signed on and started posting, it might be legally or morally difficult because transferring the data to the control of another party could be against the letter or the spirit of the terms users agreed to. Probably not, but I wouldn't want to wave such potential concerns off as “nah, it'll be fine” and hoping for the best.

    Even leaving a read-only version up, so a new home could develop with the old content remaining for reference, isn't risk free: the virtual-swatting risk that people are concerned about with this regulation would be an issue for archived content as much as live stuff.

    At least people have a full three months' notice. Maybe in that time someone can come up with a transfer and continuation plan the current maintainer is happy with; if not, the users at least have some time to try to move any social connectivity based around the site elsewhere.

> The act only cares that is it "linked to the UK" (by me being involved as a UK native and resident, by you being a UK based user), and that users can talk to other users... that's it, that's the scope.

So basically, is this act a ban on individual communication through under-moderated platforms?

This seems like a classic "Don't interrupt your adversary when they are making a mistake" situation.

The EU and UK have been making these anti-tech, anti-freedom moves for years. Nothing could be better if you are from the US: just hoover up talent from their continent.

  • Have you seen how difficult US immigration is to navigate? It's impossible for most people, and about to get even harder soon.

    Even if US immigration were more liberal, moving is very costly (financially, emotionally, psychologically). Injustice anywhere is a threat to justice everywhere.

In addition to the cookie privacy pop-over when viewing that site, I (as an American) am just amazed how regulated the internet is in Europe compared to the USA.

Is there an argument why we would want it any other way?

  • > In addition to the cookie privacy pop-over when viewing that site

    I don't know where you're seeing that as the site does not have such things. The only cookies present are essential and so nothing further was needed.

    The site does not track you, sell your data, or otherwise treat you as a source of monetisation. Without such things, conforming with cookie laws is trivial... you are compliant by simply not setting anything that isn't essential to providing the service.

    For most of the sites only a single session cookie is set, and for the few behind Cloudflare, Cloudflare's cookies get set too.

  • Why we would want it any other way than what? It's not clear to me whether you see the added European regulation as positive or negative.

I'm really sad this stuff is happening. For me the hobby sites are by far the best part of the internet. All the commercial stuff gets enshittified. I hardly use any commercial social media or forums anymore.

I don't believe this kind of regulation will do anything but put the real criminals more underground while killing all these helpful community initiatives. It's just window dressing for electoral purposes.

as a lurker here and at lfgss, just wanted to say lfgss exposed me to so much that i'm thankful for (not only fg/ss but also in cycling culture + more)

so thanks for all that buro9! <3

I'm wondering, would putting the forum behind an auth wall 'solve' this 'problem'? Forum users already have accounts, and signing up could be kept not too difficult for new users. Content would then not be accessible to unauthenticated users.

Or another thought: distribute it only through a VPN. OpenVPN can be installed on mobiles these days (I have it on my Android). Make key creation part of the registration process.
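A toy sketch of the "key creation at registration" idea. Everything here is hypothetical: `register`, `vpn_profile`, and the bare-token credential are illustrative, and a real OpenVPN deployment would issue client certificates from a CA (e.g. easy-rsa) rather than tokens; this only shows the shape of tying credential minting to signup.

```python
# Hypothetical sketch: mint a per-user credential at registration time,
# then embed it in a client profile. Stdlib only, so it runs anywhere.
import secrets

USERS: dict[str, str] = {}  # username -> access credential


def register(username: str) -> str:
    """Create the account and mint its access credential at signup."""
    if username in USERS:
        raise ValueError("username taken")
    token = secrets.token_hex(32)  # 256 bits of randomness, hex-encoded
    USERS[username] = token
    return token


def vpn_profile(username: str) -> str:
    """Render a (made-up) client config embedding the user's credential."""
    return f"# profile for {username}\nauth-token {USERS[username]}\n"
```

In a real setup, the signup handler would call the CA tooling instead of `secrets.token_hex`, and the profile would carry the signed certificate and key material.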

Why not hand it over to someone else who would take the risk?

Seems a bit megalomaniacal.

"I'm not interested in doing this any more. Therefore I'll shut it down for everyone"

  • Because then they're responsible, at least socially, for the things the new admin does.

    This way, people have been given plenty of advance notice and can start their own forums somewhere else instead. I'm sure each of the 300 subforums already has some people running it, and they could do the above if they actually cared.

    I find it hard to believe someone will take over 300 forums out of the goodness of their hearts and not start making it worse eventually, if not immediately.

Does everyone else remember when GDPR came out and everyone running a website was extradited to Europe and fined a billion pounds?

I don't understand this decision. Running a website as an individual is a liability risk for all sorts of reasons for which there are simple (and cheap) mitigations. Even if you believe this legislation is a risk, there are options other than shutting down. The overreaction here is no different than when GDPR came in, and we all collectively lost our minds and started shutting things down and then discovered there was zero consequence for mom-and-pop websites. I assume this isn't a genuine post and is actually an attempt at some sort of protest, with no intention of actually shutting down the websites. Or, more likely, they're just old and tired and ready to move on from this period of their life, running these websites.

  • the real risk I see is that as it's written, and as Ofcom are communicating, there is now a digital version of a SWATing for disgruntled individuals.

    the liability is very high, and whilst I would perceive the risk to be low if it were based on how we moderate... the real risk is what happens when one moderates another person.

    as I outlined, whether it's attempts to revoke the domain names with ICANN, or fake DMCA reports to hosting companies, or stalkers, or pizzas being ordered to your door, or being signed up to porn sites, or being DOX'd, or being bombarded with emails... all of this stuff has happened, and happens.

    but the new risk is that there is nothing about the Online Safety Act or Ofcom's communication that gives me confidence that this cannot be weaponised against myself, as the person who ultimately does the moderation and runs the site.

    and that risk changes even more in the current culture war climate, given that I've come out, and that those attacks now take a personal aspect too.

    the risk feels too high for me personally. it's, a lot.

    • > the real risk I see is that as it's written, and as Ofcom are communicating, there is now a digital version of a SWATing for disgruntled individuals.

      I'm sorry, what precisely do you mean by this? The rules don't punish you for illegal content ending up on your site, so you can't have a user upload something then report it and you get in trouble.

      6 replies →

  • I used to frequent the forum about 15 or so years ago. This guy is very level headed and has been around the block a lot. Therefore I don't believe this is purely performative.

    • I like and respect the OP and their work. I do not think this is consistent with his previous levelheadedness.

      edit: removed unintentional deadnaming

      2 replies →

  • A fair number of sites hosted and operated outside the European Union reacted to GDPR by instituting blocks of EU users, many returning HTTP 451. Regardless of whether you believe GDPR is a good idea or not (that's beyond the scope of this comment), the disparity in statutory and regulatory approaches, the widely varying (often poor) levels of 'plain language' clarity in obligations, and the inconsistent enforcement all lead to entirely understandable decisions like this and more of a divided internet.

    Thank you to those who have tirelessly run these online communities for decades, I'm sorry we can't collectively elect lawmakers who are more educated about the real challenges online, and thoughtful on real ways to solve them.

    • >A fair number of sites hosted and operated outside the European Union reacted to GDPR by instituting blocks of EU users, many returning HTTP 451.

      My view is that this is not the way to do it, because these things exist:

      - EU citizens living in non-EU countries (isn't GDPR supposed to apply to EU citizens worldwide?)

      - EU citizens using a VPN with an exit node in, or an IP address appearing to come from, a non-EU country

      Either comply with GDPR or just don't exist, period.

      1 reply →

  • What are the simple and cheap mitigations you have in mind?

    • Don't run a website personally, set up a separate legal entity. The UK is one of the easiest places in the world to do this and has well-understood legal entities that fit the model of a community-operated organisation (i.e: "community interest company"). The fact that the OP is running such a large community as an individual is bonkers in the first place, independent of this new act.

      32 replies →

Hopefully you guys can somehow fight this. Have you contacted any big news sites about it? I also think it's likely there are going to be a lot of judicial reviews and legal challenges to this; I don't see how it will hold up under the ECHR.

[flagged]

  • Ah, yes, “just” run every comment from 275k users through an error-prone system, while paying for every API call, to host something they already run at a loss.

    • Ah yes, "just" run every new comment through an AI system which costs peanuts for a binary response, and then covers them for having a moderation policy.

      2 replies →

[flagged]

  • > Block UK users

    The site is primarily focused on London/UK biking enthusiasts.

    > Make a forum that is only for UK users

    That is the forum for UK users.

    > Just ignore the law and fight it

    The linked post mentions that the fines for failure to comply start at £18 million. I'd understand not wanting to take that risk.

    > Setup the forums on bulletproof hosting which ignore such silly laws.

    I think this is the most viable strategy, but even then the site owner incurs risks through, e.g., ownership of the domain or considerable participation.

The laughable thing is believing that Ofcom has any budget to prosecute anyone, let alone a small website.

>Any monies donated in excess of what is needed to provide the service through to 16th March 2025 will be spent personally on unnecessary bike gear or astrophotography equipment, but more likely on my transition costs as being transfemme I can tell you there is zero NHS support and I'm busy doing everything DIY (reminder to myself, need to go buy some blood tests so I can guess my next dosage change)... Not that I imagine there will be an excess, but hey, I must be clear about what would happen if there were an excess.

I would argue the honorable thing to do in the event excess monies remain would be to donate it to a charity. Using it for personal ends, whatever the details, is wrong because that's not what the donations were for.

I haven't read the act and am not going to, but for a community of this size I'm pretty sure having a flag/report button would do the trick, and to go the extra mile, having very cheap LLMs generate a "dodgy content" score on every message would be pretty trivial. Deleting the whole site seems a bit of a knee-jerk reaction.
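A hedged sketch of what that per-message LLM pre-screen could look like. The model call is stubbed so the example runs offline; `MODERATION_PROMPT`, the flagged terms inside the stub, and the 0.5 threshold are all illustrative, not anything any real forum uses.

```python
# Sketch of a per-message "dodgy content" score. call_llm() is a stub
# standing in for any cheap hosted model API; its keyword check exists
# only so this example runs without network access.

MODERATION_PROMPT = (
    "Rate the following forum post for illegal or harmful content on a "
    "scale from 0 (benign) to 1 (clearly problematic). Reply with only "
    "the number.\n\nPost: "
)


def call_llm(prompt: str) -> str:
    # Stub: a real deployment would POST `prompt` to a model endpoint.
    # Here we pretend the model flags a couple of obvious terms.
    post = prompt.rsplit("Post: ", 1)[-1].lower()
    return "0.9" if any(w in post for w in ("threat", "dox")) else "0.1"


def dodgy_score(message: str) -> float:
    """Ask the (stubbed) model for a 0..1 risk score for one message."""
    try:
        return min(1.0, max(0.0, float(call_llm(MODERATION_PROMPT + message))))
    except ValueError:
        return 1.0  # unparseable model reply: fail closed, queue for a human


def needs_review(message: str, threshold: float = 0.5) -> bool:
    """Flag a message for human moderation when the score crosses threshold."""
    return dodgy_score(message) >= threshold
```

Even with such a filter, the score only triages messages into a human review queue; it doesn't by itself address the record-keeping or liability questions raised elsewhere in this thread.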