Comment by aimazon

7 months ago

I don't understand this decision. Running a website as an individual is a liability risk for all sorts of reasons for which there are simple (and cheap) mitigations. Even if you believe this legislation is a risk, there are options other than shutting down. The overreaction here is no different from when GDPR came in, and we all collectively lost our minds and started shutting things down, only to discover there were zero consequences for mom-and-pop websites. I assume this isn't a genuine post and is actually an attempt at some sort of protest, with no intention of actually shutting down the websites. Or, more likely, they're just old and tired and ready to move on from this period of their life spent running these websites.

the real risk I see is that as it's written, and as Ofcom are communicating, there is now a digital version of SWATting for disgruntled individuals.

the liability is very high, and whilst I would perceive the risk to be low if it were based on how we moderate... the real risk is what happens when one moderates another person.

as I outlined, whether it's attempts to revoke the domain names with ICANN, or fake DMCA reports to hosting companies, or stalkers, or pizzas being ordered to your door, or being signed up to porn sites, or being DOX'd, or being bombarded with emails... all of this stuff has happened, and happens.

but the new risk is that there is nothing in the Online Safety Act or Ofcom's communication that gives me confidence that this cannot be weaponised against me, as the person who ultimately does the moderation and runs the site.

and that risk changes even more in the current culture war climate, given that I've come out, and that those attacks now take on a personal aspect too.

the risk feels too high for me personally. it's a lot.

  • > the real risk I see is that as it's written, and as Ofcom are communicating, there is now a digital version of SWATting for disgruntled individuals.

    I'm sorry, what precisely do you mean by this? The rules don't punish you for illegal content ending up on your site, so a user can't simply upload something, then report it, and get you in trouble.

    • Yes you can: https://www.ofcom.org.uk/siteassets/resources/documents/onli...

      A forum that isn't proactively monitored (approval before publishing) is in the "Multi-Risk service" category (see page 77 of that link), and the "kinds of illegal harm" include things as obvious as "users encountering CSAM" and as nebulous as "users encountering Hate".

      Does no-one recall Slashdot and the https://en.wikipedia.org/wiki/Gay_Nigger_Association_of_Amer... trolls? Such activity would make the site owner liable under this law.

      You might glibly reply that we should moderate, take it down, etc... but "we" is me... a single individual who likes to go hiking off-grid for a vacation and to look at the stars at night. There are enough times when I could not respond in a timely way to moderate things.

      This is what I mean by the Act providing a weapon to disgruntled users, trolls, and those who have been moderated... a service providing user-generated content in a user-to-user environment can trivially be weaponised, and it will be a very short amount of time before it happens.

      Forum invasions by 4chan and others make this extremely obvious.

I used to frequent the forum about 15 or so years ago. This guy is very level-headed and has been around the block a lot. Therefore I don't believe this is purely performative.

  • I like and respect the OP and their work. I do not think this is consistent with their previous level-headedness.

    edit: removed unintentional deadnaming

A fair number of sites hosted and operated outside the European Union reacted to GDPR by instituting blocks of EU users, many returning HTTP 451. Regardless of whether you believe GDPR is a good idea or not (that's beyond the scope of this comment), the disparity in statutory and regulatory approaches, the widely varying (often poor) levels of 'plain language' clarity in obligations, and the inconsistent enforcement all lead to entirely understandable decisions like this and to a more divided internet.
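
To make that concrete, here is a minimal sketch of the kind of geo-block those sites used, answering EU visitors with 451 Unavailable For Legal Reasons (RFC 7725). The Flask wiring, the country_for_ip() lookup, and the country list are illustrative assumptions on my part, not any particular site's implementation:

    # Minimal sketch: refuse requests that geolocate to the EU with HTTP 451.
    # country_for_ip() is a hypothetical GeoIP lookup; a real deployment would
    # use MaxMind GeoLite2 or a CDN-provided country header instead.
    from flask import Flask, request

    app = Flask(__name__)

    # Illustrative subset only; a real block list would cover every EU/EEA member.
    EU_COUNTRIES = {"AT", "BE", "DE", "DK", "ES", "FR", "IE", "IT", "NL", "PL", "SE"}

    def country_for_ip(ip: str) -> str:
        """Hypothetical lookup; this stub always answers "US" so the sketch runs."""
        return "US"

    @app.before_request
    def block_eu_visitors():
        if country_for_ip(request.remote_addr or "") in EU_COUNTRIES:
            # Returning a value from before_request short-circuits the request.
            return ("Unavailable for legal reasons (GDPR).", 451)

    @app.route("/")
    def index():
        return "Hello, non-EU visitor."

Crude, but that is roughly the whole decision: rather than interpret unclear obligations, operators simply walled off the jurisdiction.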

Thank you to those who have tirelessly run these online communities for decades. I'm sorry we can't collectively elect lawmakers who are more educated about the real challenges online and more thoughtful about real ways to solve them.

  • >A fair number of sites hosted and operated outside the European Union reacted to GDPR by instituting blocks of EU users, many returning HTTP 451.

    My view is that this isn't the right way to do it, because these cases exist:

    - EU citizens living in non-EU countries (isn't GDPR supposed to apply to EU citizens worldwide?)

    - EU citizens using a VPN with an exit node in, or spoofing an IP address from, a non-EU country

    Either comply with GDPR or just don't exist, period.

    • China or Russia also have "interesting" data protection / "let's protect the children" laws, some of them formulated in the same way as GDPR, so a VPN doesn't help. Why should those be ignored? (Other than "but that's a DIFFERENT thing, the EU are the good ones".)

What are the simple and cheap mitigations you have in mind?

  • Don't run a website personally; set up a separate legal entity. The UK is one of the easiest places in the world to do this and has well-understood legal entities that fit the model of a community-operated organisation (e.g. a "community interest company"). The fact that the OP is running such a large community as an individual is bonkers in the first place, independent of this new act.

    • It raises the cost and hassle involved from "I need a cheap hosting package" to "I need to do paperwork, keep and file accounts, etc."

    • Are you claiming that setting up a CIC removes individual liability for wrongdoing? So, I set up a CIC for running forums, with $0 of assets and negligible running costs, and then in the event of a fine I'm scot-free?
