Comment by superkuh

7 months ago

People seem to forget that the more legislation there is around something, the more it is only feasible to do if you are a corporate person. Human persons just don't have the same rights or protections from liability.

It also raises the barrier to entry for newcomers, ensuring established large players continue to consolidate power, since they have the means to deflect and defend themselves from these regulations (unless there are specific carve-outs in place, of course).

  • This effectively makes censorship much simpler for the government: no need to chase down a million little sites, just casually lean on the few big ones remaining.

This is something the EU got right for once in the DMA/DSA: it only applies starting from a certain, large size; if you're that big, you can afford the overhead.

It's not like there are laws that are more lenient with non-profits or with tiny companies, right?

  • The EU's Digital Markets Act is one that got that right, and I love it. But it's the exception to the rule. The vast majority of such laws are for the benefit of the corporations themselves, despite their ostensible purposes. And this one is definitely in that latter category.

    • "glad that EU overregulation doesn't hamper the freedom of the United kingdom any more."

      what can we do about this creep up of totalitarian surveillance plutocracy?

      sweet were the 1990s with a dream.of.information access for all.

      little did we know we were the information being accessed.

      srry

      very un-HN-y.. maybe it's just the time of the year but this really pulls me down currently.

    • Also, a lot of other EU regulations do the same.

      Sometimes it's explicitly mentioned, but oftentimes it's hidden behind "appropriate and proportionate measures".

If you read the guidance:

https://www.ofcom.org.uk/siteassets/resources/documents/onli...

It amounts to your basic terms of service: you'll need to moderate your forums and prove that you have a moderation policy (basically what all decent forums do anyway). The crucial thing is that you need to record that you've done it and reassessed it, and demonstrate that "you understand the 17 priority areas".

It's similar to what a trustee of a small charity is supposed to do each year as due diligence.

  • Yep, super simple. You just have to make individual value judgements every day on thousands of pieces of content across SEVENTEEN highly specific priority areas, then keep detailed records of each value judgement such that it can hold up to legal scrutiny from an activist court official. Easy peasy.

    • No, not at all. You need to consider your service's risks against those seventeen categories once, and then review your assessment at least every year.

      From the linked document above: "You need to keep a record of each illegal content risk assessment you carry out", "service providers may fully review their risk assessment (for example, as a matter of course every year)"

      And links to a guidance document on reviewing the risk assessment[1] which says: "As a minimum, we consider that service providers should undertake a compliance review once a year".

      [1] https://www.ofcom.org.uk/siteassets/resources/documents/onli...
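
      As a rough sketch of what that record-keeping could look like in practice (the structure here is hypothetical; the guidance says what to record, not what format to use):

        from dataclasses import dataclass, field
        from datetime import date

        @dataclass
        class RiskAssessment:
            service_name: str
            assessed_on: date
            # One entry per priority area: area name -> brief judgement
            # of the risk and the mitigation in place.
            findings: dict[str, str] = field(default_factory=dict)

            def due_for_review(self, today: date) -> bool:
                # "As a minimum ... a compliance review once a year"
                return (today - self.assessed_on).days >= 365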

    • > You just have to make individual value judgements every day on thousands of pieces of content

      That's simply not true.

    • If you read that guidance, it wants you to have a moderation policy covering 17 specific priority areas. You need to be able to demonstrate that you have thought about it, and keep a paper trail showing that you have a policy and that it's actually applied. You _could_ be issued with an "information notice", which you have to comply with; but as a communications provider you could already get that under RIPA.

      This is similar to running a cricket club or a scout group.

      When running a scout association, each session could technically require an individual risk assessment for every piece of equipment and activity. The hall needs to be safe, and you need to prove that it's safe. On top of that there's GDPR, safeguarding, background checks, and money-laundering rules.

      > hold up to legal scrutiny from an activist court official

      It's not the USA: activist court officials require a functioning court system. Plus, common law has the concept of "reasonable". A moderated forum will have a much higher standard of moderation than Facebook/Twitter/TikTok.

    • Any competent forum operator is already doing all of this (and more) just without the government-imposed framework. Would the OP allow CSAM to be posted on their website? No. Would the OP contact the authorities if they caught someone distributing CSAM on their website? Yes. Forum administrators are famous (to the point of being a meme) for their love of rules and policies and procedures.

  • Is the trustee of a small charity on the hook for £18,000,000 in minimum fines?

    • The maximum fines are 10% of "qualifying worldwide revenue", or £18M, whichever is larger. This is an exercise in stopping companies from claiming tiny revenues when they're actually much larger, rather than fining genuinely tiny companies (or individuals) a ridiculous multiple of their value (or wealth).

      Plenty of things in UK law attract "an unlimited fine", but even that doesn't lead to people actually being fined amounts greater than all the money that's ever existed.
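
      For concreteness, a minimal sketch of that cap under the "whichever is larger" reading above (the revenue figures are made up):

        def max_fine(qualifying_worldwide_revenue: float) -> float:
            # Cap is the greater of 10% of qualifying worldwide
            # revenue or the fixed £18M figure.
            return max(0.10 * qualifying_worldwide_revenue, 18_000_000)

        max_fine(1_000_000_000)  # big platform: cap is £100M
        max_fine(50_000)         # tiny forum: the £18M floor, on paper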

    • Trustees for small charities can be personally liable for unlimited amounts.

      GDPR, safeguarding, liability for the building you operate in, money laundering: there are lots of laws you are liable under.

  • Most people don't have time to wade through that amount of bureaucratic legalese, much less put it into practice.