Comment by ebiester

7 months ago

I think you're not being charitable enough. The vast majority of congress critters are pretty smart people, and by Jeff Jackson's account, even the ones who yell the loudest are yelling because of their incentives and are generally reasonable behind closed doors.

The problem is that the real problems are very hard, and their job is to simplify them to their constituents well enough to keep their jobs, which may or may not line up with doing the right thing.

This is a truly hard problem. CSAM is a real problem, and those who engage in its distribution are experts at subverting the system. Freedom of expression is a real concern too, and so is the onerous burden that regulation imposes.

And any such issue (whether it be transnational migration, or infrastructure, or EPA regulations in America, or whatever issue you want to bring up) is going to involve some very complex tradeoffs; even with a room full of Ph.D.s and no political pressure, you are still going to end up with uncomfortable tradeoffs.

What if the regulations are bad because the problem is so hard we can't make good ones, even with the best and brightest?

It's ridiculous to say that a bad law is better than no law at all. If the law has massive collateral damage and little to no demonstrated benefit, then it's just a bad law and should never have been made.

It seems far too common that regulations put the liability and responsibility for a problem onto some group of people who are not the cause of the problem and who, further, have limited power to do anything about it.

As they say, this is why we can't have nice things.

  • > responsibility for a problem onto some group of people who are not the cause of the problem

    You don't think Meta, TikTok, etc. are the cause of the problem?

    I appreciate that Lfgss is somewhat collateral damage but the fact is that if you're going to run a forum you do have some obligation to moderate it.

    • The "collateral damage" you're talking about represented the UK's best answer to Meta - a UK-run collection of online communities that people were choosing to use instead of foreign alternatives. If they ban running them domestically then everybody will use American ones...

    • > some obligation to moderate it

      "some"?

      > The Act would also require me to scan images being uploaded for Child Sexual Abuse Material and other harmful content, it requires me to register as the responsible person for this and file compliance. It places technical costs, time costs, risk, and liability, onto myself as the volunteer who runs it all... and even if someone else took it over those costs would pass to them if the users are based in the UK.

      There is no CSAM ring hiding on this cycling forum. The notion that every service which transmits data from one user to another has to file compliance paperwork and pay to use a CSAM hashing service is absurd.


    • > I appreciate that Lfgss is somewhat collateral damage but the fact is that if you're going to run a forum you do have some obligation to moderate it.

      Lfgss is heavily moderated, just maybe not in a way you could prove to a regulator without an expensive legal team...

    • True, we absolutely couldn’t allow a place where people voluntarily gather to say things to exist without a governing body deciding what is and isn’t allowed to be said.

> What if the regulations are bad because the problem is so hard we can't make good ones, even with the best and brightest?

To begin with, the premise would have to be challenged. Many, many bad regulations are bad because of incompetence or corruption rather than because better regulations are impossible. But let's consider the case where there really are no good regulations.

This often happens when bad actors have more resources, or are willing to spend more of them, to subvert a system than ordinary people do. For example, suppose the proposal is to ban major companies from implementing end-to-end encryption so the police can spy on terrorists. That's not going to work very well, because the terrorists will just switch to a different system that provides E2EE anyway; what you're really doing is compromising the security of all the law-abiding people, who are now more vulnerable to criminals, foreign espionage, and so on.

The answer in these cases, where there are only bad policy proposals, is to do nothing. Accept that you don't have a good solution, that a bad solution makes things worse rather than better, and that the absence of any rule, imperfect as the outcome may be, is the best we know how to do.

The classic example of this is the First Amendment. People say bad stuff, we don't like it, they suck and should shut up. But there is nobody you can actually trust to be the decider of who gets to say what, so the answer is that nobody decides for everybody, and imposing government punishment for speech is forbidden.

  • > The answer in these cases, where there are only bad policy proposals, is to do nothing.

    Or go further.

    Sometimes the answer is to remove regulations. Specifically, those laws that protect wrongdoers and facilitators of problems. Then you just let nature take its course.

    For the most part, though, this is considered inhumane and unacceptable.

    • Sometimes we do exactly that. In general, if someone is trying to kill you, you are allowed to try and kill them right back. It's self-defense.

      If you're talking about legalizing vigilantism, you would then have to argue that it is a better system and less prone to abuse than some variant of the existing law enforcement apparatus. If you could actually make that argument, it would imply that we should do exactly that. But in general, vigilantes have serious problems with accurately identifying targets and with collateral damage.


And sometimes good regulations are really hard for the uninformed to swallow, while bad regulations sound really good on paper.

"children are getting raped and we aren't going to do anything about it because we want to protect indie websites" sounds a lot worse than "this is a significant step in combatting the spread of online child pornography", even if reality is actually far more complicated.

> This is a truly hard problem.

CSAM is NOT a hard problem. You solve it with police work. That's how it always gets solved.

You don't solve CSAM with scanners. You don't solve CSAM with legislation. You don't solve CSAM by banning encryption.

You solve CSAM by giving money to law enforcement to go after CSAM.

But, see, the entities pushing these laws don't actually care about CSAM.

“Their job is to simplify them to their constituents well enough to keep their jobs” sounds awfully similar to what I’m saying. Maybe “don’t care” is a little too absolute, but it doesn’t make much difference whether they don’t care or whether they care but their priority is still keeping their jobs.