Comment by hombre_fatal

3 years ago

Well, until you have an algo that can mind read, "I'm not a spammer guys, gosh!" isn't good enough, I'm afraid.

And yes, it's annoying that we live in that world. In 1999 you could probably assume a request was human with a User-Agent regex.
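The 1999-era check really was about that simple: match the User-Agent header against a short list of known crawler strings. A minimal sketch of that naive approach (the patterns here are illustrative, not an exhaustive list):

```python
import re

# Naive 1999-style bot check: flag a request if its User-Agent matches
# a handful of known crawler/tool substrings. (Patterns are illustrative.)
BOT_PATTERN = re.compile(r"(bot|crawler|spider|curl|wget|libwww)", re.IGNORECASE)

def looks_like_bot(user_agent: str) -> bool:
    # A missing or empty User-Agent was also treated as suspicious.
    if not user_agent:
        return True
    return bool(BOT_PATTERN.search(user_agent))
```

The obvious weakness, and the reason this era ended, is that the header is entirely client-controlled: any bot can send a browser-looking User-Agent.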

In 2024, your smart toaster could be saturating your AT&T Fiber uplink without you even knowing while you're rage-posting in Cloudflare's forums about HAR files and how you're not a bot.

> until you have an algo that can mind read, "I'm not a spammer guys, gosh!" isn't good enough, I'm afraid.

As mentioned, it works fine in Chrome on the same computer. CloudFlare has engaged and is investigating, thanks to this HN post.

  • A single Chrome install is easier to identify than a single Firefox install with default settings. Firefox is also an outlier in terms of global browser traffic (3-5% for normal websites).

    • If there is some Firefox privacy feature that CloudFlare considers overbearing, I'd consider turning it off, but I don't even know what the problem is. CloudFlare provides zero diagnostics and didn't engage in the community post. Those last two points are what annoy me. If CloudFlare has some philosophical disagreement with Firefox, then fine, but tell me what it is so that I can consider disabling the Firefox feature.

> Well, until you have an algo that can mind read, "I'm not a spammer guys, gosh!" isn't good enough, I'm afraid.

Yet read-only access to websites, which by definition can't be used for spam, is also locked behind Cloudflare. It's the same old excuse every time: they're given a legitimate inch for security, but they take a mile.

Most telling is that you don't even get heavily rate-limited access to a website without passing Cloudflare's filter. Because then your actual behavior could be used to determine if/how much of a DDoS threat you are. But that would take away Cloudflare's excuse to monitor users, so they prefer to use absolutes.
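The behavior-based alternative described here could be as simple as a per-client token bucket: let every client through slowly, and only escalate against clients that actually exhaust their budget. A minimal sketch, with made-up rate and burst numbers purely for illustration:

```python
import time
from dataclasses import dataclass, field

@dataclass
class TokenBucket:
    # Illustrative limits: 1 request/second sustained, bursts of up to 5.
    rate: float = 1.0       # tokens refilled per second
    capacity: float = 5.0   # maximum burst size
    tokens: float = 5.0
    last: float = field(default_factory=time.monotonic)

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A server keeping one bucket per client IP (or session) can serve cached read-only pages to everyone while throttling anything that behaves like a flood, without demanding the client prove anything up front.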

  • But it can be used for DDoS, especially when the content is dynamic

    • Most websites have no business serving truly dynamic content to anonymous users. If a basic page request costs enough that you need to engage in this kind of overzealous blocking, then you should fix your page generation and caching.

I propose we begin implementing some responsibility for internet actors. If my car leaks oil on the road, that is my responsibility to fix, yet I did not manufacture the car.

I propose that we make owners of shitty devices responsible for their actions. If my internet-of-shit thermostat begins spamming people, that would be my responsibility; if it participates in a DDoS, that would become my responsibility.

  • That's already true. If you're found sending abusive traffic, you might get sued, get sent a C&D, and/or your ISP might cut off your internet.

    But similar to somebody's leaky car, good luck finding them and enforcing they actually clean it up.