Comment by fake-name

3 years ago

Cloudflare is likely one of the worst things that has happened to the internet in recent history.

Like, I get the need for some protective mechanisms for interactive content/posting/etc, but there should be zero cases where a simple HTTP 200 GET requires javascript/client side crap. If they serve me a slightly stale version of the remote resource (5 minutes/whatnot) that's fine.
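In HTTP terms, "slightly stale is fine" already maps onto standard cache semantics: a CDN sitting in front of the origin can declare a freshness window and even serve stale copies while revalidating in the background. A hypothetical response advertising a 5-minute cache (the specific values here are illustrative, not anything Cloudflare actually sends):

```http
HTTP/1.1 200 OK
Cache-Control: public, max-age=300, stale-while-revalidate=60
Age: 120
```

With headers like these, an edge cache can answer a plain GET without ever touching the origin or demanding JavaScript from the client.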

They've effectively just turned into a Google protection racket. Small/special-purpose search/archive tools are just stonewalled.

You can't turn it off as a Cloudflare customer either.

The best you've got is "essentially off", and the wording is deliberate: even with everything disabled, there are still edge cases where their security layer will enforce a JS challenge or CAPTCHA.

  • At least on their basic plan there is also little to no indication of how often this is triggering, leaving you with no idea what the various settings are actually doing.

Not to be too dismissive of this, but for companies just trying to run a service while getting constantly bombarded by things like DDoS attacks, Cloudflare and its ilk let them serve a large portion of "legitimate" users, compared to none.

I don't really know how you resolve that absent just like... putting everything behind logins, though.

> If they serve me a slightly stale version of the remote resource (5 minutes/whatnot) that's fine.

Not all sites are configured to do this. Some pages are expensive to render and have no cache layer.

  • I get that; my point is that this is the problem.

    They solve the DDoS issue by requiring JS captchas (which fundamentally breaks the way the internet should work), rather than serving a cached copy of the page to reduce load on the real host.

    Requiring JS doesn't disambiguate between well-behaved automated (or headless; I use a custom proxy for a lot of my content browsing) user agents and malicious ones, it breaks /all/ of them.

  • Some people shoot themselves in the foot, yes. There is no reason not to have some amount of microcaching: even a very short TTL puts an upper limit on the request rate per resource that can reach the origin behind the caching layer.
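The microcaching idea above can be sketched as a tiny TTL cache in front of an expensive render function. This is a minimal illustration, not Cloudflare's actual mechanism; `render_page` and the 5-second TTL are assumptions for the example:

```python
import time

class MicroCache:
    """Tiny TTL cache: serves a slightly stale copy instead of
    re-rendering the page on every request."""

    def __init__(self, render, ttl=5.0):
        self.render = render   # expensive page-render function
        self.ttl = ttl         # seconds a cached copy stays "fresh enough"
        self.store = {}        # path -> (rendered_at, body)

    def get(self, path):
        now = time.monotonic()
        hit = self.store.get(path)
        if hit is not None and now - hit[0] < self.ttl:
            return hit[1]      # serve cached copy; origin is untouched
        body = self.render(path)
        self.store[path] = (now, body)
        return body

# Hypothetical expensive render; the counter shows how rarely it runs.
renders = []
def render_page(path):
    renders.append(path)
    return f"<html>{path}</html>"

cache = MicroCache(render_page, ttl=5.0)
for _ in range(1000):
    cache.get("/expensive")
```

Within one TTL window, those 1000 requests cost the origin a single render — the upper limit on per-resource load that the parent comment describes.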