
Comment by userbinator

9 days ago

> but on the other hand, having a rising browser engine might eventually remove this avenue for fingerprinting

If what I've seen from CloudFlare et al. is any indication, it's the exact opposite: the amount of fingerprinting and "exploitation" of implementation-defined behaviour has increased significantly in the past few months, likely in an attempt to kill off other browser engines; the incumbents do not like competition at all.
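
Concretely, fingerprinting via implementation-defined behaviour looks something like the sketch below, which probes two well-known engine quirks (stack trace formatting and built-in error message wording). The matched strings and the classification are illustrative only, not any particular vendor's checks:

```typescript
// Sketch of engine fingerprinting via implementation-defined behaviour.
// The probed quirks and matched strings are illustrative, not any
// vendor's actual checks.
function probeEngine(): string {
  // Error.prototype.stack formatting is not standardized and differs
  // between V8, SpiderMonkey, and JavaScriptCore.
  const stack = new Error("probe").stack ?? "";

  // The wording of built-in error messages is also implementation-defined.
  let typeErrorText = "";
  try {
    (null as any).missing;
  } catch (e) {
    typeErrorText = (e as TypeError).message;
  }

  if (stack.startsWith("Error") && typeErrorText.includes("Cannot read")) {
    return "V8-like"; // Chrome, Edge, ...
  }
  if (typeErrorText.includes("null")) {
    return "SpiderMonkey- or JSC-like"; // Firefox, Safari, ...
  }
  // A new or niche engine often ends up here, and it's exactly this
  // bucket that tends to get challenged or blocked.
  return "unrecognized engine";
}
```

A browser built on a new engine lands in that last branch by default, which is precisely the fingerprinting avenue the quoted comment hopes would go away.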

The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

It's entirely deliberate. CloudFlare could certainly distinguish low-volume but legitimate web browsers from bots, just as they can distinguish Chrome/Edge/Safari/Firefox from bots. That is, if they cared to.
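
The check wouldn't have to be exotic, either. A hypothetical sketch of that kind of classification follows; the signals, threshold, and fingerprint values are invented for illustration and are not CloudFlare's actual logic:

```typescript
// Hypothetical classifier of the kind the parent comment is gesturing at.
// The signals, thresholds, and fingerprint values are invented for
// illustration; this is not CloudFlare's actual logic.
interface RequestProfile {
  userAgent: string;            // claimed browser
  tlsFingerprint: string;       // e.g. a JA3-style hash of the TLS ClientHello
  requestsPerMinute: number;
  runsChallengeScript: boolean; // did the client execute the JS challenge?
}

// Fingerprints of mainstream engines, learned from observed traffic.
// A small or brand-new engine simply isn't in this table yet.
const KNOWN_FINGERPRINTS = new Set(["chrome-ja3", "firefox-ja3", "safari-ja3"]);

function classify(p: RequestProfile): "user" | "bot" | "unknown" {
  if (p.requestsPerMinute > 600) return "bot"; // automated request volume
  if (!p.runsChallengeScript) return "bot";    // never executes challenges
  if (KNOWN_FINGERPRINTS.has(p.tlsFingerprint)) return "user";
  // Low-volume, JS-capable traffic from an unrecognized engine lands here;
  // whether it's treated like a user or challenged is a policy choice.
  return "unknown";
}
```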

Hold up, one of those things is not like the other. Are we really blaming webmasters for 100x increases in costs from a huge wave of poorly written and maliciously aggressive bots?

  • > Are we really blaming...

    No, they're discussing increased fingerprinting / browser profiling recently and how it affects low-market-share browsers.

    • I saw that, but I'm still not sure how this fits in:

      > The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

      I'm reading that as `enemy == fingerprinters`, `that == AI bots DDoSing`, and `their own == webmasters, hosting providers, and CDNs (i.e., the fingerprinters)`, which sounds pretty straightforwardly like the fingerprinters are responsible for the DDoSing they're receiving.

      That interpretation doesn't seem to match the rest of the post though. Do you happen to have a better one?


  • Your costs only went up 100x if you built your site poorly

    • I'll bite. How do you serve 100x the traffic without 100x the costs? It costs something like 1e-10 dollars to serve a recipe page with a few photos, for example. If you serve it 100x more times, how does that not scale up?


I don't think they're doing this to kill off browser engines; they're trying to sift browsers into "user" and "AI slop", so they can prioritize users.

This is entirely a web crawler 2.0 apocalypse.

  • I think "slop" only refers to the output of generative AI systems. "Bot", "crawler", "scraper", or "spider" would be a more apt term for software making (excessive) requests to collect data.