Comment by user____name

8 days ago

The problem with browsers is that they rely on other standards. If a browser needs to maintain backward compatibility and depends on an evolving standard that ALSO requires backward compatibility, that acts as a multiplier on implementation complexity. It also means it becomes increasingly difficult to spin up a new browser engine, and the fewer of those there are, the easier it is for the big ones to just add whatever they want and have it become a standard, which reinforces the issue.

XSLT adds ~100k lines of specialized code to browser engines with near-zero telemetry usage (<0.01% of page loads), making it a prime example of the maintenance burden vs. utility problem you're describing.

  • > near-zero telemetry usage (<0.01% of page loads),

    Perhaps without realising it, you are still describing numbers in the hundreds of millions, perhaps billions, which isn't even close to zero once you count the individual page loads themselves:

    Google does roughly $200 billion USD in annual ad revenue. At even a mere $1 CPM, that's 200 trillion impressions a year.

    They put 5 ads on every page and we're at 40 trillion page loads a year that Google knows about.

    You tell me that even 0.001% of that is XML/XSLT and we're still talking hundreds of millions of page loads every year (back-of-envelope sketch below).

    And that's just the ones Google knows about: pages without Google ads, like say https://www.congress.gov/117/bills/hr3617/BILLS-117hr3617ih...., wouldn't show up in that telemetry at all, which suggests the real figure should be much higher.
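
    A minimal back-of-envelope sketch of that arithmetic, in Python; the revenue, CPM, ads-per-page, and 0.001% figures are the assumptions stated above, not measured values:

      # All inputs are the comment's assumptions, not measured data
      ad_revenue_usd = 200e9       # ~$200B annual Google ad revenue
      cpm_usd = 1.0                # assumed $1 per 1,000 impressions
      ads_per_page = 5             # assumed ads shown per page load
      xslt_share = 0.00001         # 0.001% expressed as a fraction

      impressions = ad_revenue_usd / cpm_usd * 1000   # ~200 trillion/year
      page_loads = impressions / ads_per_page         # ~40 trillion/year
      xslt_page_loads = page_loads * xslt_share       # ~400 million/year

      print(f"{impressions:,.0f} impressions, {page_loads:,.0f} page loads, "
            f"{xslt_page_loads:,.0f} XSLT page loads per year")

    Even with those deliberately conservative inputs, the "near-zero" share still works out to roughly 400 million page loads a year.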