Comment by otterley
3 months ago
Also, according to Chrome's telemetry, very, very few websites are using it in practice. It's not like the proposal is threatening to make some significant portion of the web inaccessible. At least we can see the data underlying the proposal here.
Sadly, I just built a web site with HTMX and am using the client-side-templates extension for client-side XSLT.
>very, very few websites
Doesn't include all the corporate websites that they are probably blocked from collecting such telemetry on. Those are the users who are pushing back.
Does that library use the browser's XSLT?
I'm curious about the scope of the problem and what the solutions would be if the HTML spec drops XSLT; I've never really used XSLT (once, maybe, 20 years ago). Besides just pre-rendering your webpage server-side, I assume another possible solution is some JavaScript library that does the transformations, if they needed to happen client-side?
Found a JS-only library, so someone has done this before: https://www.npmjs.com/package/xslt-processor
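For context, here's a minimal sketch of what client-side XSLT looks like with the browser's built-in API today, i.e. the API the proposal would remove (the file URLs and function name here are made up for illustration):

    // Fetch an XML document and an XSLT stylesheet, then transform
    // them client-side with the browser's native XSLTProcessor.
    async function renderWithNativeXslt(xmlUrl, xslUrl) {
      const [xmlText, xslText] = await Promise.all([
        fetch(xmlUrl).then((r) => r.text()),
        fetch(xslUrl).then((r) => r.text()),
      ]);

      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(xmlText, "application/xml");
      const xslDoc = parser.parseFromString(xslText, "application/xml");

      const processor = new XSLTProcessor(); // the API slated for removal
      processor.importStylesheet(xslDoc);
      // Returns a DocumentFragment ready to insert into the page.
      return processor.transformToFragment(xmlDoc, document);
    }

    renderWithNativeXslt("/records.xml", "/records-view.xsl")
      .then((fragment) => document.body.append(fragment));

A pure-JS library like the one above would keep the fetch-and-parse part and swap the three XSLTProcessor lines for its own engine, at the cost of shipping that engine to every visitor.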
1. Chrome telemetry underreports a lot of use cases
2. They have a semi-internal document https://docs.google.com/document/d/1RC-pBBvsazYfCNNUSkPqAVpS... that explicitly states that a small usage percentage doesn't mean you can safely remove a feature (the quoted arithmetic is sanity-checked after this list):
--- start quote ---
As a general rule of thumb, 0.1% of PageVisits (1 in 1000) is large, while 0.001% is considered small but non-trivial. Anything below about 0.00001% (1 in 10 million) is generally considered trivial.
There are around 771 billion web pages viewed in Chrome every month (not counting other Chromium-based browsers). So seriously breaking even 0.0001% still results in someone being frustrated every 3 seconds, and so not to be taken lightly!
--- end quote ---
3. Any feature removal on the web has to be given thorough thought and investigation, which we haven't seen. The Library of Congress apparently uses XSLT, and the Chrome devs couldn't care less.
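To make the quoted rule of thumb in point 2 concrete, a quick back-of-the-envelope check (a sketch, assuming a 30-day month):

    // Sanity-checking the quoted numbers.
    const pageViewsPerMonth = 771e9;     // from the quote
    const breakageShare = 0.0001 / 100;  // 0.0001% as a fraction
    const breakagesPerMonth = pageViewsPerMonth * breakageShare; // 771,000
    const secondsPerMonth = 30 * 24 * 60 * 60;                   // 2,592,000
    console.log(secondsPerMonth / breakagesPerMonth);            // ~3.4

So even a seemingly trivial 0.0001% of page views works out to a broken page roughly every three and a half seconds, matching the quote.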
Hmm, I don't see the LOC listed here among the top sites: https://chromestatus.com/metrics/feature/timeline/popularity... - where are you seeing the Library of Congress as impacted?
This was mentioned in the discussions and is an easy search away. Which means that the Googlers, in their arrogance, didn't do any research at all, and that their counter underrepresents usage, as explicitly stated in their own document:
https://www.loc.gov/standards/mods/mods-conversions.html
https://www.loc.gov/preservation/digital/formats/fdd/fdd_xml...
And then there's Congress: https://simonwillison.net/2025/Aug/19/xslt/
>Chrome telemetry underreports a lot of use cases

Sure; in that case, I would suggest to the people with those use cases that they should stop switching off telemetry. Everyone on HN seems to forget telemetry isn't there for shits and giggles; it's there to help improve a product. If you refuse to help improve the product, don't expect a company to improve the product for you, for free.
Looking at the problem differently: say some change would make Hacker News unusable. The data would support that change, showing that it practically affects no one.
First, we are an insignificant portion of the web, and it's okay to admit that.
Second, if HN were built upon outdated web standards that practically nobody else uses, I'm sure Y Combinator could address the issue before the deadline (which would probably be at least a year or two out) to meet the needs of its community. Every plant needs nourishment to survive.
It's not OK for Google & co. to chip away at "insignificant" portions of the web until all that's left is big corporate-run platforms.
The people writing, and visiting, websites that rely on XSLT are the same users who disable or patch out telemetry.
A LOT of internal corpo websites use XSLT.