Comment by iamthepieman

5 months ago

>XML has always seemed to be a data standard which is intended to be what computers prefer, not people

Interesting take, but I'm always a little hesitant to accept any anthropomorphizing of computer systems.

Isn't it always about what we can reason and extrapolate about what the computer is doing? Obviously computers have no preferences, so it seems like you're really saying

"XML is a poor abstraction for what it's trying to accomplish" or something like that.

Before jQuery, Chrome, and Web 2.0, I was building XSLT-driven web pages that transformed XML from an early NoSQL doc store into HTML. It worked quite beautifully and allowed us to skip a lot of schema work that we definitely weren't ready or knowledgeable enough to do.

EDIT: It was the perfect abstraction and tool for that job. However, the application was very niche, and I've never found a person or team who did anything similar (and never had the opportunity to do anything similar myself again)
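For anyone who never saw this pattern in the wild, a minimal sketch of what "XSLT-driven pages" means (the element names like `articles`/`article` are invented for illustration, not from the original app): a stylesheet that turns a document of XML records straight into an HTML page, with no schema work up front.

```xml
<!-- sketch only: transform a flat XML document of records into HTML -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>

  <!-- root template builds the page shell -->
  <xsl:template match="/articles">
    <html>
      <body>
        <h1>Articles</h1>
        <ul>
          <!-- one list item per record -->
          <xsl:apply-templates select="article"/>
        </ul>
      </body>
    </html>
  </xsl:template>

  <!-- per-record template: attribute and child element pulled into markup -->
  <xsl:template match="article">
    <li>
      <a href="{@href}"><xsl:value-of select="title"/></a>
    </li>
  </xsl:template>
</xsl:stylesheet>
```

The appeal is visible even in a toy: presentation lives entirely in the stylesheet, so the stored XML never has to know anything about HTML.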

I did this for many years at a couple of different companies. As you said, it worked very well, especially at the time (early 2000s). It was a great way to separate application logic from presentation logic, especially for anything web-based. It seems like a trivial idea now, but at the time I loved it.

In fact the RSS reader I built still uses XSLT to transform the output to HTML as it’s just the easiest way to do so (and can now be done directly in the browser).
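The "directly in the browser" part works via the `xml-stylesheet` processing instruction: serve raw XML that points at a stylesheet, and the browser applies the transform client-side before rendering. A minimal sketch (the filename `feed.xsl` and the element names are placeholders, not from the commenter's actual reader; browser support is long-standing, though not guaranteed forever):

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="feed.xsl"?>
<articles>
  <article href="https://example.com/post-1">
    <title>First post</title>
  </article>
</articles>
```

When a browser fetches this document, it loads `feed.xsl`, runs the transform, and displays the resulting HTML instead of the raw XML tree.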

Re XSLT-based web applications: a team at my employer did the same circa 2004. It worked beautifully except for one issue: inefficiency. The QPS the app could serve was laughable because each page request went through the XSLT engine more than once. No amount of tuning could fix this design flaw, and the project was killed.

Names withheld to protect the guilty. :)

  • Almost every request goes through XSLT in our team's corporate app. The other app teams are jealous of our performance.