Comment by StopDisinfo910
3 months ago
> Calling XML human readable is a stretch.
That’s always been the main flaw of XML.
There are very few use cases where you wouldn’t be better served by an equivalent, more efficient binary format.
You will need a tool to debug XML anyway as soon as it gets a bit complex.
With this you get an efficient binary format plus the generality of XML:
https://en.m.wikipedia.org/wiki/Efficient_XML_Interchange
But somehow Google forgot to implement this.
A simple modern text editor (Vim, Kate) can sanity-check an XML file in real time. Why debug?
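(And for anyone without such an editor handy, the same well-formedness check is a few lines with Python's standard library. This is just an illustration; any XML library that reports error positions works the same way.)

    # check_xml.py - exit non-zero if the file is not well-formed XML
    import sys
    import xml.etree.ElementTree as ET

    path = sys.argv[1]
    try:
        ET.parse(path)  # raises ParseError at the first well-formedness error
    except ET.ParseError as err:
        line, col = err.position  # (line, column) of the offending spot
        print(f"{path}: not well-formed at line {line}, column {col}")
        sys.exit(1)
    print(f"{path}: well-formed")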
Because issues with XML are pretty much never caught by a sanity check. After all, XML is pretty much never written by hand but by tools, which will most likely produce valid XML.
Most of the time you will actually be debugging what’s inside the file to understand why it caused an issue, and to figure out whether the problem comes from the writing or the receiving side.
It’s pretty much like working with a binary format, honestly. XML basically has all the downsides of one with none of the upsides.
I mean, I found it pretty trivial to write parsers for my XML files, which are not simple ones, TBH. The simplest one contains a bit more than 1,700 lines.
It's also pretty easy to emit "I didn't find what I'm looking for under $ELEMENT" while parsing the file, or "I expected a string but got $SOMETHING at element $ELEMENT".
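Roughly what that kind of error reporting looks like, as a minimal sketch (Python's ElementTree here; the element names and input file are made up, and a real parser obviously does more):

    import xml.etree.ElementTree as ET

    def require_child(parent, tag):
        # "I didn't find what I'm looking for under $ELEMENT"
        child = parent.find(tag)
        if child is None:
            raise ValueError(f"didn't find <{tag}> under <{parent.tag}>")
        return child

    def require_text(elem):
        # "I expected a string but got $SOMETHING at element $ELEMENT"
        if elem.text is None or not elem.text.strip():
            raise ValueError(f"expected a string but got {elem.text!r} at element <{elem.tag}>")
        return elem.text.strip()

    root = ET.parse("settings.xml").getroot()   # hypothetical input file
    name = require_text(require_child(root, "name"))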
Maybe I'm biased because I've worked with XML files for more than a decade, but I've never spent more than 30 seconds debugging an XML parsing process.
Also, this was one of the first parts I "sealed" in said codebase and never touched again, because it worked even when the incoming file was badly formed (it errored out correctly and cleanly).