Comment by orbital-decay

3 years ago

This is too shortsighted by archival standards. Even Word itself doesn't offer full compatibility. VBA? Third-party ActiveX components? Integration with other Office software? It's a mess. HTML and other web formats are only readable by virtue of being constantly evolved while keeping backwards compatibility, which is nowhere near complete and is hardware-dependent (e.g. aspect ratios, colors, pixel densities). The standards will be pruned sooner or later, whether due to tech debt or to being sidestepped by something else. And I'm pretty sure there are plenty of obscure PDF features that will keep many documents from being readable in a mere half-century. I'm not even starting on code and binaries. And cloud storage is simply extremely volatile by nature.

Even 50 years (laughable for a clay tablet) is still pretty darn long in the tech world. We'll probably still see the entire computing landscape, including the underlying hardware, change fundamentally in 50 years.

Future-proofing anything is a completely different dimension. You have to provide an independent way to bootstrap, without relying on an unbroken chain of software standards, business/legal entities, and public demand for certain hardware platforms/architectures. This is unfeasible for the vast majority of knowledge/artifacts, so you also need a good mechanism to separate signal from noise and to transform volatile formats like JPEG or machine-executable code into more or less future-proof representations, or at least basic descriptions of what the notable thing did and what impact it had.

> Future-proofing anything is a completely different dimension. You have to provide an independent way to bootstrap, without relying on an unbroken chain of software standards, business/legal entities, and public demand for certain hardware platforms/architectures. This is unfeasible for the vast majority of knowledge/artifacts, so you also need a good mechanism to separate signal from noise and to transform volatile formats like JPEG or machine-executable code into more or less future-proof representations, or at least basic descriptions of what the notable thing did and what impact it had.

I'd argue the best way would be not to do that, but to make sure the format is ubiquitous enough that the knowledge is never lost in the first place.

  • That, and use formats which can be accessed and explained concisely, like "read the first X bytes into metadata field A, then read the image payload by interpreting every three bytes as an RGB triplet until EOF", so that the information can be transmitted orally, on the off chance that becomes necessary (a minimal sketch of such a decoder follows below).

    Hey I think I just described Windows 3.0-era PCX format :P
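
    For what it's worth, a format like that really can be decoded from its oral description alone. Here's a minimal sketch in Python, assuming a made-up header of two little-endian 16-bit integers (width, height) followed by raw RGB triplets; the layout and names are hypothetical, purely to make the description above concrete:

    ```python
    # Decoder for the hypothetical "orally transmittable" format described above:
    # a tiny fixed-size metadata header, then raw RGB triplets until EOF.
    # The header layout (width, height as little-endian 16-bit integers) is an
    # assumption for illustration, not any real file format.
    import struct

    def decode_simple_rgb(path):
        with open(path, "rb") as f:
            # "Read the first X bytes into metadata field A": here X = 4.
            width, height = struct.unpack("<HH", f.read(4))
            payload = f.read()
        # "Interpret every three bytes as an RGB triplet until EOF."
        pixels = [tuple(payload[i:i + 3]) for i in range(0, len(payload) - 2, 3)]
        return width, height, pixels
    ```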

> HTML and other web formats are only readable by virtue of being constantly evolved while keeping backwards compatibility, which is nowhere near complete and is hardware-dependent (e.g. aspect ratios, colors, pixel densities).

HTML itself is relatively safe, by virtue of being based on SGML. Though it's not ideal either, because those who think it's their job to evolve HTML don't bother to maintain SGML DTDs or use other long-established formal methods to keep HTML readable, but believe a hard-coded (and hence necessarily erroneous and incomplete) parsing description the size of a phone book is the right tool for the job.
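
To make "formal methods" concrete, this is roughly what DTD-driven validation looks like in practice. A sketch using Python's lxml, assuming the XHTML 1.0 Strict DTD and the entity files it references have been saved locally (the file names here are just placeholders):

```python
# Validate a document against a formal grammar (a DTD) instead of relying on
# a hard-coded parsing algorithm. Assumes lxml is installed and that
# xhtml1-strict.dtd (plus the .ent files it references) sits next to the script.
from lxml import etree

dtd = etree.DTD(open("xhtml1-strict.dtd", "rb"))  # the machine-readable grammar
tree = etree.parse("page.xhtml")                  # the document to check

if dtd.validate(tree):
    print("document conforms to the declared vocabulary")
else:
    print(dtd.error_log.filter_from_errors())
```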

Let me quote the late Yuri Rubinsky's foreword to The SGML Handbook (1990), outlining the purpose of markup languages:

> The next five years will see a revolution in computing. Users will no longer have to work at every computer task as if they had no need to share data with all their other computer tasks, they will not need to act as if the computer is simply a replacement for paper, nor will they have to appease computers or software programs that seem to be at war with one another.

However, exactly because evolving markup vocabularies requires organizing consensus, a task which W3C et al seemingly weren't up to (busy with XML, XHTML, WS-Star, and RDF-Star instead for over a decade), CSS and JS were invented and extended for the absurd purpose of basically redefining what's in the markup, which itself didn't need to change, with absolutely disastrous results for long-term readability, or even readability on browsers other than those of today's browser cartel.

  • > Though it's not ideal either, because those who think it's their job to evolve HTML don't bother to maintain SGML DTDs or use other long-established formal methods to keep HTML readable, but believe a hard-coded (and hence necessarily erroneous and incomplete) parsing description the size of a phone book is the right tool for the job.

    > a task which W3C et al seemingly weren't up to (busy with XML, XHTML

    You realise XML/XHTML is actually delightfully simple to parse compared to WHATWG HTML?
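
    To illustrate the difference: a well-formed XHTML document is handled by any stock XML parser in a couple of lines, while WHATWG HTML needs a dedicated spec-compliant parser to reproduce the error-recovery rules browsers apply. A quick sketch in Python (the snippet and element names are just examples):

    ```python
    # Any generic XML parser can read well-formed XHTML; no HTML-specific
    # error-recovery machinery is required.
    import xml.etree.ElementTree as ET

    xhtml = '<html xmlns="http://www.w3.org/1999/xhtml"><body><p>hello</p></body></html>'
    root = ET.fromstring(xhtml)
    print(root.find(".//{http://www.w3.org/1999/xhtml}p").text)  # -> hello

    # Tag-soup input like "<p>hello" is only handled consistently by a full
    # WHATWG-compliant parser (e.g. the html5lib package), which implements
    # the spec's recovery rules rather than a simple grammar.
    ```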