Comment by amelius

5 days ago

I hope it doesn't get stuck at 3.14, like TeX.

https://www.reddit.com/r/RedditDayOf/comments/7we430/donald_...

You hope it doesn't?

> [Donald Knuth] firmly believes that having an unchanged system that will produce the same output now and in the future is more important than introducing new features

This is such a breath of fresh air in a world where everything is considered obsolete after like 3 years. Our industry has a disease, an insatiable hunger for newness over completeness or correctness.

There's no reason we can't be writing code that lasts 100 years. Code is just math. Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago? LOL dude get with the times, everyone uses Equately for their equations now. It was made by 3 interns at Facebook, so it's pretty much the new hotness." No, I don't think I will use "Equately", I think I'll stick to the tried-and-true idea that has been around for 3000 years.

Forget new versions of everything all the time. The people who can write code that doesn't need to change might be the only people who are really contributing to this industry.

  • > There's no reason we can't be writing code that lasts 100 years. Code is just math.

    In theory, yes. In practice, no, because code is not just math, it's math written in a language with an implementation designed to target specific computing hardware, and computing hardware keeps changing. You could have the complete source code of software written 70 years ago, and at best you would need to write new code to emulate the hardware, and at worst you're SOL.

    Software will only stop rotting when hardware stops changing, forever. Programs that refuse to update to take advantage of new hardware are killed by programs that do.

    • This is a total red herring, x86 has over 30 years of backwards compatibility and the same goes for the basic peripherals.

      The real reason for software churn isn't hardware churn, but hardware expansion. It's well known that software expands to use all available hardware resources (or even more, according to Wirth's law).

    • The bare minimum cost of software churn is the effort of one human being, which is far less than hardware churn (multiple layers of costly design and manufacturing). As a result, we see hardware change gradually over the years, while software projects can arbitrarily deprecate, change, or remove anything at a whim. The dizzying number of JS frameworks, the replacement of X with Wayland or init with systemd, removal of python stdlib modules, etc. etc. have nothing to do with new additions to the x86 instruction set.
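
      For a concrete Python example of that kind of churn, here's a sketch of the shim people end up writing when a stdlib module disappears (assuming the third-party `packaging` package as the stand-in):

          # Worked unchanged for two decades, then distutils was removed
          # from the stdlib in Python 3.12 (PEP 632) -- no new x86
          # instructions required.
          try:
              from distutils.version import LooseVersion as Version  # stdlib until 3.11
          except ModuleNotFoundError:
              from packaging.version import Version  # third-party replacement

          assert Version("3.12") > Version("3.2")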

    • > and computing hardware keeps changing.

      Only if you can't reasonably buy a direct replacement. That might have been a bigger problem in the early days of computing, when people spread themselves across many vendors, leaving a lot of business failures and thus defunct hardware. Nowadays we usually settle on common architectures that are very likely to still be around in the distant future, because mass adoption gives someone a strong incentive to keep producing them.

    • TeX is written in a literate programming style, which is more akin to a math textbook than to ordinary computer code, except with code blocks instead of equations. The actual programming language in the code blocks and the OS it runs on matter a lot less than in usual code, where at best you get a few sparse comments. Avoiding bit rot in such a program is a very manageable task. In fact, iirc the code blocks which end up getting compiled and executed for TeX were ported from Pascal to C at some point without introducing any new bugs.

    • This is correct when it comes to bare metal execution.

      You can always run code from any time with emulation, which gives the “math” the inputs it was made to handle.

      Here’s a site with a ton of emulators that run in browser. You can accurately emulate some truly ancient stuff.

      https://www.pcjs.org/

  • Are you by chance a Common Lisp developer? If not, you may like it (well, judging only by your praise of stability).

    Completely sidestepping any debate about the language design, ease of use, quality of the standard library, size of community, etc... one of its strengths these days is that standard code basically remains functional "indefinitely", since the standard is effectively frozen. Of course, this requires implementation support, but there are lots of actively maintained and even newer options popping up.

    And because extensibility is baked into the standard, the language (or its usage) can "evolve" through libraries in a backwards compatible way, at least a little more so than many other languages (e.g. syntax and object system extension; notable example: Coalton).

    Of course there are caveats (like true, performant async programming), and it seems to be a fairly polarizing language in both directions: "best thing since sliced bread!" and "how massively overrated and annoying to use!". But it seems to fit your description decently, at least among the software I use or know of.

    • I respect and understand the appeal of LISP. It is a great example of code not having to change all the time. I personally haven't had a compelling reason to use it (post college), but I'm glad I learned it and I wouldn't be averse to taking a job that required it.

      While writing "timeless" code is certainly an ideal of mine, it also competes with the ideals of writing useful code that does useful things for my employer or the goals of my hobby project, and I'm not sure "getting actual useful things done" is necessarily LISP's strong suit, although I'm sure I'm ruffling feathers by saying so. I like more modern programming languages for other reasons, but their propensity to make backward-incompatible changes is definitely a point of frustration for me. Languages improving in backward-compatible ways is generally a good thing; your code can still be relatively "timeless" in such an environment. Some languages walk this line better than others.

  • Stability is for sure a very seductive trait. I can also totally understand the fatigue of chasing the next new thing that's already almost obsolete.

    >There's no reason we can't be writing code that lasts 100 years.

    There are many reasons this is most likely not going to happen. Code, despite our best efforts at separation of concerns, is a highly contextual piece of work. Even a simple program with no external libraries has a full compiler/interpreter ecosystem as a huge dependency, and the hardware platforms that ecosystem abstracts over are themselves moving targets. Change is the only constant, as we say.

    >Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago?

    Well, that might surprise you, but no, they weren't. At least, they were not dealt with as they are taught and understood today in their most common contemporary presentation. When the Babylonians (c. 2000 BCE) solved quadratic equations, they had nothing near Descartes' algebraic notation connected to geometry, and there is a long series of evolutions in between, continuing to this day.

    Mathematicians actually do make a lot of fancy innovative things all the time. Some fundamentals stay stable over millennia, yes. But some problems also stay unsolved for millennia, until someone makes an outrageous move outside the standard.

    • Don't know about 100 years, but old static web pages from the late 90s with JS still work on the Wayback Machine. There might be something to static HTML/CSS for archiving content, maybe even little programs.

  • To be fair, if math did have version numbers, we could abandon a lot of hideous notational cruft / symbol overloading, and use tau instead of pi. Math notation is arguably considerably worse than perl -- can you imagine if perl practically required a convention of single-letter variable names everywhere? What modern language designer would make it so placing two variable names right next to each other denotes multiplication? Sheer insanity.

    Consider how vastly more accessible programming has become from 1950 until the present. Imagine if math had undergone a similar transition.

    • Math personally "clicked" for me when I started to use Python and R for mathematical operations instead of the conventional arcane notation. It did make me wonder why we insist on forcing kids and young adults to struggle through particularly counter-intuitive ways of expressing mathematical concepts just because of historical baggage, and I'm glad to hear I'm not the only one who thinks this way.

    • What in the Hacker News is this comment?

      Mathematical notation evolved to its modern state over centuries. It's optimized heavily for its purpose. Version numbers? You're being facetious, right?

    • If the compiler forbade syntactic ambiguity from implicit multiplication and had a sensible LSP allowing it to be rendered nicely, I don't think that'd be such a bad thing. Depending on the task at hand you might prefer composition or some other operation, but when reducing character count allows the pattern recognition part of our brain to see the actual structure at hand instead of wading through character soup it makes understanding code much easier.
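
      Funnily enough, you can try exactly that today outside a compiler. A sketch assuming SymPy is installed (its parser ships an implicit-multiplication transformation):

          from sympy.parsing.sympy_parser import (
              parse_expr,
              standard_transformations,
              implicit_multiplication_application,
          )

          # Resolve "3x y" unambiguously into an expression tree.
          transformations = standard_transformations + (implicit_multiplication_application,)
          expr = parse_expr("3x y + 2x**2", transformations=transformations)
          print(expr)  # 2*x**2 + 3*x*y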

  • > There's no reason we can't be writing code that lasts 100 years. Code is just math. Imagine having this attitude with math: "LOL loser you still use polynomials!? Weren't those invented like thousands of years ago? LOL dude get with the times, everyone uses Equately for their equations now. It was made by 3 interns at Facebook, so it's pretty much the new hotness." No, I don't think I will use "Equately", I think I'll stick to the tried-and-true idea that has been around for 3000 years.

    Not sure this is the best example. Mathematical notation evolved a lot in the last thousand years. We're not using roman numerals anymore, and the invention of 0 or of the equal sign were incredible new features.

    • > Mathematical notation evolved a lot in the last thousand years

      That is not counter to what I'm saying.

          Mathematical notation <=> Programming Languages.
          Proofs <=> Code.

      When mathematical notation evolves, old proofs do not become obsolete! There is no analogy to a "breaking change" in math. The closest we came to this was Godel's Incompleteness Theorem and the Cambrian Explosion of new sets of axioms, but with a lot of work most of math was "re-founded" on a set of commonly accepted axioms. We can see how hostile the mathematical community is to "breaking changes" by seeing the level of crisis the Incompleteness Theorem caused.

      You are certainly free to use a different set of axioms than ZF(C), but you need to be very careful about which proofs you rely on; just as you are free to use a very different programming language or programming paradigm, but you may be limited in the libraries available to you. But if you wake up one morning and your code no longer compiles, that is the analogy to one day mathematicians waking up and realizing that a previously correct proof is now suddenly incorrect -- not that it was always wrong, but that changes in math forced it into incorrectness. It's rather unthinkable.

      Of course programming languages should improve, diversify, and change over time as we learn more. Backward-compatible changes do not violate my principle at all. However, when we are faced with a possible breaking change to a programming language, we should think very hard about whether we're changing the original intent and paradigms of the programming language and whether we're better off basically making a new spinoff language or something similar. I understand why it's annoying that Python 2.7 is around, but I also understand why it'd be so much more annoying if it weren't.

      Surely our industry could improve dramatically in this area if it cared to. Can we write a family of nested programming languages where core features are guaranteed not to change in breaking ways, and you take on progressively more risk as you use features more to the "outside" of the language? Can we get better at formalizing which language features we're relying on? Better at isolating and versioning our language changes? Better at time-hardening our code? I promise you there's a ton of fruitful work in this area, and my claim is that that would be very good for the long-term health and maturation of our discipline.

  • > There's no reason we can't be writing code that lasts 100 years. Code is just math

    Math is continually updated, clarified and rewritten. 100 years ago was before the Bourbaki group.

    • > Math is continually updated, clarified and rewritten

      And yet math proofs from decades and centuries ago are still correct. Note that I said we write "code that lasts", not "programming languages that never change". Math notation is to programming languages as proofs are to code. I am not saying programming languages should never change or improve. I am saying that our entire industry would benefit if we stopped to think about how to write code that remains "correct" (compiling, running, correct behavior) for the next 100 years. Programming languages are free to change in backward-compatible ways, as long as once-correct code is always-correct. And it doesn't have to be all code, but you know what they say: there is nothing as permanent as a temporary solution.

  • > an insatiable hunger for newness over completeness or correctness.

    I understand some of your frustration, but often the newness is in response to a need for completeness or correctness. "As we've explored how to use the system, we've found some parts were missing/bad and would be better with [new thing]". That's certainly what's happening with Python.

    It's like the Incompleteness Theorem, but applied to software systems.

    It takes a strong will to say "no, the system is Done, warts and missing pieces and all. Deal With It". Everyone who's had to deal with TeX at any serious level can point to the downsides of that.

  • If you look at old math treatises from important historical figures, you'll notice that they use very different notation from the one you're used to. Commonly the concepts are also different, because the ones we use were derived over centuries from material produced without them, in a context where it was traditional to use other concepts to suss out conclusions.

    But you have a point, and it's not just "our industry", it's society at large that has abandoned the old in favour of incessant forgetfulness and a distaste for tradition and history. I'm by no means a nostalgic, but I still mourn the harsh disjoint between contemporary human discourse and the historical. Some nerds still read Homer and Cicero and Goethe and Ovid and so on, but if you use a trope from any of those that would have been easily recognisable as such by Europeans for much of the last millennium, you can be quite sure that it won't generally be recognised today.

    This also means that a lot of early and mid-modern literature is partially unavailable to contemporary readers, because it was traditional to implicitly use much older motifs and riff on them when writing novels and making arguments, and unless you're aware of that older material you'll miss out on it. For, e.g., Don Quixote, most would need an annotated version which points out and makes explicit all the references and riffing, basically destroying the jokes by explaining them upfront.

  • Worth noting that few people use the TeX executable as specified by Knuth. Even putting aside the shift to pdf instead of dvi output, LaTeX requires an extended TeX executable with features not part of the Knuth specification from 1988.

    Btw, while equations and polynomials are conceptually old, our contemporary notation is much younger, dating to the 16th century, and many aspects of mathematical notation are younger still.

  • This philosophy may have its place in some communities, but Python is definitely not one of them.

    Even C/C++ introduces breaking changes from time to time (after decades of deprecation though).

    There’s no practical reason why Python should commit to 100+ years of code stability, as all that comes at a price.

    Having said that, Python 2 -> 3 is a textbook example of how not to do these things.

    • Python is pretty much on the other extreme: 3.x → 3.y should be expected to break things, there's no "compatibility mode" to avoid the breakage, and the reasons for the breakage can be purely aesthetic bikeshedding.

      C, in contrast, generally versions breaking changes in the standard, and you can keep targeting an older standard on a newer compiler if you need to. Many do.
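
      To make the contrast concrete (a minimal sketch; the C side is just the usual compiler flag):

          # C: pin an old standard on a new compiler, e.g.
          #     gcc -std=c99 -pedantic old_code.c
          # Python's __future__ only points the other way: you can opt in to
          # new semantics early, but there is no flag asking a new
          # interpreter for the old semantics.
          from __future__ import annotations  # PEP 563: postponed evaluation of annotations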

  • While I think LaTeX is fantastic, I think there is plenty of low-hanging fruit to improve upon: the ergonomics of the language and its macros aren't great. If nothing else, there should be better investment in tooling and the ecosystem.

  • Mathematical notation has changed over the years. Is Diophantus' original system of polynomials that legible to modern mathematicians? (Even if you ignore the part where it's literally written in Ancient Greek.)

  • I agree somewhat with your sentiment and have some nostalgia for a time when software could be finished, but the comment you're replying to was making a joke that I think you may have missed.

  • > There's no reason we can't be writing code that lasts 100 years. Code is just math.

    The weather forecast is also “just math”, yet yesterday’s won’t be terribly useful next April.

    • No, weather forecasting models are "just math". The forecast itself is an output of the model. I sure hope our weather forecasting models are still useful next year!

          weather forecasting models <=> code <=> math
          weather forecast <=> program output <=> calculation results

      So all you're saying is that we should not expect individual weather forecasts, program output, and calculation results to be useful long-term. Nobody is arguing that.

  • My C++ from 2005 still compiles! (I used boost 1.32)

    Most of my python from that era also works (python 3.1)

    The problem is not really the language syntax, but how libraries change a lot.

  • Kinda related question, but is code really just math? Is it possible to express things like user input, timings, interrupts, error handling, etc. as math?

    • I would slightly disagree that code is just math when you really boil it down. However, if you take a simple task, say, printing "hello world" to the output, you can actually break it down into a mathematical process: you can say that at time T the value of output O will be the value at index N of input X, so over a period of time you eventually get "hello world" as the final output.

      Howeveeerrr... it's not quite math when you get down to the electronics level, unless you go really wild (wild meaning physics math). Take the breakdown from Python to assembly to the binary that flips the transistors doing the thing. You can mathematically define that each transistor will be Y when the value of O is X(N) (sorry, I can't think of a better way to define such a thing from mobile), and you can go further by defining the voltages to be applied, when and where, all mathematically.

      In reality it's done in sections. At the electronic level, math defines your frequency, voltage levels, timing, etc.; at the assembly level, math defines what comparisons of values are made, what address to shift a value to, and how to determine your output; lastly, your interpreter determines what assembly to use based on the operations you give it. For example, an "if A == B then C" statement in code is actually a binary comparator that checks whether the value at address A is the same as the value at address B.

      You can get through the whole stack with math, but much of it has been abstracted away into easy building blocks that don't require solving a huge equation in order to actually display something.

      You can even find mathematical data in datasheets for electronic components. They say (for example) that over period T you can't exceed V volts or W watts, or that to trigger a high value you need voltage V for period T without exceeding current I. You could define all of your components and operations as equations, but I don't think that's really done as a practice anymore; the complexity of doing so (as someone not building a CPU or any IC) isn't useful unless you're working on a physics paper or quantum computing, etc.
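
      To the original question about user input: the standard trick is to model an interactive program as a pure transition function, so "input over time" becomes ordinary data the math can quantify over. A toy sketch in Python (names are illustrative, not a real API):

          # A Mealy machine: (state, input event) -> (new state, output).
          def step(state: int, event: str) -> tuple[int, str]:
              if event == "inc":
                  return state + 1, f"count={state + 1}"
              return state, f"count={state}"

          # "User input over time" is now just a sequence to fold over;
          # timings or interrupts can be modeled the same way, as events.
          state, outputs = 0, []
          for event in ["inc", "inc", "noop"]:
              state, out = step(state, event)
              outputs.append(out)
          print(outputs)  # ['count=1', 'count=2', 'count=2']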

    • Isn’t it possible to express anything as math? With sufficient effort that is.

  • > This is such a breath of fresh air in a world where everything is considered obsolete after like 3 years.

    I dunno man, there's an equal amount of bullshit that still exists only because that's how it was before we were born.

    > Code is just math.

    What?? No. If it was there'd never be any bugs.

    • > > Code is just math.

      > What?? No. If it was there'd never be any bugs.

      Are you claiming there is no incorrect math out there? Go offer to grade some high-school algebra tests if you'd like to see buggy math. Or Google for amateur proofs of the Collatz Conjecture. Math is just extremely high (if not all the way) on the side of "if it compiles, it is correct", with the caveat that compilation can only happen in the brains of other mathematicians.

  • Except uh, nobody uses infinitesimals for derivatives anymore, they all use limits now. There's still some cruft left over from the infinitesimal era, like this dx and dy business, but that's just a backwards compatibility layer.

    Anyhoo, remarks like this are why the real ones use Typst now. TeX and family are stagnant, difficult to use, difficult to integrate into modern workflows, and not written in Rust.

    • > the real ones use Typst now

      Are you intentionally leaning into the exact caricature I'm referring to? "Real programmers only use Typstly, because it's the newest!". The website title for Typst when I Googled it literally says "The new foundation for documents". Its entire appeal is that it's new? Thank you for giving me such a perfect example of the symptom I'm talking about.

      > TeX and family are stagnant, difficult to use, difficult to integrate into modern workflows, and not written in Rust.

      You've listed two real issues (difficult to use, difficult to integrate), and two rooted firmly in recency bias (stagnant, not written in Rust). If you can find a typesetting library that is demonstrably better in the ways you care about, great! That is not an argument that TeX itself should change. Healthy competition is great! Addiction to change and newness is not.

      > nobody uses infinitesimals for derivatives anymore, they all use limits now

      My point is not that math never changes -- it should, and does. However, math does not simply rot over time, like code seems to (or at least we simply assume it does). Math does not age out. If a math technique becomes obsolete, it's only ever because it was replaced with something better. More often, it forks into multiple different techniques that are useful for different purposes. This is all wonderful, and we can celebrate when this happens in software engineering too.

      I also think your example is a bit more about math pedagogy than research -- infinitesimals are absolutely used all the time in math research (see Nonstandard Analysis), but it's true that Calculus 1 courses have moved toward placing limits as the central idea.

    • Even if Typst was going to replace TeX everywhere right now, about half a century would still be a respectable lifespan for a software project.

    • > nobody uses infinitesimals for derivatives anymore

      All auto-differentiation libraries today are built off of infinitesimals via Dual numbers. Literally state of the art.
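
      In case that sounds exotic: forward-mode autodiff via dual numbers fits in a few lines. A toy sketch (my own illustration, not any particular library's implementation):

          # Dual numbers: a + b*eps with eps**2 == 0, so the eps part
          # carries the derivative alongside the value.
          class Dual:
              def __init__(self, val, eps=0.0):
                  self.val, self.eps = val, eps

              def __add__(self, other):
                  other = other if isinstance(other, Dual) else Dual(other)
                  return Dual(self.val + other.val, self.eps + other.eps)
              __radd__ = __add__

              def __mul__(self, other):
                  other = other if isinstance(other, Dual) else Dual(other)
                  # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
                  return Dual(self.val * other.val,
                              self.val * other.eps + self.eps * other.val)
              __rmul__ = __mul__

          def derivative(f, x):
              return f(Dual(x, 1.0)).eps  # seed eps=1, read off f'(x)

          # d/dx (x**2 + 3x) at x = 2 is 2*2 + 3 = 7
          print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0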