
Comment by pjmlp

6 hours ago

Unfortunately it also means that when the programmer fails to understand what undefined behaviour is present in their code, the compiler is free to take advantage of it to perform aggressive optimizations, as a means to win compiler benchmarks.

The change in generated code might come from something as innocent as a bug fix to the compiler.

Ah yes, the good old "compiler writers only care about benchmarks and are out to hurt everyone else" nonsense.

I for one am glad that compilers can assume that things that can't happen according to the language do in fact not happen and don't bloat my programs with code to handle them.
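
As a concrete illustration (a minimal sketch, not from any real codebase): once a pointer has been dereferenced, the language says it cannot be null, so the compiler may delete a later null check as dead code:

    int first_element(int *p) {
        int v = *p;        /* UB if p is NULL, so the compiler may assume p != NULL */
        if (p == NULL)     /* provably dead under that assumption; may be removed */
            return -1;
        return v;
    }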

  • Moral hazard here. The rest of us, and all of society, now rest on a huge pile of code written by incorrigible misers who imagined themselves able to write perfect, bug-free code that would go infinitely fast because bad things never happen. But see, there are bugs in your code, and other people pay the cost.

  • > I for one am glad that compilers can assume that things that can't happen according to the language do in fact not happen and don't bloat my programs with code to handle them.

    Yes, unthinkable happenstances like addition on fixed-width integers overflowing! According to the language, signed integers can't overflow, so code like the following:

        int new_offset = current_offset + 16;
        if (new_offset < current_offset)
            return -1; // Addition overflowed, something's wrong
    

    can be optimized to the much leaner

        int new_offset = current_offset + 16;
    

    Well, I sure am glad the compiler helpfully reduced the bloat in my program!
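
    For what it's worth, the check can be written without relying on UB, e.g. by testing against INT_MAX before adding (GCC and Clang also offer __builtin_add_overflow for this). A minimal sketch, with an illustrative function name:

        #include <limits.h>

        int advance_offset(int current_offset) {
            /* Portable: test before adding, so the signed
               addition itself can never overflow. */
            if (current_offset > INT_MAX - 16)
                return -1; /* would overflow, something's wrong */
            return current_offset + 16;
        }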