Comment by uecker
18 hours ago
I do not mean to imply bad intentions, but I do see the arguments brought forward in WG14. It has gotten better in recent years, but we had to push back against some rather absurd interpretations of the standard, e.g. that unspecified values can change after they are selected. Your example shows something else: the standard is simply not treated as very important. The standard is perfectly clear about how pointer comparison works, and yet this alone is not reason enough to invest resources into fixing it if it is not shown to cause actual problems in real code.
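(For concreteness, the pointer-comparison point is presumably about cases like the classic one-past-the-end equality; a minimal sketch of my own, not necessarily the exact case under discussion:)

```c
#include <stdio.h>

int main(void) {
    int a[1], b[1];
    int *p = a + 1;   /* one past the end of a */
    int *q = b;
    /* C11 6.5.9p6: p == q is true iff b happens to immediately follow a
       in the address space.  Some compilers have folded this comparison
       to false even when printing the pointers shows equal addresses. */
    printf("%p %p equal? %d\n", (void *)p, (void *)q, p == q);
    return 0;
}
```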
> [... absurd interpretations] unspecified values can change after they are selected
It seems hard to avoid this without imposing a freeze semantic, which would be expensive on today's systems. Maybe I don't understand what you mean? As I understand it, Rust considered insisting on freeze semantics and was brought up short by the cost.
I do not see how this adds a substantial cost, and it is required for C programs to work correctly. The C standard carefully describes the exact situations where unspecified values are chosen, so the idea that the compiler is then free to break this is clearly in contradiction with the wording. Clang got this wrong and has, I assume, mostly fixed it, because non-frozen values caused a lot of inconsistency and other bugs.
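For concreteness, a minimal sketch of the kind of program I mean (just an illustration: it reads unwritten malloc'd storage as unsigned char, which yields an unspecified value rather than undefined behaviour):

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    unsigned char *p = malloc(1);
    if (!p) return 1;
    /* *p was never written, so it holds an unspecified value; unsigned
       char has no trap representations, so the reads below are valid. */
    unsigned char a = *p;
    unsigned char b = *p;
    /* Under the interpretation objected to above, a and b may differ,
       because each load of the "same" unspecified value may produce a
       different result.  Under a frozen reading, this must print 1. */
    printf("a == b: %d\n", a == b);
    free(p);
    return 0;
}
```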
Like I said, maybe I'm not understanding which "unspecified values" we're talking about. The freeze semantic is a problem when all we've said is that we don't know what value is present (typically one or more mapped but unwritten bytes): since we never wrote to that RAM, the underlying machine feels free to change what is there. Which means saying "no" isn't something the compiler alone can do; the OS and the machine (if virtual) might be changing it anyway. If you know of Facebook's magic zero string terminator bug, that's the sort of weird symptom you get.
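As a rough Linux-specific sketch of what I mean (this assumes MADV_FREE is available; it's my own illustration, not necessarily the exact mechanism behind the Facebook bug):

```c
#define _DEFAULT_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    size_t len = 4096;
    char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) return 1;

    strcpy(p, "hello");          /* the page now has real contents */
    madvise(p, len, MADV_FREE);  /* the kernel may drop the page lazily */

    /* Until the page is written again, the kernel is free to reclaim it
       under memory pressure, after which reads return zero bytes instead
       of "hello".  Two reads of p[0] at different times can disagree,
       with no intervening store by the program. */
    printf("first byte now: %d\n", p[0]);
    munmap(p, len);
    return 0;
}
```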
But maybe you're talking about something else entirely?