Comment by ajross
3 months ago
> What is the downside of switching to the newest standard when it's properly supported?
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
Well, shouldn't not-yet-updated code use the corresponding compiler flag, rather than having someone starting a greenfield project default to an old standard and possibly write outdated code?
No? The "corresponding compiler flag" is a new feature. I mean, who told folks at Bell Labs in 1978 how the GCC -std= arguments would work in the coming decades? Legacy code is legacy; it doesn't know it needs to use the correct flags. When it was a greenfield project, it was the default!
Like, think about it: if you think the defaults should be good for greenfield projects, then greenfield projects won't be using the correct flags (because if they are, then the whole argument is specious anyway). And when C++34 shows up, they're going to be broken and we'll have this argument again.
Compatibility is hard. But IMHO C++ and gcc are doing this wrong and C is doing it much better.
GCC's C++ default has already changed (from gnu++98 to gnu++14 in GCC 6, and again to gnu++17 in GCC 11). It did not cause any significant problems, and any software relying on the current default was created long after the flags to pick a standard version were added.
> C++ makes breaking changes all the time,
Please don't spread misinformation. Breaking changes are actually almost nonexistent in C++. The last one was the COW std::string and std::list ABI break roughly 15 years ago, with the major switch from C++03 to C++11. And even then GCC wouldn't let your code break, because libstdc++ ships a dual ABI: you could mix C++03 and C++11 code and link them together.
So C++ actually tries really hard _not_ to break your code; that is the philosophy of a language adhering to backwards compatibility. Something many, such as Google, opposed, leaving the committee/language for that reason. I thank the C++ language for that.
Introducing new features or new keywords or making stricter implementation of existing ones, such as narrowing integral conversions, is not a breaking change.
> Introducing [...] new keywords [...] is not a breaking change.
This is some kind of semantic prestidigitation around a definition for "breaking" that I'm not following. Yes, obviously it is. New keywords were valid symbol names before they were keywords.
Makes me wonder if the "don't spread misinformation" quip was made in good faith.
It was. And no, a breaking change is not what you seem to imply. When people talk about breaking changes, introducing new keywords is not what they usually mean. It's irrelevant.