
Comment by jandrewrogers

3 days ago

The stream of modern C++ features has been a god-send for anyone who cares about high-performance, high-reliability software. Maybe that doesn’t apply to your use case, but C++ is widely used in critical data infrastructure. For anyone who does care about things like performance and reliability, the changes to modern C++ have largely been obvious and immediately useful improvements. Almost all C++ projects I know in the high-performance data infrastructure space live as close to the bleeding edge of new C++ features as the compiler implementations make feasible.

And no, reflection hasn’t “been solved for years” unless you have a very misleading definition of “solved”. A lot of the C++ code I work with is heavily codegen-ed via metaprogramming. Despite the relative expressiveness and flexibility of C++ metaprogramming, proper reflection will dramatically improve what is practical in a strict and type-safe way at compile-time.
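
To make that concrete, here is a rough sketch of what the C++26 reflection design (P2996) plus expansion statements looks like; the exact std::meta spellings are still settling across compiler implementations, so treat it as illustrative rather than definitive:

    #include <meta>       // C++26 reflection header (P2996)
    #include <iostream>

    struct Packet {
        int    id;
        double payload;
    };

    // Dump every non-static data member by name: the kind of codegen
    // that previously required macros or external tooling.
    template <typename T>
    void dump(const T& value) {
        template for (constexpr auto m :                 // expansion statement
                      std::meta::nonstatic_data_members_of(
                          ^^T, std::meta::access_context::unchecked())) {
            std::cout << std::meta::identifier_of(m) << " = "
                      << value.[:m:] << '\n';   // splice the member back in
        }
    }

    int main() {
        dump(Packet{42, 3.14});   // prints: id = 42, payload = 3.14
    }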

You sound like you're wearing rose-tinted glasses. I think your glass is half full if you recheck actual versions and features; mine is half empty in gamedev.

Anecdata: a year or so ago I was in discussions about whether the beta C++20 features on our platforms were good enough to use at scale. Multi-platform support makes it not a sum but an intersection of partial implementations. Anyway, it looked positive until we needed a pilot project to try it. One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'. After confirming that it was indeed not an error on our side, the conclusion was kind of obvious: a proportional increase in remote compilation cloud costs for a few minor features is a 'no'. A year later, the beta support is no longer beta but is still partial across platforms, and there have been no build-time improvements from the community. YMMV of course, because gamedev mostly targets closed-source platforms with a closed set of build tools.

  • > One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'.

    I think this just proves that your team is highly inexperienced with C++ projects, which you implicitly attest to by admitting this was the first C++ upgrade you have gone through.

    Let me be very clear: there is never an upgrade of the C++ version targeted by a project that does not require a full regression-test pass and a few bugs to squash. Why? Because even if the C++ side of things is perfectly fine, libraries often introduce all sorts of unexpected issues.

    For example, I once had to migrate a legacy project to C++14, and flipping the compiler flag to c++14 caused a wall of compiler errors. It turned out the C++ code was perfectly fine, but a single library behaved very poorly with a constexpr constructor it enabled conditionally under C++14.
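
    The pattern looks roughly like this (a hypothetical reconstruction of the kind of conditional toggle described, not the actual library):

        // Library header: the constructor's signature silently changes
        // depending on the language standard it is compiled under.
        #if __cplusplus >= 201402L
        #  define LIB_CONSTEXPR constexpr   // enabled by -std=c++14 and later
        #else
        #  define LIB_CONSTEXPR
        #endif

        struct Handle {
            LIB_CONSTEXPR Handle(int fd) : fd_(fd) {}
            int fd_;
        };

        // Flipping the flag swaps in a different constructor, so client code
        // that conflicted with the newly-constexpr path failed to compile even
        // though the client's own C++ was valid.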

    You should understand that upgrades to the core language and standard library are exceptionally stable, and a clear focus of the standardization committee. But they only have a say in how the core language and standard libs should be. The bulk of the code any relatively complex project consumes is not core language + stdlib but third-party libraries and frameworks. These are often riddled with flags that toggle whole components only in specific versions of the C++ language, mainly for backwards compatibility. Once you target a new version of C++, that often means you replace whole components of upstream dependencies, which in turn often requires fixing your code. This happens very frequently, even with the likes of Boost.

    So, what you're complaining about is not C++ but your inexperience in software engineering in general. I mean, what is the rule of thumb about major version upgrades?

    • The reported build-time increases can be observed by just compiling an empty file with a standard library header included. Your speculation about how the GP is bad at engineering is uncalled for.
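
      It is easy to reproduce with a near-empty translation unit (illustrative; not the GP's exact setup):

          // bench.cpp: isolates per-header cost from any project code
          #include <vector>

          int main() { return 0; }

      Timing 'g++ -c -std=c++17 bench.cpp' against '-std=c++20' isolates how much more each standard header pulls in under the newer mode, before a single line of application code changes.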

    • I am sorry for the confusion. It's fine to have some downvotes if it's not what people like to see. I was not complaining. The message was purely informational, from a single point of view: a) game platforms have only partial C++20 support in 2025; b) there are features in the C++ standard that do not fit the description 'god-send'.

  • > Proportional increase of remote compilation cloud costs for few minor features is a 'no'.

    How high are those compilation costs compared to the developer time that might be saved by even minor features?

    • Tbh I don't have exact numbers from 2024 at hand. I remember that the decision was unanimous. A build-time increase is a very sensitive topic for us in gamedev.

  • > One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'.

    Given that C++20 introduced modules, which are intended to make builds faster, I think just flipping the C++20 switch with no changes and checking build times should not be the end of evaluating whether C++20 is worth it for your setup.

    • > Given that C++20 introduced modules, which are intended to make builds faster

      Turning on modules effectively requires that all of your project dependencies themselves have turned on modules. Fail to do so, and a lot of the benefits start to become hindrances (Clang is currently debating going to 64-bit source locations because modularizing in this manner tends to exhaust the current 32-bit source locations).
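
      For reference, the moving parts are small even though the build orchestration is not (a minimal sketch; build flags and BMI handling vary by compiler):

          // math.cppm — a named module interface unit (C++20)
          export module math;

          export int square(int x) { return x * x; }

          // main.cpp — importers consume a prebuilt module artifact
          // instead of re-parsing headers textually.
          import math;

          int main() { return square(4); }

      The catch is exactly the one above: every import needs a prebuilt module artifact for that dependency, so a single non-modular library in the graph drags textual inclusion back in.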

I still have to learn C++20 concepts, and now we have a full-fledged reflection system?

Good, but I think what happens is there are people on the bleeding edge of C++, usually writing libraries that ship with new code. Each new feature is a godsend for them -- it's the reason why the features are proposed in the first place. It allows you to write libraries more simply, more generally, more safely, and more efficiently.

The rest of us are dealing with old code that is a hodgepodge of older standards and toolchains, that has to run in multiple environments, mostly old ones. It's like yeah, this C++26 feature will come in handy for me someday, but if that day comes then it will be in 2036, and I might not be writing C++ by then.

  • > The rest of us are dealing with old code that is a hodgepodge of older standards and toolchains, that has to run in multiple environments, mostly old ones. It's like yeah, this C++26 feature will come in handy for me someday, but if that day comes then it will be in 2036, and I might not be writing C++ by then.

    Things seem to be catching up. I had the same view up until recently, but now I'm able to use most of the C++23 features on an embedded platform (granted, some are still missing, since we're limited to GCC 11.2).

[flagged]

  • You sound like you subscribe to "Orthodox C++".

    Speaking seriously, I agree there's definitely a lot of bloat in the new C++ standards. E.g. I'm not a fan of the C++26 linalg stuff. But most performance-focused trading firms still use the latest standard with the latest compiler. Just a small example of new C++ features that are used every day in those firms:

    Smart pointers (C++11), Constexpr and consteval (all improvements since C++11), Concepts (C++20), Spans (C++20), Optional (C++17), String views (C++17)
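
    For a flavor of how several of these compose in ordinary code (an illustrative sketch, not from any particular firm's codebase):

        #include <concepts>
        #include <optional>
        #include <span>
        #include <string_view>

        // Concepts (C++20) constrain the element type; span (C++20) views
        // a buffer without owning it; optional (C++17) makes "not found"
        // explicit instead of relying on a sentinel value.
        template <std::integral T>
        std::optional<T> first_positive(std::span<const T> values) {
            for (T v : values)
                if (v > 0) return v;
            return std::nullopt;
        }

        // consteval (C++20) guarantees compile-time evaluation.
        consteval std::size_t tag_length(std::string_view s) { return s.size(); }
        static_assert(tag_length("fill") == 4);

        int main() {
            int ticks[] = {-3, 0, 7};
            return first_positive<int>(ticks).value_or(0);  // returns 7
        }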

    • > I'm not a fan of the C++26 linalg stuff.

      I don't agree at all. For most, linear algebra is the primary reason they pick up C++. Up until now, the best option C++ newbies had was to go through arcane processes to onboard a high-performance BLAS implementation, which then requires even more arcane steps such as tuning.

      With C++26, anyone can simply jump into implementing algorithms.
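
      A sketch against the adopted P1673 interface (library support is still rolling out, so treat the details as illustrative):

          #include <linalg>   // C++26
          #include <mdspan>   // C++23
          #include <vector>

          int main() {
              std::vector<double> a(4, 1.0), x(2, 2.0), y(2, 0.0);
              std::mdspan A(a.data(), 2, 2);   // 2x2 row-major matrix view
              std::mdspan X(x.data(), 2);
              std::mdspan Y(y.data(), 2);
              // y = A * x (the BLAS GEMV operation), with no external
              // BLAS onboarding or tuning step required.
              std::linalg::matrix_vector_product(A, X, Y);
          }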

      If anything, BLAS support was conspicuously missing from C++ (and also C).

      This blend of comments is all the more perplexing given that a frequent criticism of C++ is its spartan standard library, and that a selling point of commercial packages such as Matlab is that, unlike in C++, linear algebra work is trivial.
