Comment by tw061023
3 days ago
It's the other way around. You are the real programmer, and the committee and the "modern C++" crowd are more interested in playing with Legos than in shipping actual software.
No way anything std::meta-based gets into serious production; too flexible in some ways, too inflexible in others, too much unpredictability, too high an impact on compilation times - just like always with newer additions to the C++ standard. It takes one look at the coding standards of real-world projects to see how irrelevant this stuff is.
And as always, the problem std::meta is purported to solve has been solved for years.
The stream of modern C++ features has been a godsend for anyone who cares about high-performance, high-reliability software. Maybe that doesn't apply to your use case, but C++ is widely used in critical data infrastructure. For anyone who does care about things like performance and reliability, the changes in modern C++ have largely been obvious and immediately useful improvements. Almost all C++ projects I know of in the high-performance data infrastructure space live as close to the bleeding edge of new C++ features as the compiler implementations make feasible.
And no, reflection hasn’t “been solved for years” unless you have a very misleading definition of “solved”. A lot of the C++ code I work with is heavily codegen-ed via metaprogramming. Despite the relative expressiveness and flexibility of C++ metaprogramming, proper reflection will dramatically improve what is practical in a strict and type-safe way at compile-time.
You sound like you have rose-tinted glasses on. Recheck the actual versions and features: I think your glass is half full, and mine is half empty over in gamedev.
Anecdata: a year or so ago I was in a discussion about whether the beta C++20 features on our platforms were good enough to use at large scale. Multiple platforms make it not a sum but an intersection of partial implementations. Anyway, it looked positive until we needed a pilot project to try it out. One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'. After confirming that it was indeed not an error on our side, the conclusion was kind of obvious: a proportional increase in remote compilation cloud costs for a few minor features is a 'no'. A year later the support is no longer beta but still partial across platforms, and there have been no build-time improvements from the community. YMMV, of course, because gamedev mostly targets closed-source platforms with a closed set of build tools.
> One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'.
I think this just proves that your team is highly inexperienced with C++ projects, which you implicitly attest to by admitting this was the first C++ upgrade you had to go through.
Let me be very clear: there is never an upgrade of the C++ version targeted by a project that does not require full regression tests and a few bugs to squash. Why? Because even if the C++ side of things is perfectly fine, libraries often introduce all sorts of unexpected issues.
For example, once I had to migrate a legacy project to C++14, and flipping the compiler flag to C++14 caused a wall of compiler errors. It turned out our C++ code was perfectly fine, but a single library behaved very poorly with a constexpr constructor it enabled conditionally under C++14.
You should understand that upgrades to the core language and standard library are exceptionally stable, and a clear focus of the standardization committee. But they only have a say in how the core language and standard library should be. The bulk of the code any relatively complex project consumes is not core language + stdlib but third-party libraries and frameworks. These are often riddled with flags that toggle whole components only in specific versions of the C++ language, mainly for backwards compatibility. Once you target a new version of C++, that often means you replace whole components of upstream dependencies. This often requires fixing your code. This happens very frequently, even with the likes of Boost.
So, what you're complaining about is not C++ but your inexperience in software engineering in general. I mean, what is the rule of thumb about major version upgrades?
2 replies →
> Proportional increase in remote compilation cloud costs for a few minor features is a 'no'.
How high are those compilation costs compared to the developer time that might be saved by even minor features?
1 reply →
> One of the projects came back with 'just flipping the C++20 switch with no changes causes a significant regression in build times'
Given that C++20 introduced modules, which are intended to make builds faster, I think just flipping the C++20 switch with no changes and checking build times should not be the end of evaluating whether C++20 is worth it for your setup.
1 reply →
I still have to learn C++20 concepts and now we have a full-fledged reflection system?
Good, but I think what happens is there are people on the bleeding edge of C++, usually writing libraries that ship with new code. Each new feature is a godsend for them -- it's the reason why the features are proposed in the first place. It allows you to write libraries more simply, more generally, more safely, and more efficiently.
The rest of us are dealing with old code that is a hodgepodge of older standards and toolchains, that has to run in multiple environments, mostly old ones. It's like yeah, this C++26 feature will come in handy for me someday, but if that day comes then it will be in 2036, and I might not be writing C++ by then.
>The rest of us are dealing with old code that is a hodgepodge of older standards and toolchains, that has to run in multiple environments, mostly old ones. It's like yeah, this C++26 feature will come in handy for me someday, but if that day comes then it will be in 2036, and I might not be writing C++ by then.
Things seem to be catching up. I had the same view up until recently, but now I'm able to use most of the C++23 features on an embedded platform (granted, some are still missing, as we're limited to GCC 11.2).
I am interested; could you provide some links, articles, etc?
[flagged]
You sound like you subscribe to "Orthodox C++".
Speaking seriously, I agree there's definitely a lot of bloat in the new C++ standards. E.g. I'm not a fan of the C++26 linalg stuff. But most performance-focused trading firms still use the latest standard with the latest compiler. Just a small example of new C++ features that are used every day in those firms:
smart pointers (C++11), constexpr and consteval (and all the improvements since C++11), concepts (C++20), spans (C++20), optional (C++17), string views (C++17)
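To make that concrete, here's a tiny hedged sketch (names and the scenario are made up for illustration, not taken from anyone's production code) showing a few of the features on that list working together:

    // Illustrative only: concepts, span, optional, string_view and constexpr
    // combined in one small compile-time-checked utility.
    #include <array>
    #include <concepts>
    #include <optional>
    #include <span>
    #include <string_view>

    // Concepts (C++20): constrain the element type without enable_if tricks.
    template <std::integral T>
    // Spans (C++20): a non-owning view over contiguous data, no copies.
    constexpr std::optional<T> first_positive(std::span<const T> values) {
        for (T v : values) {
            if (v > 0) return v;        // Optional (C++17): "maybe a result"
        }
        return std::nullopt;
    }

    // String views (C++17): cheap, non-owning string slices.
    constexpr bool has_prefix(std::string_view s, std::string_view prefix) {
        return s.substr(0, prefix.size()) == prefix;
    }

    // Constexpr (C++11 onward): both functions also run at compile time.
    constexpr std::array<int, 3> vals{-1, 0, 3};
    static_assert(first_positive<int>(vals) == 3);
    static_assert(has_prefix("std::meta", "std::"));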
9 replies →
Prediction: it will be used heavily for things like command-line argument parsing, configuration files, deserialization, and reflection into other languages. It will probably be somewhat of a pain to use, but better than the current alternative mashup of macros/codegen/template metaprogramming that we have now for some of these solutions. It will likely mostly be used in library code, where someone defines some nice utilities that do something useful for you, so that you don't have to worry about it. For the most part I don't think it has to hurt compile times - it might even be faster than the current mess, since it means less use of templates.
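For the flavor of it, here is a rough sketch of the kind of member-dumping utility people have in mind, written against the P2996 proposal as published; the exact spellings (the <meta> header, nonstatic_data_members_of and whether it takes an access-context argument, identifier_of, `template for`) may differ in the final C++26 wording and aren't fully implemented in shipping compilers yet, so treat it as illustrative rather than definitive:

    // Sketch based on the P2996 reflection proposal; names and signatures may
    // differ in the final standard and in actual compiler implementations.
    #include <meta>       // assumed location of std::meta
    #include <iostream>
    #include <string>

    struct Options {
        std::string host = "localhost";
        int         port = 8080;
        bool        verbose = false;
    };

    // Print every member's name and value with no macros or codegen:
    // the member list and the names come from the compiler itself.
    template <class T>
    void dump(const T& obj) {
        // ^^T reflects the type; the expansion statement iterates its members.
        template for (constexpr auto m :
                      std::meta::nonstatic_data_members_of(^^T)) {
            std::cout << std::meta::identifier_of(m)     // reflected name
                      << " = " << obj.[:m:] << '\n';     // splice it back in
        }
    }

    int main() {
        dump(Options{});   // prints each field name and its default value
    }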
I don't think the "legos" vs "shipping" debate here is really valid. One can write any type of code in any language. I'm a freak about C++, but if someone wants to ship in Python or JS, the more power to them - one can write code that's fast enough to not matter, but takes advantage of those languages' special features.
I know the trading firm I work at will be making heavy use of reflection the second it lands… we had a literal party when it made it into the standard.
Sure, but Instagram was created by a handful of people with Python and got a billion-dollar exit in 2012.
What does that have to do with the topic? Warren Buffett made billions without any knowledge of programming or any deeper knowledge of computers.
1 reply →
> Sure, but Instagram was created by a handful of people with Python and got a billion-dollar exit in 2012.
Facebook famously felt compelled to hire eminent C++ experts to help them migrate away from their PHP backend. I still recall reading posts on the Instagram Engineering blog on how and where they used C++.
3 replies →
What is this culture of judging everything by the amount of money it makes?
No one needs a billion dollars; it is practically irrelevant unless you are running on greed.
2 replies →
And YouTube used Python almost exclusively at the start, AFAIK.
Then again Scott Meyers said he's never written a C++ program professionally.
1 reply →
> And as always, the problem std::meta is purported to solve has been solved for years.
It is rare to read something more moronic than that
The Rust equivalent of std::meta (procedural macros) is heavily used everywhere, including in serialization frameworks, debuggers, and tracers.
And that's not surprising at all: compile-time introspection is much more powerful and lightweight than codegen for exactly the same usage.
> It is rare to read something more moronic than that
It's not actually wrong though, is it? Real codebases have been implementing reflection and introspection through macro magic etc. for decades at this point.
I guess it's cool that they want to fix it in the language, but as always, the approach is to make the language even more complex than it already is - e.g. two new operators (!) in the linked article.
> been implementing reflection and introspection through macro magic etc. for decades at this point.
Having a flaky pile of junk as an alternative has never been an excuse not to fix the problem properly.
Every proper modern language (Rust, Kotlin, Zig, Swift, even freaking Golang) has a form of runtime reflection or static introspection.
Only C++ does not. Historically it was done with a mess of macros or a pre-compiler (qt-moc), all of which come with an entire pile of issues.
> the approach is to make the language even more complex than it already is - e.g. two new operators
The problem of rampant complexity in C++ is not so much the new features, when they bring something useful and make sense.
It is the language's inability to remove the old stuff even when there is a consensus that it is garbage (e.g. iostreams).
1 reply →
I embrace modern C++, but more slowly than the committee: only once the big three compilers have the feature.
I really think reflection + annotations will give us the chance to have much better serialization and probably something similar to Python decorators.
That will be plenty useful, and it is going to transform part of the C++ ecosystem. For example, I am thinking of editors that need to reflect on data structures, web frameworks such as Crow or Drogon, database access libraries...
I bet CERN might eventually replace their Python based code generators with C++26 reflection.
Which problem would this solve for them?
It would standardize something they've done in an ad-hoc way for decades. They have a library called "reflex" [1] which adds some reflection, and which was (re)written by cannibalizing a lot of LLVM code. They actually use the reflection to serialize a lot of the LHC data.
It's kind of neat that it works. It's also a bit fidgety: the cannibalized code can cause issues (which, e.g. prevented C++11 adoption for a while in some experiments), and now CERN depends on bits of an old C++ compiler to read their data. Some may question the wisdom of making a multi-billion dollar dataset without a spec and dependent on internals of C++ classes (indeed experiments are slowly moving to formats with a clear spec), but for sure having a standard for reflection is better than the home-grown solution they rely on now.
[1]: https://indico.cern.ch/event/408139/contributions/979831/att...
2 replies →
The two-language problem, a well-known issue in engineering tradeoffs.
1 reply →
> No way anything std::meta gets into serious production
Rust proc macros get used in serious production, even though they're quite slow to compile. Sure, std::meta is probably a bit clunkier, but that's expected from new C++ features as you say.
Sadly, Rust proc macros operate on tokens and any serious macro implementation needs third-party crates.
Compile-time reflection with a good built-in API, akin to C#'s Roslyn, would be a real boon.
Any serious anything needs third-party crates. Coming from C++, this has been the most uncomfortable aspect of Rust for me, but I am acclimating.
Every problem is solved. We should stop making anything. Especially CRUD apps, because how is that even programming? What does it solve that hasn't been solved?
This line of thinking is not productive. It is a mistake to see yourself as what you do, because then you're cornering yourself into defending it, no matter what.
> the problem std::meta is purported to solve has been solved for years.
What solution is that? A Python script that spits out C++ code?
Yeah, wait till you find out what's behind the curtain in your web engine and AI.
Hint: it's C++, and yes, it will eventually use stuff like std::meta heavily.
If you checked my comments, you would see I am quite aware. And no, it will not, just as it went with streams, ranges, and whatever else.
What's the solution that's been around for years?
> ... just like always with newer additions to the C++ standard.
This is objectively laughable.
I was literally running into something a couple of days ago on my toy C++ project where basic compile-time reflection would have been nice to have for some sanity checking.
And even if it's true that some things can be done already with specific compilers and implementation-specific hacks, it would be really nice to be able to do those things more straightforwardly.
My experience with C++ changes has been that the recent additions to compile-time metaprogramming improve compile times rather than make them worse, because you don't have to resort to things like std::enable_if<> hacks and recursive templates for tasks that a simple generic lambda or a constexpr conditional will handle - techniques that are more difficult for both you and the compiler.
Constexpr if and fold expressions have been a godsend!
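For anyone who hasn't made the jump yet, here is a small illustrative before/after (names invented for the example) of what those two features replace:

    // Pre-C++17: summing a pack meant a recursive template plus a base case,
    // and type-based dispatch meant std::enable_if overload sets.
    #include <iostream>
    #include <string>
    #include <type_traits>

    template <class T>
    T sum_old(T t) { return t; }
    template <class T, class... Rest>
    T sum_old(T t, Rest... rest) { return t + sum_old(rest...); }

    // C++17: one fold expression does the same job.
    template <class... Ts>
    auto sum(Ts... ts) { return (ts + ...); }

    // C++17: `if constexpr` replaces the enable_if overload set; the branch
    // that doesn't apply is discarded, so std::to_string(value) never has to
    // compile for non-arithmetic types.
    template <class T>
    std::string describe(const T& value) {
        if constexpr (std::is_arithmetic_v<T>)
            return "number: " + std::to_string(value);
        else
            return "something else";
    }

    int main() {
        std::cout << sum(1, 2, 3, 4) << '\n';              // 10
        std::cout << describe(42) << '\n';                 // number: 42
        std::cout << describe(std::string{"hi"}) << '\n';  // something else
    }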
The history of C++ has been one long loop of:
1. So many necessary common practices of C++ are far too complicated!
2. Std committee adds features to make those practices simpler.
3. C++ keeps adding features. It’s too big. They should cut out the old stuff!
4. The std committee points at the decade-long Python 3 fiasco.
5. Repeat.
2 replies →
> What's the solution that's been around for years?
Build tools that generate C++ code from some other source. Interface description languages, for example, or (going back decades here) even something like lex and yacc.
Great. But you can do anything you want by generating code. Why not have a standard solution instead of everyone doing their own, possibly buggy thing complicating their build process even more?
5 replies →
Debugging or modifying code generated by someone's undocumented C++ code generator is pretty close to the top of my list of unpleasant things to do. Yes, you can eventually figure out what to do by looking at the generated code, taking apart the code generator, and working out how it all fits together, but I'll take built-in language features any day.
I've been down this road. I ended up with a config YAML (basically an IDL) that goes into a pile of Jinja files and C++ templates, and it always turned out better and easier to read when I minimized the amount of Jinja (broken syntax highlighting, the fact that you are writing meta-meta-code; it's a hot mess). I'd much prefer to generate some bare structs with some minimal additional inline metadata than to generate both those structs and an entire separate set of structs describing the first ones. std::meta lets me do the former; the latter is what's possible right now.
For example, the Boost library's "Describe" and similar macro-based solutions. I've been using this for many years.
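For readers who haven't seen it, this is roughly what the macro-based approach looks like with Boost.Describe - a from-memory sketch, so check the library docs for the authoritative spellings:

    // Sketch of the Boost.Describe approach (verify names against the docs).
    #include <boost/describe.hpp>
    #include <boost/mp11.hpp>
    #include <iostream>

    struct Point {
        int x;
        int y;
    };
    // The macro records the members so they can be enumerated later.
    BOOST_DESCRIBE_STRUCT(Point, (), (x, y))

    template <class T>
    void dump(const T& obj) {
        using members = boost::describe::describe_members<
            T, boost::describe::mod_any_access>;
        // Walk the compile-time member list; each descriptor D carries the
        // member's name and its pointer-to-member.
        boost::mp11::mp_for_each<members>([&](auto D) {
            std::cout << D.name << " = " << obj.*D.pointer << '\n';
        });
    }

    int main() {
        dump(Point{1, 2});   // x = 1, y = 2
    }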
Whip up some kind of in-house IDL/DDL parser, codegen from that.
Which additions, precisely, do not fit my points?
Completely inadequate for many use cases. IDL/DDL is one of the least interesting things you could do with reflection in C++. You can already do a lot of that kind of thing with existing metaprogramming facilities.
1 reply →
Most of the time, I will prefer standard C++ over a full hand-made layer of complexity that needs maintenance.
> It's the other way around. You are the real programmer, and the committee and the "modern C++" crowd are more interested in playing with Legos than in shipping actual software.
I think this is the most clueless comment I have ever read on HN. I hope the site is not being hit with its own blend of Eternal September.
I was going to explain how fundamentally wrong your comment is, but it's better to just kindly ask you to post on Reddit instead.