Speaking seriously, I agree there's definitely a lot of bloat in the new C++ standards. E.g. I'm not a fan of the C++26 linalg stuff. But most performance-focused trading firms still use the latest standard with the latest compiler. Just a small sample of new C++ features that are used every day at those firms:
Smart pointers (C++11), constexpr (C++11, extended in every standard since) and consteval (C++20), concepts (C++20), std::span (C++20), std::optional (C++17), std::string_view (C++17)
I don't agree at all. For most, linear algebra is the primary reason they pick up C++. Up until now, the best option C++ newbies had was to go through arcane processes to onboard a high-performance BLAS implementation, which then required even more arcane steps such as tuning.
With C++26, anyone can simply jump into implementing algorithms.
If anything, BLAS support was conspicuously missing from C++ (and also C).
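As a sketch of what that looks like: C++26's `std::linalg` (P1673) layers BLAS-style operations over `std::mdspan` (C++23). Few toolchains ship it yet, so treat this as illustrative of the proposal's interface rather than code you can compile today:

```cpp
// Sketch only: std::linalg (C++26, P1673) over std::mdspan (C++23).
// Most compilers do not ship <linalg> yet; names follow the proposal.
#include <linalg>
#include <mdspan>
#include <vector>

int main() {
    constexpr std::size_t n = 3;
    std::vector<double> a_data(n * n, 1.0), b_data(n * n, 2.0), c_data(n * n);

    // Non-owning 2D views over contiguous storage.
    std::mdspan A(a_data.data(), n, n);
    std::mdspan B(b_data.data(), n, n);
    std::mdspan C(c_data.data(), n, n);

    // BLAS-style GEMM without onboarding an external BLAS: C = A * B.
    std::linalg::matrix_product(A, B, C);
}
```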
This blend of comments is all the more perplexing given that a frequent criticism of C++ is its spartan standard library, and that a selling point of commercial packages such as Matlab is that, unlike in C++, linear algebra work is trivial.
Except the devil is in the details, as usual: the way std::linalg is specified doesn't guarantee numerical stability across library implementations or compilers.
Just like the std::random mess, most people are in for a surprise when they attempt to write portable numeric code with it.
> I don't agree at all. For most, linear algebra is the primary reason they pick up C++.
Out of the hundreds of projects I've interacted with, maybe fewer than 1% used linear algebra in any non-basic capacity (e.g. anything more than multiplying two 4x4 matrices) and had to pull in Eigen or BLAS.
You sound like you subscribe to "Orthodox C++".
> This blend of comments is all the more perplexing given that a frequent criticism of C++ is its spartan standard library
The frequency doesn't make the criticism any more valid, and those repeating it would be better served by letting go of their fear of third-party libraries.