Comment by midnightclubbed

6 months ago

I have used auto liberally for 8+ years; maybe I'm accustomed to reading code containing it, but I really can't think of it being a problem. I feel like auto increases readability; the only thing I dislike is that they didn't make it a reference by default.

Where do you see difficult-to-track-down performance/memory implications? Lambdas come to mind, and maybe coroutines (I have yet to use them, but I'm guessing there may be some memory allocations under the hood). I like that I can breakpoint my C++ code and look at the disassembly if I'm concerned that the compiler did something other than expected.

I just wish they hadn't repurposed the old "auto" keyword from C and had used a new keyword like "var" or "let".

   #define var auto
   #define let auto

  • Given how important backwards compatibility is for C++, it's either take over a basically unused keyword or come up with something so weird that it would never appear in existing code.

    Java solved this by making var a reserved type name rather than a keyword, but I don't know if that's feasible for C++.

E.g. `std::ranges::for_each`, where the lambda captures a bunch of variables by reference. I would hope the compiler optimizes this to be the same as a regular loop, but can I be certain it ends up as good as a good old for loop?
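A minimal sketch of the kind of comparison being described (the function names, the threshold parameter, and the accumulation logic are made up for illustration): a `std::ranges::for_each` call whose lambda captures locals by reference, next to the plain loop one would hope it compiles down to.

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Hypothetical example: the lambda captures total and count by reference.
    double average_above_ranges(const std::vector<double>& values, double threshold) {
        double total = 0.0;
        std::size_t count = 0;
        std::ranges::for_each(values, [&](double v) {  // [&] captures locals by reference
            if (v > threshold) {
                total += v;
                ++count;
            }
        });
        return count ? total / count : 0.0;
    }

    // The plain for loop one would hope the version above optimizes down to.
    double average_above_loop(const std::vector<double>& values, double threshold) {
        double total = 0.0;
        std::size_t count = 0;
        for (double v : values) {
            if (v > threshold) {
                total += v;
                ++count;
            }
        }
        return count ? total / count : 0.0;
    }

One way to answer the "can I be certain" question for a specific compiler and flags is to compare the generated assembly of the two functions (e.g. on Compiler Explorer), rather than taking the equivalence on faith.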

  • To be fair, std::ranges seems like the biggest mistake the committee has allowed into the language recently.

    Effectively, other than rewriting older iterator-based algorithms to use the new ranges iterators, I just don't use std::ranges... Likely the compiler cannot optimise it as well (yet), and all the edge cases are not worked out yet. I also find it quite difficult to reason about compared to the older iterator-based algorithms.

    for_each takes a lambda and calls it for each element in the iterator range. If the compiler can optimise it, it becomes a loop; if it can't, it becomes a function call in a loop, which probably isn't much worse... If for some reason the lambda needs to allocate per iteration, it's going to be a performance nightmare.

    Would it really be much harder to take that lambda, move it to a templated function that takes an iterator, and call it the old-fashioned way? (See the sketch at the end of this thread.)

    • Yeah, the std::ranges implementation is a bit of a mess. The inability to start clean, unburdened by backward compatibility, limits what is possible. I think most people can see how you could implement comparable functionality with nicer properties from a clean sheet of paper. It is the curse of being an old language.


  • Just ban the ranges lib; it's hot garbage anyway. Compilers are able to optimize lambdas fairly well nowadays (when inlined), so I wouldn't be that concerned.
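A minimal sketch of the "templated function that takes an iterator" idea asked about above (the function name apply_each and the usage are made up; this is essentially what the classic `std::for_each` already does). A capturing lambda passed to it is just a small compiler-generated struct with an operator(), so nothing allocates unless the lambda body itself allocates.

    #include <iostream>
    #include <vector>

    // Hand-rolled, old-fashioned equivalent of std::for_each: a templated function
    // taking an iterator pair and a callable, invoked once per element.
    template <typename It, typename Fn>
    void apply_each(It first, It last, Fn fn) {
        for (; first != last; ++first) {
            fn(*first);  // the lambda is an ordinary function object; no allocation here
        }
    }

    int main() {
        std::vector<int> v{1, 2, 3, 4};
        int sum = 0;
        apply_each(v.begin(), v.end(), [&](int x) { sum += x; });  // capture by reference
        std::cout << sum << '\n';  // prints 10
    }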