
Comment by yaakov34

12 years ago

I'd disagree about C++. In my experience, the only things it adds are (1) a false sense of security (since the compiler will flag so many things which are not really big problems, but will happily ignore most overrun issues), (2) lots of complicated ways to screw up, such as not properly allocating/deleting things deep in some templated structure, and (3) interference with checking tools - I got way more false positives from Valgrind in C++ code than in C.

I wish godspeed to Rust and any other language which doesn't expose the raw underlying computer the way C/C++ does, which is IMO insane for application programming.

I'd disagree with your disagreement ;-) C++ has constructs that let you build safer and more secure systems while maintaining good performance. You can write high performance systems without ever touching a raw pointer or doing manual memory management, which is something you can't really do in any other language. Yes, you need to trust the underlying library code, but that's something you have to do in pretty much any other language anyway.

In my experience, well written C++ has far fewer security issues than similar C code. We've had third party security companies audit our code base, so my statement has some anecdotal support from real world systems.

  • I second this point. The keyword here is "modern" C++, which encourages people to write shared-memory semantics, and to create strategies that make it impossible to screw up.

    "New" is a thing of the past, along with the need to even see T *.

    These days good code in C++ is so high-level, one almost never sees a pointer, much less an arbitrarily-sized one. This is of course unless you're dealing with an ancient library.

    Another thing of importance:

    If you're handling collection iteration correctly (which tends to be the basis for a lot of these out-of-bounds errors), there is no contesting the beginning offset or the end offset - much less evaluating whether that end is in sight. Even comparisons that lack stronger types saying "this thing is null-terminated vs. that one is not" can be eliminated if you just create enough definitions for the type system to defend your own interests. If you're coding defensively, these shouldn't even be on the menu. One buffer either fits in the other for two compatible types, or you simply don't try it at all.

    It's amazing what the standard algorithms could leave to the pages of history, if they were actually put to good use.

    Python has some nice high-level concepts with their "views" on algorithmic data streams that show where modern C++ is headed with respect to bounds checking, minus the exceptions of course :)
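    A hedged sketch of that "either it fits or you don't try it" idea, using std::array so the sizes live in the type system (copy_into is a made-up name for illustration):

    ```cpp
    #include <algorithm>
    #include <array>
    #include <cassert>

    // Copy src into dst only if the types prove it fits: for a
    // too-small destination this refuses to compile at all.
    template <typename T, std::size_t N, std::size_t M>
    void copy_into(const std::array<T, N>& src, std::array<T, M>& dst) {
        static_assert(M >= N, "destination too small: refused at compile time");
        std::copy(src.begin(), src.end(), dst.begin());
    }

    int main() {
        std::array<int, 3> a{1, 2, 3};
        std::array<int, 4> b{};
        copy_into(a, b);         // fine: 3 <= 4
        // copy_into(b, a);      // would not compile: 4 > 3
        assert(b[2] == 3);
    }
    ```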

    • But the problem is that there are people who have been coding C++ for 20 years, who never quite got all of these newfangled smart pointer things and just want the "C with classes" parts of C++.

      Or you have the cargo cult programmers, who don't know the language very well, so just pick up idioms from random internet postings or from some old part of the codebase that's been working for a while, so it must be a good source of design tips.

      Remember, any time you talk about the safety of a language, you have to think about its safety in the hands of a mediocre programmer who's in a hurry to meet a deadline. Because that's where the bulk of the problems slip in; not when people are actively and deliberately coding defensively.

  • Does anyone have experience with (auditing) systems built using Pascal/Delphi? I realize that Ada might be a better choice if headed in that direction -- but it always felt to me like Pascal actually had a pretty sane trade-off between high and low level, and that C generally gave a lot of the bad parts of assembly while still managing to obfuscate the code, without really giving that much in terms of higher level structure.

    • Pascal gets a bad rap, but that's due to the limitations of the original "pure" language. Something like Turbo Pascal, which took features from Modula-2, is actually a very good alternative to C for systems programming:

      - Stricter type checking than C (e.g. it's an error to add a centigrade value to a Fahrenheit value without casting)

      - Bounds checked arrays and strings

      Turbo Pascal added:

      - Selectively turn off bounds checking for performance

      - Inline assembler

      - Pointers and arbitrary memory allocation

      I don't think there's anything you can do in C that you can't do in TP. For example, it was easy to write code that hung off the timer or keyboard interrupts in MSDOS, which is pretty low level stuff.

      The important thing is that the safe behaviour is the default; you have to mark unsafe areas of code explicitly. This is the opposite of how it works in C.
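      For comparison, C++ offers checked access but defaults to the unchecked form: std::vector::at throws on a bad index, while the idiomatic operator[] silently invokes undefined behaviour. A small sketch:

      ```cpp
      #include <cassert>
      #include <stdexcept>
      #include <vector>

      int main() {
          std::vector<int> v{10, 20, 30};

          // Checked access: throws std::out_of_range instead of reading
          // past the buffer.
          bool caught = false;
          try {
              (void)v.at(5);
          } catch (const std::out_of_range&) {
              caught = true;
          }
          assert(caught);

          // v[5] would be undefined behaviour: the unchecked form is the
          // default, the opposite of Turbo Pascal's checked-by-default arrays.
          assert(v.at(1) == 20);
      }
      ```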

To be clear, Rust _does_ expose the raw underlying computer the way C/C++ does, it's just off by default rather than on.

>(2) lots of complicated ways to screw up, such as not properly allocating/deleting things deep in some templated structure

Wow, that sounds scary. Do you have any references or further reading about this?

  • I don't have a reference, but I've seen this myself. Problems can happen whenever pointers are combined with the STL or other modern C++ features. In the thread from several years ago, I gave as an example pushing a pointer to a local variable into a vector which is then returned somewhere outside of scope. Compilers don't warn about this, or at least didn't then, although Valgrind catches it. And of course this can be a more complicated case, like a vector of maps from strings to structures containing pointers which is passed by reference somewhere - which will make it harder to catch. And Valgrind won't help if it doesn't see it happening in the execution path that you ran.

    Now, combining pointers and STL is not a good idea. In fact, using raw pointers in C++ is not a good idea, at least IMO (but you've seen I am a bit concerned about memory safety). However, this is perfectly supported by compilers, and not even seriously discouraged (some guides tell you to be careful out there). I've seen difficult bugs produced by this, in my case, in a complicated singleton object.
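    A minimal reproduction of the bug described above (illustrative only; make_dangling is a made-up name): a pointer to a stack local escapes its scope inside a vector, and the compiler accepts it without a warning:

    ```cpp
    #include <vector>

    // The bug: a pointer to a stack local is pushed into a container
    // that outlives it. This compiles cleanly; only the later read is
    // what tools like Valgrind can catch, and only on the executed path.
    std::vector<int*> make_dangling() {
        std::vector<int*> v;
        int local = 42;       // dies when this function returns
        v.push_back(&local);
        return v;             // v now holds a dangling pointer
    }

    int main() {
        auto v = make_dangling();
        // int x = *v[0];     // undefined behaviour: use-after-return
        return v.size() == 1 ? 0 : 1;
    }
    ```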

    • > In fact, using raw pointers in C++ is not a good idea, at least IMO

      Not just your opinion; it's become the "standard of practice" in the C++ development community.

      The aim is that between the STL, make_shared, C++14's make_unique, etc., you won't actually be using "naked new"s in any but the rarest cases. For the rest of memory management you'd use types to describe the ownership semantics and let the compiler handle the rest.
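      A short sketch of that style (Widget and make_widget are made-up names): ownership is expressed in the smart-pointer types, and no naked new appears anywhere:

      ```cpp
      #include <cassert>
      #include <memory>

      struct Widget { int id; };

      // Sole ownership in the return type; the unique_ptr deletes for us.
      std::unique_ptr<Widget> make_widget(int id) {
          return std::make_unique<Widget>(Widget{id});  // C++14
      }

      int main() {
          auto w = make_widget(7);                            // one owner
          auto s = std::make_shared<Widget>(Widget{9});       // shared owners
          assert(w->id == 7 && s->id == 9);
          // Both objects are freed automatically when the owners go
          // out of scope - no delete, no naked new.
      }
      ```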