Comment by bjourne

1 day ago

There are many low-level devices where initialization is very expensive. It may mean that you need two passes through memory instead of one, making whatever code you are running twice as slow.
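
A sketch of that two-pass pattern (device_read and the frame layout are hypothetical): the memset below stands in for what language-mandated default initialization would insert, so the CPU walks the buffer once to zero it and a second time to fill it with the data it actually wanted.

    #include <stddef.h>
    #include <string.h>

    // Hypothetical driver routine that fills dst with up to len bytes.
    size_t device_read(unsigned char *dst, size_t len);

    size_t read_frame(unsigned char *frame, size_t len) {
        memset(frame, 0, len);           // pass 1: forced zero-fill
        return device_read(frame, len);  // pass 2: the bytes we care about
    }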

I would argue that these cases are pretty rare, and you could always get nominal performance back with the __noinit hint, though I think this would seldom even be needed.
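
For what it's worth, recent Clang (and, I believe, GCC 12+) already ships this shape of escape hatch: -ftrivial-auto-var-init=zero zero-fills automatic variables, and a per-variable attribute opts back out. A sketch (the buffer size is arbitrary):

    // Build with: clang -O2 -ftrivial-auto-var-init=zero read_chunk.c
    #include <unistd.h>

    ssize_t read_chunk(int fd) {
        // Hot-path buffer: opt out of the automatic zero-fill, since
        // read() overwrites the bytes the caller will actually consume.
        __attribute__((uninitialized)) char buf[64 * 1024];
        return read(fd, buf, sizeof buf);
    }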

If you have instances of zero-initialized structs where you set individual fields after the initialization, all modern compilers will already elide the dead stores in the typical cases anyway, and data of a relevant size that is supposed to stay uninitialized for long is rare and, in my opinion, a bit of an anti-pattern anyway.
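
A minimal sketch of that struct case (names hypothetical): at -O2, GCC and Clang typically compile this down to two stores, eliding the zeroing of the fields that are immediately overwritten.

    #include <stdint.h>

    struct point { int32_t x, y; };

    struct point make_point(int32_t x, int32_t y) {
        struct point p = {0};  // zero-initialize the whole struct...
        p.x = x;               // ...then overwrite both fields; the zero
        p.y = y;               // stores are dead and typically elided
        return p;
    }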

Meh, the compiler can almost always eliminate the spurious default initialization, because it can prove that the first use of the variable is the real initialization storing to it. The only time the redundant initialization will be emitted by an optimizing compiler is when it can't prove it's redundant.
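
A sketch of both sides (next_id is hypothetical): in f the zero store is provably dead because every path overwrites x before the first read; in g the compiler has to keep it, since whether next_id runs is only known at run time.

    int next_id(void);  // assumed external function

    int f(void) {
        int x = 0;      // provably redundant: always overwritten below
        x = next_id();
        return x;
    }

    int g(int flag) {
        int x = 0;      // must be emitted: still live when flag == 0
        if (flag)
            x = next_id();
        return x;
    }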

I think the better reason to not default initialize as a part of the language syntax is that it hides bugs.

If the developer's intent is that the correct initial state is 0, they should just explicitly initialize to zero. If they haven't, then they must intend the correct initial state to be the dynamic one in their code, and the compiler silently slipping in a 0 in cases the programmer overlooked is a missed opportunity to detect a bug caused by the programmer under-specifying the program.
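
A sketch of the kind of bug that gets hidden (names hypothetical): the PARSING case forgets to assign rc. As the code stands, GCC can flag the path with -Wmaybe-uninitialized (Clang has a similar -Wsometimes-uninitialized), and Valgrind catches it at run time; if the language silently zeroed rc, the function would instead quietly report success.

    enum state { IDLE, PARSING, DONE };

    int status_code(enum state s) {
        int rc;                            // deliberately not initialized
        switch (s) {
        case IDLE:    rc = -1; break;
        case PARSING: /* bug: rc never assigned */ break;
        case DONE:    rc = 0;  break;
        }
        // Warnable today; implicit zeroing would silently turn
        // the bug into rc == 0, i.e. "success".
        return rc;
    }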

  • In recent years I've come to rely on this non-initialization idiom. Both because the compiler can warn for simple cases as code paths change, and because running tests under Valgrind catches the rest, as in the sketch below.
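
    A sketch of that workflow (names hypothetical): the compiler stays quiet here because the writes happen behind a function call, but running the binary under valgrind reports "Conditional jump or move depends on uninitialised value(s)" at the marked line.

        #include <stdlib.h>

        static void fill_three(int *v) { v[0] = v[1] = v[2] = 7; }

        int main(void) {
            int *v = malloc(4 * sizeof *v);  // v[3] is never written
            fill_three(v);
            int bad = (v[3] > 0);            // valgrind flags this read
            free(v);
            return bad;
        }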

  • It only works for simple variables, where initialisation to 0 is counterproductive because you lose a useful compiler warning (about use of an uninitialised variable).

    The main case is arrays. There it's often impossible to prove whether some part of the array is read before initialisation, so there is no warning. It becomes a tradeoff: potentially costly initialisation (arrays can be very big) versus potentially reading random values other than 0. See the sketch below.
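
    The tradeoff in miniature (a small array for readability; imagine it much larger): without the = {0}, compilers typically emit no warning even though hist[buf[i]]++ reads slots that were never written; with it, you pay a full extra pass over the array before the real work starts.

        #include <stddef.h>

        void histogram(const unsigned char *buf, size_t n, unsigned out[256]) {
            unsigned hist[256];           // no warning, but slots start as garbage
            // unsigned hist[256] = {0};  // safe, at the cost of an extra pass
            for (size_t i = 0; i < n; i++)
                hist[buf[i]]++;           // may increment an uninitialized slot
            for (int i = 0; i < 256; i++)
                out[i] = hist[i];
        }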

    • Fair point, though compilers could presumably do much better warning there on arrays: at least treat the whole array like a single variable and warn when it knows you've read from it without ever writing to it.
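
      A sketch of what such a whole-array check could catch (checksum is hypothetical): acc is read in the loop but never written anywhere in the function, which is detectable without tracking individual elements.

          int checksum(void) {
              int acc[4];              // never written in this function
              int s = 0;
              for (int i = 0; i < 4; i++)
                  s += acc[i];         // whole-array read with zero writes:
              return s;                // the proposed warning would fire here
          }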