Comment by seanbax

1 year ago

From P3465: "why this is a scalable compile-time solution, because it requires only function-local analysis"

From P1179: "This paper ... shows how to efficiently diagnose many common cases of dangling (use-after-free) in C++ code, using only local analysis to report them as deterministic readable errors at compile time."

Local analysis only: it does not look inside function definitions.
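
To make that concrete, here is a minimal sketch (my own example, not taken from either paper) of the kind of dangling a purely function-local check can diagnose: everything needed for the error is visible in a single function body, with no callee definitions consulted.

```cpp
#include <string>
#include <string_view>

std::string_view make_view() {
    std::string s = "hello world";  // local owner
    std::string_view v = s;         // v points into s's buffer
    return v;                       // s is destroyed on return, so the returned
}                                   // view dangles; a local check can report this
```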

Whole-program analysis is extremely complicated and costly to compute. It's not comparable to return type deduction or anything like that.

Whole-program analysis is also impossible in the common case of calling functions given only their declarations. The compiler sees the standard library and the source files it is compiling, not arbitrary external libraries that will be linked at a later stage: they might not exist yet, and in the case of dynamic linking they can even be replaced while the program is running.
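
As a sketch of that declaration-only situation (a made-up library function, not from the papers): the analysis of caller() below cannot know whether pick() returns a reference into one of its arguments, because the body may live in a library that has not even been built yet.

```cpp
#include <string>

// Declaration only: the definition is in some external library, possibly
// dynamically linked and replaceable after this code is compiled.
const std::string& pick(const std::string& a, const std::string& b);

const std::string& caller() {
    std::string local = "temporary value";
    return pick(local, local);  // dangles if pick() returns a reference into a
}                               // parameter, but caller()'s analysis cannot see that
```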

Making programmers manually annotate every single function is infinitely more costly.
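
For concreteness, this is roughly what per-signature annotation means in practice, shown here with Clang's existing [[clang::lifetimebound]] attribute as a stand-in (the papers' own notation may differ). The annotation on the declaration is what lets a caller-side, function-local check flag the dangling use without ever seeing the callee's definition.

```cpp
#include <string>

// Hypothetical library function: the attribute states that the result may
// refer into the annotated parameters, so call sites can be checked from
// the declaration alone.
const std::string& shorter(const std::string& a [[clang::lifetimebound]],
                           const std::string& b [[clang::lifetimebound]]);

const std::string& bad() {
    return shorter("abc", "defg");  // both temporaries die at the end of this
}                                   // full-expression, so the result dangles;
                                    // Clang can warn from the declaration alone
```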

  • That rather depends. Compile time certainly wouldn't scale linearly with the size of a function; you could well reach a scenario where adding a single line to a function adds a year to the compile time.

  • Are you also a proponent of nonlocal type inference? Do you think annotating types is too costly for programmers?