
Comment by foobar12345quux

6 months ago

Hi Rich, using ptrdiff_t is (alas) the right thing to do: pointer subtraction returns that type (which is signed), and if the result doesn't fit in it, you get UB.

Assume you successfully allocate an array "arr" with "sz" elements, where "sz" is of type "size_t". Then "arr + sz" is a valid expression (meaning the same as "&arr[sz]"), because it's OK to compute a pointer one past the last element of an array (but not to dereference it). Next you might be tempted to write "arr + sz - arr" (meaning the same as "&arr[sz] - &arr[0]"), and expect it to produce "sz", because it is valid to compute the element offset difference between two "pointers into an array or one past it". However, that difference is always signed, and if "sz" does not fit into "ptrdiff_t", you get UB from the pointer subtraction.
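To make that concrete, here is a minimal sketch of the valid pattern (the element type "int" and the count 1000 are mine, chosen purely for illustration):

    #include <stddef.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t sz = 1000;
        int *arr = malloc(sz * sizeof *arr);
        if (arr == NULL)
            return 1;

        int *end = arr + sz;      /* OK: one past the last element */
        ptrdiff_t n = end - arr;  /* well-defined here, because sz
                                     fits into ptrdiff_t */
        free(arr);
        return n == 1000 ? 0 : 1;
    }

Had "sz" exceeded PTRDIFF_MAX, the subtraction "end - arr" itself (not any dereference) would have been the undefined behavior.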

Given that the C standard (or even POSIX, AIUI) doesn't relate ptrdiff_t and size_t to each other, we need to restrict array element counts, before allocation, with two limits (see the sketch below):

- nelem <= (size_t)-1 / sizeof(element_type)

- nelem <= PTRDIFF_MAX

(Surprisingly, the standard header that #defines PTRDIFF_MAX is not <limits.h> but <stdint.h>.)
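Here is a minimal sketch of a pre-allocation check enforcing both limits (the helper name "alloc_array" and its interface are mine, not from any standard):

    #include <stddef.h>
    #include <stdint.h>   /* PTRDIFF_MAX */
    #include <stdlib.h>

    /* Allocate room for "nelem" elements of "esize" bytes each;
     * return NULL if either limit would be violated (or on OOM). */
    void *alloc_array(size_t nelem, size_t esize)
    {
        if (esize == 0)
            return NULL;                 /* reject degenerate element size */
        if (nelem > (size_t)-1 / esize)  /* limit 1: nelem * esize would
                                            overflow size_t */
            return NULL;
        if (nelem > PTRDIFF_MAX)         /* limit 2: pointer subtraction
                                            would be UB */
            return NULL;
        return malloc(nelem * esize);
    }

Called as "alloc_array(sz, sizeof(element_type))", a non-NULL result guarantees that "arr + sz - arr" is well-defined and yields "sz".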

In general, neither condition implies the other: on a typical 64-bit implementation (SIZE_MAX == 2^64 - 1, PTRDIFF_MAX == 2^63 - 1), the PTRDIFF_MAX limit is the binding one for 1-byte elements, while the division-based limit is the binding one for, say, 4-byte elements. However, once you have enforced both, you can store the element count as either "size_t" or "ptrdiff_t".