On May 1, 2013, at 10:00 AM, Roberto Ierusalimschy wrote:

> I already mentioned one: pointer difference and ptrdiff_t. According to
> the standard, they are mostly useless:
> 
>  When two pointers are subtracted, both shall point to elements of
>  the same array object, or one past the last element of the array
>  object; the result is the difference of the subscripts of the two
>  array elements. The size of the result is implementation-defined,
>  and its type (a signed integer type) is ptrdiff_t defined in the
>  <stddef.h> header.  IF THE RESULT IS NOT REPRESENTABLE IN AN OBJECT OF
>  THAT TYPE, THE BEHAVIOR IS UNDEFINED.
> 
> (emphasis added.) Nothing in the standard connects the minimum size of
> ptrdiff_t with size_t. So, an arbitrary subtraction of pointers to
> an array larger than 64K can fail.  

Arrrgh. I did not draw that connection. I tend to carry around both p and i for p[i], so I never thought very much about ptrdiff_t.
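
To make the hazard concrete, here is a minimal sketch (my construction; it assumes an implementation whose malloc will actually hand back an object bigger than PTRDIFF_MAX bytes, which most will refuse to do):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t n = (size_t)PTRDIFF_MAX + 2;  /* more bytes than ptrdiff_t can count */
        char *p = malloc(n);
        if (p == NULL) return 0;             /* the likely outcome in practice */
        char *q = p + n;                     /* one past the end: perfectly legal */
        ptrdiff_t d = q - p;                 /* not representable: undefined behavior */
        printf("%td\n", d);
        free(p);
        return 0;
    }

Carrying p plus a size_t index never performs the subtraction at all, which is exactly why the habit protects you.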

To indulge in a little standards-archeology, I think there was a chain of decisions, each making sense in isolation. 

K&R C defined pointer arithmetic in terms of ints. By the time ANSI C came around, machines with 16-bit ints but bigger memory (I'm looking at you, 8086) were in vogue, and there was a clear need to standardize allocation of arrays with sizeof(something_t[n]) > INT_MAX, at *least* in the "large" model. Not everybody agreed that supporting sizeof(char[n]) > INT_MAX was necessary. What sense would a string longer than 64K (uh, 32K?) make? Nobody uses strings that long, at least not on an 8086.

So size_t was created to support compilation in the "large" model; nobody wanted to force sizeof(int)==4 just to be able to use it. Retaining natural ints does imply sizeof(FAR void*) > sizeof(int), but if you're addressing large memory you've already resigned yourself to that.
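
Nothing ties any of these sizes together, of course; a two-line program shows what a given model chose (on a flat 64-bit system you'll typically see 4/8/8/8):

    #include <stddef.h>
    #include <stdio.h>

    int main(void) {
        /* four independent implementation choices */
        printf("int %zu, void* %zu, size_t %zu, ptrdiff_t %zu\n",
               sizeof(int), sizeof(void *), sizeof(size_t), sizeof(ptrdiff_t));
        return 0;
    }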

But I bet nobody really wanted to force promotion of pointer arithmetic to longs, since pointer arithmetic is the core idea of C; who knows what would break. (A lot of functions without prototypes, I bet.) So ptrdiff_t was created to *allow* programs and compilers to agree to widen pointer arithmetic, without forcing existing implementations to do so. Implementations could just typedef it to int and not change the compiler at all.
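
From the program's side the opt-in is invisible, because the subtraction expression itself already has type ptrdiff_t. A sketch (mine, not anything from the rationale) of how the same old code reads under either choice:

    #include <stddef.h>

    long elements_between(double *p, double *q) {
        ptrdiff_t d = q - p;  /* the expression has type ptrdiff_t either way */
        return (long)d;       /* with ptrdiff_t typedef'd to int this behaves
                                 exactly as it did under K&R; with a widened
                                 ptrdiff_t, big differences now come out right */
    }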

BTW, I'm being a little unfair to the 8086; there were a lot of machines that couldn't represent a pointer in a natural int. A lot of 68000 compilers also had sizeof(int)==2 by default, but by the time I got there, they usually also had a large compilation model which did set sizeof(int)==sizeof(void*)==4. Plus you didn't have to deal with far vs. huge pointers. I am so happy to have lived on the Amiga until flat i386.

None of this deals with the signed overflow problem, which, as usual in C, causes flying monkeys. I don't think many people cared about overflow until it became an attack vector.
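
For the record, the classic shape of that attack vector is a length check written so that the check itself overflows, which a modern compiler is entitled to optimize away. A sketch, assuming non-negative lengths throughout:

    /* if len1 + len2 overflows, that's undefined behavior, and the
       compiler may assume it can't happen and drop the check entirely */
    int fits_badly(int len1, int len2, int max) {
        return len1 + len2 <= max;
    }

    /* rephrased so the overflow never occurs in the first place */
    int fits_safely(int len1, int len2, int max) {
        return len1 <= max && len2 <= max - len1;
    }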

Jay