

On Thu, Oct 23, 2014 at 03:58:54PM -0400, Sean Conner wrote:
> It was thus said that the Great Roberto Ierusalimschy once stated:
> > 
> > > [...]
> > > lstrlib.c, line 1142.  Change:
> > >    buff[islittle ? i : size - 1 - i] = (n & MC);
> > > To:
> > >    buff[islittle ? i : size - 1 - i] = (char)(n & MC);
> > > Explanation:  Prevents compiler warning about possible loss of data.
> > 
> > This compiler seems quite dumb :-) How can (n & 0xFF) lose data??
> 
>   Being charitable [1], *technically* you are potentially losing
> information---values 128 to 255 may become -128 to -1, if chars are signed
> [2].

Yeah. It's not the bitwise operation, it's the assignment. The statement
invokes implementation-defined behavior because it's a narrowing conversion
to a signed type. Microsoft defines the behavior of such demotions here:
http://msdn.microsoft.com/en-us/library/0eex498h.aspx

So at least for Visual Studio (2013) the expression is perfectly fine. And
it will likely be on almost any two's complement machine where char is
signed.

But IMO the code would be cleaner if buff was unsigned. Then the behavior
wouldn't be implementation-dependent. I always wished luaL_prepbuff would
return a void pointer so I wouldn't have to cast it to an unsigned char
pointer (I hate casts), but it's not the kind of thing worth breaking code
over.

>   -spc (Or it could be an utterly stupid compiler)
> 
> [1]	Like Microsoft needs any charity
> 
> [2]	C standard leaves the signness [3] of a bare 'char' declaration up
> 	to the implementation---it can be either signed or unsigned.
> [3]	Is that even a word?

I think you meant "signedness", which is the word used in the standards
documents.