lua-l archive


>>> - Avoid implicit "multi-step" type conversions (e.g., float ->
>>> unsigned char).
>> What's multi-step about that?  [...]
> The example I gave with floats was not a good one, sorry.  A concrete
> example we had in the past: (signed)char -> unsigned int.

> We can think like this:

>   (signed char)(-3)  ->  (int)(-3)  ->  (unsigned int)4294967293
>   (signed char)(-3)  ->  (unsigned char)253  ->  (unsigned int)253

> The first choice is the correct one, but some compilers did the
> second.

I note you put "signed" in parens (I'm talking about the example text,
not the two conversion lines).  If this was plain char, not
explicitly-signed char, could it maybe have been a case of a system
where plain char was unsigned?  Plain char always behaves like either
signed char or unsigned char, but which one is implementation-defined,
and if it's unsigned the latter conversion _is_ the correct one, though
in that case it would be more accurately written

    (char)(-3) -> (char)253 -> (unsigned int)253

Not everyone realizes that "char" can be either signed or unsigned
depending on the implementation, and your writing "(signed)char" makes
it look as though you might have meant "char, which, since there was no
unsigned modifier, means signed char".

> Just in case, it is safer to explicitly cast to int before casting to
> unsigned int.

If an explicit intermediate cast to int changed anything (or if it was
explicitly-signed char), then, yes, it was a compiler bug.  At least
from a C99 perspective; it could also have been just a compiler that
was using old rules for converting between signed types and unsigned
types - I'm not sure what the old rules were, but I know they were
different.

/~\ The ASCII				  Mouse
\ / Ribbon Campaign
 X  Against HTML		mouse@rodents-montreal.org
/ \ Email!	     7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B