lua-users home
lua-l archive


Hello Mouse.

Mouse wrote in
 |>> No, LUA_MAXUNSIGNED is (-0x8000000000000000)
 |> Where did this definition come from?
 |Um, looking at that now it's probably wrong.  I typed that into the
 |email; it probably is the min signed, not the max unsigned.
 |I clearly should stop trying answer email when I'm rushed.
 |Here's the diff, copied into the email mechanically rather than
 |manually.  In particular, I find
 |#define LUA_MAXINTEGER                0x7fffffffffffffff
 |#define LUA_MININTEGER                (-0x8000000000000000)
 |#define LUA_MAXUNSIGNED               0xffffffffffffffff

How can that work without the ull and ll suffixes?  With gcc
2.95(.2?) it definitely required macros like "__extension__ X ## ull"
to get it done.  It really does work??  Also the typedef itself
required __extension__ .. in 1999?  (I no longer have the initial
CVS; I once converted it to Subversion, then Mercurial iirc,
then git; somewhere in between...)

Note I personally never used such constants in preprocessor
statements; only the real compiler had to work with them.

P.S.: I also still have code which claims "I do not trust that the
standard "u" type suffix automatically scales" (i.e. that 10u
automatically expands to the necessary type).  When I do not have my
special 64-bit-constant-creation macro at hand, I do things like
"u64 const fact100 = S(u64,0x3B9ACA00u) * 10u", for example.
At some time, in one compiler version, it only worked like that.
(That the above works for you is certainly only because you are not
using -W -Wall -pedantic and such.)

P.P.S.: and ll/ull is not true for _MSC_VER either, where the
type is __int64.

|Der Kragenbaer,                The moon bear,
|der holt sich munter           he cheerfully and one by one
|einen nach dem anderen runter  wa.ks himself off
|(By Robert Gernhardt)