
David Jones wrote:
[...]
> I think you're right; this is the only safe way.  I'm amazed that this
> has gone unnoticed.  Of course, even with this solution it's _possible_
> to construct a perverse C implementation where it would be undefined
> behaviour (if INT_MAX was large enough to be outside the range of double).

Actually, talking to someone who knows more about floating-point
representations than I do: the Evil Hack to convert a double to an int, viz.:

#define lua_number2int(i,d) \
  { volatile union luai_Cast u; u.l_d = (d) + 6755399441055744.0; (i) = u.l_l; }

...should produce the 'right' results --- i.e. (double)-1 == (int)-1 ==
(unsigned int)0xFFFFFFFF --- everywhere the hack works at all, because the
hack relies on a particular binary representation of doubles. It's only when
you do the conversion correctly (i.e. without the hack) that it starts to
fail. You're on a Mac, right, Brian? If so, you won't be using the hack.
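
For reference, here's a minimal, self-contained sketch of the same trick
(assuming IEEE 754 doubles and a little-endian layout where the low 32 bits
of the double come first in memory; the function name is just for
illustration):

#include <stdio.h>
#include <stdint.h>

/* 6755399441055744.0 == 2^52 + 2^51; adding it pushes the (rounded)
   integer part of d into the low 32 bits of the mantissa. */
static int32_t number2int(double d)
{
  union { double d; int32_t i[2]; } u;
  u.d = d + 6755399441055744.0;
  return u.i[0];  /* low word on a little-endian machine */
}

int main(void)
{
  printf("%d\n", number2int(-1.0));  /* prints -1 */
  printf("%d\n", number2int(3.7));   /* prints 4 (rounds to nearest) */
  return 0;
}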

Actually, I think a better way to do things would be to allow lua_tointeger()
to fail when the conversion won't work; that is, when the value is outside
INT_MIN..INT_MAX or not a valid number (such as NaN or +Inf or -Inf). It's a
pity that it's too late to change this now, and it'd probably be too slow
anyway...
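
A checked conversion along those lines might look something like this (a
hypothetical helper, not part of the actual Lua API; it assumes INT_MIN and
INT_MAX are exactly representable as doubles, which holds for the usual
32-bit int):

#include <limits.h>
#include <math.h>

/* Returns 1 and stores the result in *out if d can be represented as an
   int; returns 0 for NaN, +/-Inf or out-of-range values. */
static int checked_number2int(double d, int *out)
{
  if (!isfinite(d))
    return 0;                         /* NaN or +/-Inf */
  if (d < (double)INT_MIN || d > (double)INT_MAX)
    return 0;                         /* outside INT_MIN..INT_MAX */
  *out = (int)d;                      /* now well-defined */
  return 1;
}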

-- 
╭─┈David Given┈──McQ─╮ "A line dancer near a graduated cylinder, the
│┈ dg@cowlark.com┈┈┈┈│ blithe spirit inside some wheelbarrow, and a tomato
│┈(dg@tao-group.com)┈│ are what made America great!" --- received via spam
╰─┈www.cowlark.com┈──╯
