
On 04/28/2015 02:57 PM, Chris Emerson wrote:
On Fri, Apr 24, 2015 at 05:44:47PM -0400, Sean Conner wrote:
Makes sense when you think about it.  Even more interesting is this bit of
code:
[code snipped: converting 2^33 to 32-bit int / unsigned int]
(Again, assumption is sizeof(int) == sizeof(float).)

The results here have me perplexed because the output is not what I
expected:

	8589934592.000000 -2147483648 0

Truthfully, I'm not sure what I expected, but two wildly different answers
isn't it; perhaps INT_MAX and UINT_MAX but not INT_MIN and 0! 2^31 gave the
expected answers, and 2^32 gave -2147483648 0.
In C, converting a float type to an integer is undefined behaviour if the
value can't be represented (after truncating fractional bits).  So any
result or behaviour is "correct".  :-)

Chris

'Behavior' being the key word here: we got a nice hardware trap on this conversion on the P2020 processor.
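
For anyone reading along, here is a minimal sketch of the kind of conversion under
discussion. This is my reconstruction under the sizeof(int) == sizeof(float)
assumption above, not Sean's original snippet, but it reproduces the reported
output on a typical x86 build:

	/* Converting a float holding 2^33 to 32-bit signed and unsigned ints.
	   Per C11 6.3.1.4, the conversion is undefined behaviour when the
	   (truncated) value cannot be represented in the target type, so the
	   two integer results below can be anything -- or a trap. */
	#include <stdio.h>

	int main(void)
	{
	    float        f = 8589934592.0f;   /* 2^33, exactly representable as float */
	    int          i = (int)f;          /* UB: 2^33 > INT_MAX  */
	    unsigned int u = (unsigned int)f; /* UB: 2^33 > UINT_MAX */

	    printf("%f %d %u\n", f, i, u);
	    return 0;
	}

As for why x86 prints -2147483648: the cvttss2si instruction returns the
"integer indefinite" value 0x80000000 when the result is out of range, which is
presumably where the INT_MIN comes from; the 0 for the unsigned case likely
falls out of whatever different instruction sequence the compiler emits there.
Either way, none of it is something portable code can rely on.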
--
Thomas