- Subject: Re: undefined conversion for numeric indexes
- From: David Jones <djones@...>
- Date: Tue, 18 Jun 2002 14:59:03 +0000
In message <200206181402.LAA04323@tecgraf.puc-rio.br>, Luiz Henrique de
Figueiredo writes:
> >In my lua 4.0 the following code crashes (with a floating point exception):
> >
> >a={}
> >a[2^31+1]=1
>
> What platform? It works ok in our Linux machines.
On a platform that raises a floating point exception when trying
to convert a floating point number larger than INT_MAX to an
integer type. Linux typically doesn't have any floating point
exceptions enabled by default, does it? Anyway, I think trying
to find out exactly which platforms it crashes on is a sideshow;
the fact is that converting a float f to an int is undefined
when f >= INT_MAX + 1 (or f < INT_MIN).
> Have you tried 5.0w0?
I'm afraid I have not; I'm not keen on using anything other than the
official releases.
Cheers,
drj