lua-users home
lua-l archive



From memory, on macOS (building with clang) I have always observed (and have assumed I can take as a given) that a float division by zero gives me an infinity (or NaN for 0/0), while an integer division by zero raises an error.

Maybe it’s worth adding this as a “semantic definition” for Lua arithmetic so that it is unambiguous? That would leave it up to the implementer (perhaps even by means of optional macros that are compiled away by default) to ensure that the semantics are followed.

That would allow builds made for systems like Bochs, Valgrind, and so on to handle these cases consistently with real-world builds.


On 17 Jun 2019, at 14:39, Roberto Ierusalimschy <roberto@inf.puc-rio.br> wrote:

>> The well-defined semantics of division by zero are part of Annex F of the C
>> specification, which is an optional thing that compilers are not required
>> to implement. In the absence of Annex F, division by zero has undefined
>> behavior. C++ doesn't have anything like Annex F and division by zero is
>> stated to be undefined (although they mention in a non-normative comment
>> that most implementations allow the behavior to be controlled using library
>> functions).
>> 
>> A division by zero could result in a trap, exception, or signal such as
>> SIGFPE. The convention has been to move away from those behaviors because
>> it's convenient to use 1.0/0 as a way to get infinity, but unless Lua's
>> documentation specifically requires that you don't do that, nothing stops
>> you from passing a compiler flag or calling a library function to change
>> that.
> 
> That part I know. C is not required to implement IEEE arithmetic (just as it
> is not required to use two's-complement integers). But repeating my original
> question: does anyone know of real platforms where this may cause real
> problems? Lua has always allowed division by 0 and we have never received any
> report about problems there.
> 
> -- Roberto
>