> On 11/25/18, Philippe Verdy <verdy_p@wanadoo.fr> wrote:
> > Note: math.floor() is not required to return a native binary integer. It
> > may return a variant type using integers for some ranges, and doubles
> > outside the range. So even in this case the inference would not help
> > eliminate the necessary tests on the effective type of the variant... The
> math.floor can return literally anything. It can be redefined.
Here I was speaking about the standard library function, without any redefinition: it takes a 'number' (usually a double when Lua itself is compiled) and returns a 'number', which is not limited to a 53-bit integer.
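This is easy to observe in Lua 5.3+, where math.type exposes the subtype of a number. A minimal sketch:

```lua
-- math.floor returns an integer-subtype number when the result fits
-- in a lua_Integer (64-bit by default), and keeps the float subtype
-- when the value is out of integer range.
print(math.type(math.floor(2.7)))     --> integer
print(math.type(math.floor(2.0^80))) --> float (2^80 exceeds the 64-bit range)
```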
So the function computing the Julian day MUST still contain code that tests the value to make sure it is in range. Technically the given sample function was NOT correct, since it returns an arbitrary value otherwise. But it will be correct if "number" is a 64-bit double and the function first checks that the parameter is within +/-2^31 before performing all the rest in the "if" branch (the "else" branch may just return nil to indicate the error, or call error()). Then the compiler knows that the given number is neither NaN nor infinite, and the standard function math.floor (which the compiler can know is not overloaded) is guaranteed to return a 32-bit integer. It can then optimize the arithmetic code that follows (it still needs to check that the arithmetic operators are not overloaded either).
Operators can be redefined too! The code "1+1" in Lua is not even guaranteed to return 2 if "+" is redefined (in contexts where the metatable contains an overload).
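For reference, the overload mechanism is the __add metamethod. One caveat worth noting: in stock Lua the VM adds two plain numbers directly and only consults __add when an operand is not a number, so a literal 1+1 still yields 2; the redefinition hazard arises for non-number operands. A minimal sketch with tables:

```lua
-- "+" dispatches to the __add metamethod when an operand is not a number.
local Vec = {}
Vec.__add = function(u, v)
  return setmetatable({ x = u.x + v.x }, Vec)
end

local a = setmetatable({ x = 1 }, Vec)
local b = setmetatable({ x = 2 }, Vec)
local c = a + b  -- resolved through Vec.__add, not native addition
print(c.x)       --> 3
```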
In summary, the compiler needs to check the environment before doing any kind of type inference. It is easy to generate two compiled versions of the code depending on whether the environment uses the standard library; that is not so complex, because it is determined only by the standard Lua language semantics. Type inference is more complex: tracing the possible value ranges is much harder, as it depends on pre-evaluating all possible limits!
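The "two compiled versions" idea can be emulated at the library level: cache the stock function when the chunk loads and take the specialized path only while the environment is unmodified. A hypothetical sketch (fast_floor is an illustrative name, not from this thread):

```lua
-- Cache the standard math.floor at load time.
local stock_floor = math.floor

local function fast_floor(x)
  if math.floor == stock_floor then
    -- "Compiled for the standard library": safe to assume stock semantics.
    return stock_floor(x)
  end
  -- Fallback: the environment replaced math.floor; defer to it.
  return math.floor(x)
end
```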
A simple check like "if (d>>0 == d) { ... } else return nil" within the Lua function to compile would do the trick, letting the compiler determine that d is effectively an integer in the correct range for integers. But the actual test **in Lua** is a bit more complex, because "local a = jd + 32044" can overflow in 32-bit integers, just like the subexpression "(4 * a + 3)" used in the next statement of the sample: a 32-bit check is not enough to ensure the result would be correct. The result of the function is actually correct only if "jd" is in the range of 29-bit integers, so the initial test should be
"if (d>>3<<3 == d) { ... } else return nil" ...
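In Lua 5.3+ the guard can also be written with math.type and an explicit range comparison instead of bit tricks. This is only a sketch: the bound 2^28 stands in for the "29-bit" range argued above, and everything after the first two statements of the sample conversion is elided:

```lua
local JD_MAX = 1 << 28  -- illustrative 29-bit signed bound, per the argument above

local function checked_jd_to_date(jd)
  -- Reject floats (including NaN and infinities) and out-of-range
  -- integers up front, so the arithmetic below cannot overflow.
  if math.type(jd) ~= "integer" or jd < -JD_MAX or jd > JD_MAX then
    return nil
  end
  local a = jd + 32044             -- first statement of the sample
  local b = (4 * a + 3) // 146097  -- its next subexpression
  -- ... remainder of the Julian-day conversion elided ...
  return a, b
end
```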
So the compiler would need to track the value ranges of all constants and variables to make the inference. Type inference alone is not enough!
> In the case where it _isn't_, this is pointless pedantry since it's
> not unreasonable to expect the compiler to be aware of (idioms
> involving) the stdlib. (And vice versa.)
So if you remove parts of its core features, you're creating another language which is not Lua. The compiler you're creating is not a compiler for Lua...
> > Note that the compiler must still be correct semantically (notably for
> > handling exceptions/errors and pcall() correctly).
> Pray tell, what does this entail under the pretext of _changing the
> semantics_ to make things easier for the compiler?
Changing the semantics of the language means you're creating a compiler for another language. Lua cannot work at all without exception/error handling.