This does not contradict what I said at all. But the initial comment was not clear enough: it spoke about just converting a number to an integer, without specifying how the numeric value was converted.
Everything said in Lua about the distinction between floats and integers is not necessary at all: it is just an internal implementation detail. The internal datatype, or "subtype", used does not matter; Lua, like JavaScript, should only handle "numbers", which is the only datatype at the interface level.
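Note that in current Lua (5.3 and later) the subtype does leak through the interface. Here is a minimal sketch, assuming the standard Lua 5.3+ C API and headers (the inspect() helper is just for illustration): lua_tonumber() is the unrestricted "numbers only" view, while lua_isinteger()/lua_tointegerx() are the restricted entries that expose the internal subtype and can fail:

```c
#include <stdio.h>
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>

static void inspect(lua_State *L, const char *expr) {
    luaL_dostring(L, expr);                /* leaves the returned value on the stack */
    int isnum;
    lua_Integer i = lua_tointegerx(L, -1, &isnum);  /* restricted entry: may fail */
    printf("%-12s number=%-4g isinteger=%d tointeger=%s\n",
           expr,
           (double)lua_tonumber(L, -1),    /* unrestricted "number" view */
           lua_isinteger(L, -1),           /* the subtype leaks here */
           isnum ? "ok" : "FAILS");
    (void)i;                               /* value itself unused in this demo */
    lua_pop(L, 1);
}

int main(void) {
    lua_State *L = luaL_newstate();
    luaL_openlibs(L);
    inspect(L, "return 3");    /* integer subtype */
    inspect(L, "return 3.0");  /* float subtype, same mathematical value */
    inspect(L, "return 3.5");  /* float with no exact integer equivalent */
    lua_close(L);
    return 0;
}
```

Running this shows that 3 and 3.0 carry the same mathematical value but different subtypes, and that the restricted conversion fails for 3.5: the subtype is an observable part of the interface, not just an internal detail.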
The same should be true of the Lua C API: it should be able to handle Lua "numbers" in all cases, and should clearly expose special API entries, with documented limitations, for the optimized case (the lua_tointegerx() sketch above has exactly that shape). The same holds for JNI in Java, which needs to expose an unrestricted interface alongside a restricted interface for the "optimized" case. This also applies to the representation of strings, either as unrestricted arrays of unsigned 16-bit values (not restricted to valid UTF-16), or as 8-bit strings using a "modified UTF-8" encoding. That encoding is needed to represent arbitrary 16-bit code units in JNI and in the internal representation of classes compiled to bytecode, because arbitrary 16-bit code units are forbidden in standard UTF-16 and are therefore not representable in standard UTF-8. The "modified UTF-8" of Java/JNI also uses 6 bytes instead of 4 for code points in supplementary planes, because it does not require surrogates to be correctly paired as in standard UTF-16, and because JNI actually represents the 16-bit code units, not really the code points.
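To make that encoding concrete, here is a small sketch of those rules (an illustrative mutf8_encode_unit() helper, not JNI's actual implementation): every 16-bit code unit is encoded on its own, so unpaired surrogates are representable, U+0000 becomes the two-byte sequence C0 80, and a supplementary code point, stored as a surrogate pair, costs 3+3 = 6 bytes where standard UTF-8 would use 4:

```c
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

/* Encode one 16-bit code unit; returns the number of bytes written (1..3). */
static size_t mutf8_encode_unit(uint16_t u, unsigned char out[3]) {
    if (u != 0 && u < 0x80) {            /* ASCII, except NUL */
        out[0] = (unsigned char)u;
        return 1;
    } else if (u < 0x800) {              /* includes U+0000 -> C0 80 */
        out[0] = 0xC0 | (u >> 6);
        out[1] = 0x80 | (u & 0x3F);
        return 2;
    } else {                             /* includes lone surrogates */
        out[0] = 0xE0 | (u >> 12);
        out[1] = 0x80 | ((u >> 6) & 0x3F);
        out[2] = 0x80 | (u & 0x3F);
        return 3;
    }
}

int main(void) {
    /* U+1F600 as a UTF-16 surrogate pair, plus an embedded NUL. */
    const uint16_t units[] = { 0xD83D, 0xDE00, 0x0000 };
    unsigned char buf[3];
    for (size_t i = 0; i < 3; i++) {
        size_t n = mutf8_encode_unit(units[i], buf);
        printf("U+%04X ->", (unsigned)units[i]);
        for (size_t j = 0; j < n; j++) printf(" %02X", buf[j]);
        printf("\n");
    }
    /* Prints 3 bytes per surrogate (6 total for U+1F600, vs 4 in standard
       UTF-8) and C0 80 for the NUL, as modified UTF-8 requires. */
    return 0;
}
```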
What has long been discussed for Java also applies to other languages (including C/C++, C#, JavaScript, Lua...) whose datatypes are less restricted than the "standard" subtypes they can be used for and are specially optimized for: optimizations should still take into account the "special" values that do not follow the restrictions of the "standard" subtypes, and it is important to identify those cases. Note that for floating point, Lua is supposed to represent NaN as a single value, but IEEE floats/doubles actually have multiple bit patterns for NaNs, some signaling and some quiet, with additional payload bits in the mantissa. Other details include the "denormal" values, which have the lowest negative exponent but limited precision, and the special case of "signed zeroes", for which Lua is not clear about whether they should be unified into a single unsigned zero value of the Lua "number" type.

Lua is also not clear about the exposed precision of numbers and about how Lua numbers are converted from/to native floats/doubles, which have multiple possible representations: not just those of the IEEE standard, or those of the float/double/long double datatypes in C/C++. That is itself a separate complex issue, since it also requires a special conversion mechanism between those types and the hardware-native types used by an FPU, a GPU, some other external device, or the "vector" computing units integrated in the CPU or in a separate device, which may be in the same host or accessed virtually via some networking interface.
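Those special values are easy to exhibit from C. A minimal sketch, assuming IEEE-754 binary64 doubles (the usual lua_Number) and a C99 compiler; the bits() helper is just for illustration:

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <math.h>
#include <float.h>

static uint64_t bits(double d) {       /* view the raw IEEE representation */
    uint64_t u;
    memcpy(&u, &d, sizeof u);
    return u;
}

int main(void) {
    double zero = 0.0;
    double qnan = zero / zero;         /* a quiet NaN */
    double pnan = nan("12345");        /* a NaN with a different payload */
    printf("qnan bits=%016llX  pnan bits=%016llX  same bits? %d\n",
           (unsigned long long)bits(qnan), (unsigned long long)bits(pnan),
           bits(qnan) == bits(pnan));  /* usually different payloads */

    double pz = 0.0, nz = -0.0;        /* signed zeroes */
    printf("+0.0 == -0.0 ? %d  bits %016llX vs %016llX  1/x: %g vs %g\n",
           pz == nz, (unsigned long long)bits(pz),
           (unsigned long long)bits(nz), 1.0 / pz, 1.0 / nz);

    double denorm = DBL_MIN / 1024.0;  /* a subnormal ("denormal") value */
    printf("DBL_MIN=%g  denorm=%g  still nonzero? %d\n",
           DBL_MIN, denorm, denorm != 0.0);
    return 0;
}
```

On a typical platform the two NaNs carry different payload bits, +0.0 and -0.0 compare equal yet have distinct bit patterns and produce +inf and -inf as reciprocals, and the denormal stays nonzero well below DBL_MIN.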
Lua is also not clear about the representation of its "nil" value, even though it says nil is a separate datatype: is it really unique, with a single instance, or does it depend on the representation of NULL pointers in C/C++? I am not even convinced that "nil" in Lua is represented by a NULL pointer in C/C++; it may be a valid non-NULL pointer to an object, but then there is a need to convert NULL pointers used in the internal C/C++ implementation into a valid "nil" object reference in Lua.
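This particular question can at least be probed from the C side. A minimal sketch using the standard Lua C API: in the reference implementation nil is its own type tag, and a NULL pointer pushed as light userdata is not nil (it is even truthy):

```c
#include <stdio.h>
#include <stddef.h>
#include <lua.h>
#include <lauxlib.h>

int main(void) {
    lua_State *L = luaL_newstate();
    lua_pushnil(L);                     /* the one-and-only nil value */
    lua_pushlightuserdata(L, NULL);     /* a NULL pointer, wrapped as a value */
    printf("nil:  type=%s  isnil=%d\n",
           luaL_typename(L, -2), lua_isnil(L, -2));
    printf("NULL: type=%s  isnil=%d  toboolean=%d\n",
           luaL_typename(L, -1), lua_isnil(L, -1), lua_toboolean(L, -1));
    lua_close(L);
    return 0;
}
```

So the conversion the author worries about is real: a NULL pointer coming from C does not automatically become "nil"; the embedding code must decide explicitly which of the two values to push.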
The basic datatypes in Lua are clearly underspecified (the same is also true of C/C++!). We do not clearly know their limits, and Lua (unlike Java) exposes no clear way for a program to know these limits and adopt a predictable behavior. This makes it difficult to assert that Lua programs are really "portable" across platforms/implementations, and that Lua implementations are really "conforming". This is a very fuzzy area which shows that Lua is still a language in development. Note that some Java programmers did not like these restrictions (and this caused conflicts between them), and the same applies to JavaScript/ECMAScript: various programmers want to sacrifice portability to increase performance. This is what happened to C/C++, and what made them the most unsafe programming languages, with critical security bugs whose growing cost to final users is now enormous: a nightmare for the world and for our public and personal freedom and security.
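For what it is worth, Lua 5.3+ does expose a few of these limits (math.maxinteger/math.mininteger on the Lua side, LUA_MAXINTEGER/LUA_MININTEGER and the lua_Integer/lua_Number typedefs in luaconf.h on the C side), but these are build-time configuration choices rather than specified guarantees, so a portable program is reduced to probing them itself. A minimal probe sketch, assuming Lua 5.3+ headers:

```c
#include <stdio.h>
#include <lua.h>        /* pulls in luaconf.h with the build-time limits */

int main(void) {
    printf("lua_Integer: %zu bytes, range [%lld, %lld]\n",
           sizeof(lua_Integer),
           (long long)LUA_MININTEGER, (long long)LUA_MAXINTEGER);

    /* Probe the precision of lua_Number at runtime: halve eps until
       1 + eps/2 is no longer distinguishable from 1. */
    lua_Number one = 1, eps = 1;
    while ((lua_Number)(one + eps / 2) != one) eps /= 2;
    printf("lua_Number:  %zu bytes, machine epsilon ~ %g\n",
           sizeof(lua_Number), (double)eps);
    return 0;
}
```

The point is that these values can change from one build of Lua to another, which is exactly what makes the "portable" and "conforming" claims hard to assert.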
Micro-optimizations do not pay at all: there is nothing to win (except for a limited short time), but everything to lose. We must always be strict on datatype specifications, even if this causes some minor performance penalty; that small penalty is rapidly compensated, without doing anything else, in just a few months or even less, by the rapid evolution of technologies and the rapid reduction of their cost.