On Nov 17, 2014, at 4:42 PM, Roberto Ierusalimschy <roberto@inf.puc-rio.br> wrote:

>> I kinda-sorta get that, but if you follow your logic then I don’t
>> really see them as “sub-types” but fully distinguished types.
> 
> I do not see why. What is the difference between these types? The value
> is what is important, not the type. If you have a number with a value
> of 1, it does not matter whether it is an integer or a float (in almost
> all contexts). If you have a number with a value of 1.4, it also does
> not matter whether it is an integer or a float :)

This is my basic logic for UTF-8 correctness too. I don't care about the type "string"; every bounded sequence of characters exists. What matters to me is that UTF-8 operations will not produce nonsensical results.
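
For concreteness, a small sketch against the utf8 library in the Lua 5.3 betas current as of this message (so the exact behavior here is hedged on those betas). Any byte string is a legal value; the UTF-8 operations are the layer that refuses to produce nonsense:

    local good = "héllo"        -- valid UTF-8
    local bad  = "h\xC3llo"     -- 0xC3 lead byte with no continuation byte
    print(utf8.len(good))       --> 5
    print(utf8.len(bad))        --> nil    2   (position of first invalid byte)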

From the point of view of a number pyramid weenie (think Scheme's exact/inexact numeric tower), there's a distinction between (exact 1.0) and (inexact 1.0). You can take some circuitous route of inexact calculations and end up with the IEEE bit pattern for 1.0, but the result still can't be exactly equal to a precise 1.0.
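
Lua 5.3 makes the opposite choice: the subtype stays observable through math.type, but == compares mathematical value, so no exactness survives the trip. A sketch (same beta caveat as above):

    -- a chain of float operations that lands exactly on the IEEE
    -- bit pattern for 1.0; every step is exactly representable
    local x = 0.25 + 0.25 + 0.5
    print(math.type(x))         --> float
    print(x == 1)               --> true (an exactness tower would say no)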

This is not the point of view of vernacular C, which has no fancy numerical-analysis model (where perhaps floats represent a range). As you and lhf described, Lua tends to conform to the shape of its C API. At the language level, C barely distinguishes 1.0 from 1: the usual arithmetic conversions make them compare equal, certainly on modern C systems.
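
One last sketch of that value-first behavior (again hedged on the 5.3 betas): the integer/float subtype only surfaces when a conversion would have to lose information, much as in C:

    print(1 == 1.0)                  --> true
    print(string.format("%d", 2.0))  --> 2   (float with an exact integer value)
    print(math.tointeger(2.5))       --> nil (no lossless conversion exists)
    -- string.format("%d", 2.5) raises "number has no integer representation"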

Jay