lua-users home
lua-l archive


2013/9/8 Tim Hill <>:

> Or, to flip the question around the other way, why DOES tostring() use
> %.14g? Was there some issue that caused a problem if it was more than 14?
> Better compatibility across platforms? Or just an arbitrary choice?

Can't speak for the Lua team, of course, but one does that sort of thing
in order to avoid printing spurious digits. According to Henrici's statistical
model of roundoff propagation, the accumulated error is roughly the square
root of the number of arithmetic operations made, in units of the last digit.
So if all your operations are made in 16-digit arithmetic, starting from
correctly rounded values, you expect to have 14 digits correct on
results involving fewer than 10000 operations, whereas to get 15
digits correct, you would need fewer than 100 operations. Roughly
speaking in both cases, of course. Deciding that 100 operations could
easily arise in practical Lua applications but 10000 seldom would is a
value judgment, true, but not an arbitrary choice.
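As a quick illustration of that digit count (my sketch, not part of the original argument): summing 0.1 ten times already perturbs the last binary digit, since 0.1 is not exactly representable, yet 14 significant digits still round the result cleanly:

```lua
-- Accumulate rounding error by repeatedly adding 0.1,
-- which has no exact binary floating-point representation.
local s = 0
for i = 1, 10 do
  s = s + 0.1
end

print(string.format("%.17g", s))  -- 0.99999999999999989 (spurious digits visible)
print(string.format("%.14g", s))  -- 1 (rounds cleanly at 14 digits)
```

Here 10 operations lose about one digit (sqrt(10) ≈ 3 units in the last place), so anything up to 15 printed digits would still be clean; at 17 digits the noise shows.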

Do we really want "tostring(0.1)" to return "0.10000000000000001"?
No? Then %.14g is not too unreasonable.
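A minimal check of that example (assuming Lua of that era, where tostring formats numbers with %.14g):

```lua
print(tostring(0.1))                 -- 0.1
print(string.format("%.14g", 0.1))  -- 0.1 (what tostring produces)
print(string.format("%.17g", 0.1))  -- 0.10000000000000001 (the nearest double)
```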