Thank you for your comments.
Regarding automatic coercion between strings and numbers being a bad (‘not
good’) thing: then I would certainly be in the minority, among those who
strongly disagree. For me this coercion is one of the many reasons to
like Lua.
My reasoning is this:
From a human point of view, numbers are (and have always been) nothing more
than strings of digits, just as words are strings of letters. Think of how
you write them on paper, or how you type them.
Only for computing convenience (given today’s technologies) have we made a
‘special arrangement’ for numbers to be treated differently internally.
But, in an ideal world, a programming language should hide machine
requirements from the user as much as is feasible. There is nothing wrong
with thinking of numbers as strings that just *happen* to contain mostly
digits. So, in my view, deprecating this ‘bad thing’ would be an utter
mistake. Lua is already headed in the right direction toward that ideal;
why go back?
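For the record, here is a minimal sketch of the coercion in question, as it behaves in standard Lua 5.x (exact numeric formatting of printed results can vary slightly between Lua versions):

```lua
-- A string that looks like a number participates in arithmetic:
print("10" + 1)         -- the numeric result 11

-- And a number participates in string concatenation:
print("value: " .. 42)  -- the string "value: 42"

-- The coercion is not blind: a non-numeric string is still an error,
-- which pcall catches here instead of aborting the program.
local ok = pcall(function() return "ten" + 1 end)
print(ok)               -- false
```

Note that the coercion only applies where the intent is unambiguous (arithmetic operators expect numbers, `..` expects strings); it does not silently turn arbitrary text into data.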