On 07/07/2018 18:18, Axel Kittenberger wrote:

[snip]

My guess is you are severely underestimating the complexity of Unicode and
think of it as ASCII with extras. It's not. There are so many (strange)
features in Unicode that, when used in general programming, would go haywire.
Unicode is designed as a typesetting tool, not as a programming tool.

[snip]

This is spot-on: "Unicode is designed as a typesetting tool, not as a programming tool."

I was going to post a message saying the exact same thing.

I remember when I was first learning to program (self-taught, with the Italian reference manual of a second-hand TI99/4a home computer): it stressed the importance of avoiding visually ambiguous identifiers, because some characters could be mistaken for one another (e.g. '1' with 'l', 'o' with 'O' and with '0').
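For instance, a contrived Lua snippet (the identifiers are mine, purely for illustration) showing the kind of mix-up those manuals warned about:

    -- 'l' (lowercase L) and '1' (the digit one) are easy to confuse:
    local l1 = 100   -- lowercase L followed by the digit one
    local ll = 200   -- two lowercase Ls

    print(l1 + 1)    --> 101
    print(ll + 1)    --> 201  (in many fonts both calls look the same)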

Allowing Unicode in programs outside string literals would make bugs skyrocket: how many code points are visually similar to, say, a zero '0' or a lowercase 'x'? Not to mention that changing your editor's font will change the appearance of *all* those "zero-like" characters. Nightmarish! And for no real benefit.
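The same confusion is easy to demonstrate even inside strings, where Lua does accept arbitrary UTF-8. A minimal sketch (assuming Lua 5.3+ for the \u{} escape):

    -- Latin 'o' (U+006F) and Cyrillic 'о' (U+043E) render identically
    -- in most fonts, yet they are different strings with different bytes:
    local latin    = "o"           -- 1 byte in UTF-8
    local cyrillic = "\u{043E}"    -- 2 bytes in UTF-8

    print(latin == cyrillic)       --> false
    print(#latin, #cyrillic)       --> 1    2

Now imagine that same pair occurring in two identifiers: the code would compile cleanly, and the bug would stay invisible on screen.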