>>> However the correct space character is 0x20 (32).
That is exactly my point. Who says that 0x20 is the correct space character? Answer: ASCII.
But Unicode defines more than one "correct" space character, precisely because it is Unicode, not ASCII. So the current Lua version does not support Unicode whitespace.
If, by luck, you avoid the "wrong" Unicode characters, Lua accepts the source, because UTF-8 was designed to be backward compatible with ASCII up to a point.
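As a quick illustration (a sketch, assuming Lua 5.3+ for the \u{} escape): a pure-ASCII chunk loads fine, but the same chunk with a no-break space (U+00A0) used as whitespace is rejected by the lexer, because its UTF-8 bytes (0xC2 0xA0) fall outside ASCII:

    -- pure ASCII: every byte is valid in both ASCII and UTF-8
    print(load("return 1 + 1"))        --> function: 0x...
    -- U+00A0 (no-break space) between tokens: rejected by the lexer,
    -- so load returns nil plus an error message
    print(load("return 1\u{00A0}+ 1"))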
Note: using the public Unicode Character Database (UCD), it is easy to handle all of Unicode's whitespace characters.
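For example, here is a minimal sketch of that idea in Lua 5.3+ (where the utf8 library is standard); the codepoint ranges are the White_Space property copied from the UCD's PropList.txt:

    -- codepoint ranges with the White_Space property (UCD PropList.txt)
    local WHITE_SPACE = {
      {0x0009, 0x000D}, {0x0020, 0x0020}, {0x0085, 0x0085},
      {0x00A0, 0x00A0}, {0x1680, 0x1680}, {0x2000, 0x200A},
      {0x2028, 0x2029}, {0x202F, 0x202F}, {0x205F, 0x205F},
      {0x3000, 0x3000},
    }

    -- true if the given Unicode codepoint is whitespace
    local function is_unicode_space(cp)
      for _, r in ipairs(WHITE_SPACE) do
        if cp >= r[1] and cp <= r[2] then return true end
      end
      return false
    end

    -- walk a UTF-8 string codepoint by codepoint
    for _, cp in utf8.codes("a\u{00A0}b\u{3000}c") do
      print(string.format("U+%04X  space: %s", cp, tostring(is_unicode_space(cp))))
    end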