- Subject: Re: Plea for the support of unicode escape sequences
- From: "J.Jørgen von Bargen" <jjvb.primus@...>
- Date: Fri, 01 Jul 2011 07:04:35 +0200
On 01.07.2011 04:43, Tom N Harris wrote:
"#define char int" but that would be cheating. Or not? I see a few
"sizeof(char)" in the Lua source. But it does assume that "char" has
not been redefined in other places. It does test against UCHAR_MAX but
with the assumption that it is less than 1000. (Only three decimal
digits or two hexadecimal.)
/Please/ don't. Otherwise you'll end up where Perl is: reading a 1 MB
file into a string consumes 4 MB of memory :-/ . That is one of the
major reasons I switched to Lua. I'm quite happy with Lua the way it
is. A small utf8 module and you can do whatever you like. Lua's "one
char is one byte, and Lua doesn't care whether the bytes are ascii or
utf8 or utf16be or utf16le" is exactly what I want. In Perl, with every
new release you have to struggle all over again with how to get Unicode
data handled.
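For example (just a minimal sketch of the kind of thing such a module
does, not the module I actually use; the names utf8_char and utf8_len
are made up here): one Lua 5.1 pattern that matches a single UTF-8
encoded code point is enough to count and iterate characters in a plain
byte string:

    -- one UTF-8 encoded code point: a lead byte plus its continuation bytes
    local utf8_char = "[%z\1-\127\194-\244][\128-\191]*"

    -- count the code points in a byte string
    local function utf8_len(s)
      local n = 0
      for _ in s:gmatch(utf8_char) do n = n + 1 end
      return n
    end

    local s = "J\195\184rgen"        -- "Jørgen", spelled out as bytes
    print(#s)                        --> 7, Lua counts bytes
    print(utf8_len(s))               --> 6, the sketch counts code points
    for c in s:gmatch(utf8_char) do  -- walk the string one character at a time
      io.write(c, " ")
    end
    io.write("\n")

The string stays a plain byte sequence the whole time; only the pattern
knows anything about UTF-8.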