> 5.  Applications whose job is text processing typically find it
> easier to work with internal arrays of characters rather than
> UTF-8 (but they should still read and write UTF-8 externally!).
> Exactly which data type you use to hold your character values
> is up to your application.  16-bit integers (if you don't care
> about the new Unicode points), 32-bit integers, and even
> double-precision floating point (if you use Lua) are all
> perfectly fine, with 16-bit being perhaps somewhat less than
> ideal (now that Unicode has bloated some) but still more
> efficient.
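
To make the quoted point concrete, here is a minimal sketch of the
"decode UTF-8 at the boundary, work on an array of code points
internally" idea.  The helper is purely illustrative; it assumes
valid input and omits all checking for malformed or truncated
sequences:

  #include <stdint.h>
  #include <stddef.h>

  /* Decode a UTF-8 string into an array of 32-bit code points.
     Returns the number of code points written to 'out'. */
  size_t utf8_to_codepoints (const unsigned char *s, size_t len,
                             uint32_t *out, size_t outmax) {
    size_t i = 0, n = 0;
    while (i < len && n < outmax) {
      uint32_t c = s[i++];
      int extra;
      if (c < 0x80)      extra = 0;                /* ASCII */
      else if (c < 0xE0) { extra = 1; c &= 0x1F; } /* 2-byte sequence */
      else if (c < 0xF0) { extra = 2; c &= 0x0F; } /* 3-byte sequence */
      else               { extra = 3; c &= 0x07; } /* 4-byte sequence */
      while (extra-- > 0 && i < len)
        c = (c << 6) | (s[i++] & 0x3F);
      out[n++] = c;
    }
    return n;
  }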

I've ported applications that use Lua to platforms with no 64-bit
floating point, so I don't like depending on lua_Number being double.
Fortunately, the Lua use wasn't very pervasive in the program at
the time and the problems were easily worked around.  I'll probably
change lua_Number to float even on x86 ports in the future, to catch
"storing large integers in floats" problems quickly.  (I think it
also makes stack alignment better.)
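
In Lua 5.1, for example, that change amounts to editing the number
macros in luaconf.h (older versions configure lua_Number elsewhere).
Roughly, and treating this as a sketch rather than a drop-in patch,
since the exact macro set differs between versions:

  /* luaconf.h, after switching the number type to float;
     the LUA_NUMBER_DOUBLE define should be removed as well. */
  #define LUA_NUMBER        float
  #define LUAI_UACNUMBER    double   /* floats promote to double in varargs */
  #define LUA_NUMBER_SCAN   "%f"
  #define LUA_NUMBER_FMT    "%.7g"

With only a 24-bit significand, a float holds integers exactly only up
to 2^24, so the "large integer stored in a float" bugs really do
surface almost immediately.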

I certainly didn't mean to suggest that using doubles was a first choice!
But if using arrays of lua_Number would make it easier to interface with
Lua and if lua_Number happened to be double on your system,
then I think it would be fine (albeit surprising) to represent Unicode
code points as doubles.

Certainly that wouldn't *depend* on lua_Number being double, since
Unicode values easily fit in 32-bit numbers for a while yet.
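
For what it's worth, here is a rough sketch of what that interfacing
could look like through the C API.  The function name and calling
convention are made up for the example, and error handling is omitted:

  #include "lua.h"

  /* Hand an array of code points, already stored as lua_Number,
     to Lua as a 1-based table. */
  static void push_codepoints (lua_State *L, const lua_Number *cp, int n) {
    int i;
    lua_newtable(L);
    for (i = 0; i < n; i++) {
      lua_pushnumber(L, cp[i]);   /* exact even if lua_Number is float:
                                     code points stay below 0x110000,
                                     well under 2^24 */
      lua_rawseti(L, -2, i + 1);  /* t[i+1] = cp[i] */
    }
  }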

Russ