

David Kastrup wrote:
Of course, LuaTeX is an extreme case, but I would not go as far as to
call it pathological.
LuaTeX (and its developers) is quite happy with the current Lua implementation. The cases where huge strings are read from a file as a whole are rare, and even then, on my 5-year-old laptop, reading a 30 MB file in 2 seconds is acceptable. For more reasonably sized files the cost becomes negligible, especially given the things that still need to be done with such content.
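(For the record, the whole-file case is essentially the following sketch; the filename is just a placeholder and os.clock() only gives a rough CPU-time impression, not the wall-clock figure mentioned above.)

  -- read a whole file into one Lua string
  local name = "somefile.dat"            -- placeholder path
  local t0 = os.clock()
  local f = assert(io.open(name, "rb"))
  local data = f:read("*a")              -- the entire file as a single string
  f:close()
  print(string.format("%d bytes in %.2f seconds", #data, os.clock() - t0))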
We use some fast iterators for (multibyte) strings (handy for UTF-16/32 as well as UTF collapsing); we have done quite a few timings in that area.
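(Something along these lines, as a sketch for the UTF-8 case; the pattern-based iterator below is only an illustration, not the actual LuaTeX/ConTeXt code.)

  -- iterate over UTF-8 character sequences using a Lua string pattern:
  -- one lead byte (ASCII or 194-244) followed by any continuation bytes
  local utfpattern = "[\1-\127\194-\244][\128-\191]*"

  local function utfcharacters(str)
      return string.gmatch(str, utfpattern)
  end

  for c in utfcharacters("héllo wörld") do
      io.write(c, " ")
  end
  io.write("\n")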

In LuaTeX the majority of data processing involves tables (huge ones for fonts and such), and these can be cached (as bytecode). When dealing with strings there are quite a few string comparisons (when manipulating font data), and hashed strings work fine there; caching hash keys is one of the potential optimizations in that area. (Actually, TeX itself also hashes a lot of its input, especially control sequences in macro code.)

We have done lots of timing, and Lua is not the bottleneck. Only when we (de)allocate many small tables can the garbage collector interfere (this differs per platform, by the way), but that's a different story.

The TeX-Lua interface will get optimizations, but we stick within the existing Lua concepts; if one reads up on the history of Lua, we're pretty sure that the authors know what they're doing -)
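(As a rough sketch of the bytecode caching idea mentioned above: serialize a table to Lua source, compile it once, and dump the bytecode so a later run can load it without reparsing. The naive serializer here is only an assumption for the example, not the real LuaTeX/ConTeXt one.)

  local loadstring = loadstring or load   -- loadstring in Lua 5.1, load in 5.2+

  -- naive serializer: all keys are treated as strings, values may be
  -- strings, numbers, booleans or nested tables
  local function serialize(t)
      local parts = {}
      for k, v in pairs(t) do
          local key = "[" .. string.format("%q", tostring(k)) .. "]"
          if type(v) == "table" then
              parts[#parts + 1] = key .. "=" .. serialize(v)
          elseif type(v) == "string" then
              parts[#parts + 1] = key .. "=" .. string.format("%q", v)
          else
              parts[#parts + 1] = key .. "=" .. tostring(v)
          end
      end
      return "{" .. table.concat(parts, ",") .. "}"
  end

  local function save_table(t, filename)
      local chunk = assert(loadstring("return " .. serialize(t)))
      local f = assert(io.open(filename, "wb"))
      f:write(string.dump(chunk))          -- write precompiled bytecode, not source
      f:close()
  end

  local function load_table(filename)
      local f = assert(io.open(filename, "rb"))
      local data = f:read("*a")
      f:close()
      return assert(loadstring(data))()    -- loading bytecode skips the parser
  end

  -- usage (filenames and field names are placeholders):
  -- save_table({ fontname = "demo", designsize = 10 }, "demo.luc")
  -- local fontdata = load_table("demo.luc")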
Hans

-----------------------------------------------------------------
                                         Hans Hagen | PRAGMA ADE
             Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
    tel: 038 477 53 69 | fax: 038 477 53 74 | www.pragma-ade.com
                                            | www.pragma-pod.nl
-----------------------------------------------------------------