- Subject: Re: Fast loading of very large "symbol" tables
- From: Norman Ramsey <nr@...>
- Date: Sun, 12 Oct 2008 19:06:31 -0400
> I have need to provide a mapping of "names" to integer values on a
> reasonably large scale (hundreds of thousands). The name to number
> mapping is known at the compile time of my host program (C++). My basic
> naive approach is to generate a lua source file with the mapping in a
> table and load that script file into the host environment at run-time.
> This approach has a substantial time cost to it and seems like it would
> be more or less unnecessary since the data is static, at least for that
> particular version of the program.
A long time back I think lhf did something similar along the lines of
'self-executing Lua'. The idea is to compile to bytecodes then from
the bytecodes generate a C program that puts the bytecodes into
initialized read-only data. These days the bytecodes can actually be
larger than the original source, but you avoid the overhead of
compilation. However, you still have the run-time overhead of executing
the chunk and building the table.
The other alternative I see is to roll your own data structure as
static C code, define it as userdata, and define the __index method.
Obvious structures to try would be a so-called 'perfect hashing
function' (http://en.wikipedia.org/wiki/Perfect_hash_function) or the
Bentley/Sedgewick ternary search trees.
If your C programming skills are a bit rusty, the perfect hash
function is probably easier to set up as static data.
Using this method, the only dynamic overhead is creating and
initializing the metatable and associating it with the pointer to your
static data.