- Subject: Re: [ANNOUNCE] lua_dolines-1.1.patch
- From: Juergen Fuhrmann <fuhrmann@...>
- Date: Wed, 31 Jul 2002 21:11:25 +0200 (MEST)
> On Wed, 31 Jul 2002 19:31:45 +0200, Edgar Toernig <froese@gmx.de> wrote:
>
> Luiz Henrique de Figueiredo wrote:
> >
> > In Lua 5.0, chunks are loaded via a single API function, lua_load, which
> > receives a user function that feeds blocks of bytes to Lua's core for
> > parsing (or undumping).
>
> It would be nice if this function could be a Lua function. Or even
> better, a hook between lexer and parser...
Possibly, this can simply be realized with lua_rawcall in the handler.
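Just to make the idea concrete, here is a rough, untested sketch of such a
handler. It assumes a reader interface of the shape described in the
announcement (block pointer plus size, NULL at the end), uses lua_call as the
5.0 name for what 4.0 calls lua_rawcall, and whether the core actually allows
calling back into Lua from inside the reader is exactly what would need
checking; all helper names are made up.

#include <lua.h>

/* Untested sketch: a lua_load reader that pulls its blocks from a Lua
 * "producer" function.  Stack layout assumed on entry to
 * load_from_producer:
 *   1: the producer function, returning the next string block or nil  */
static const char *producer_reader (lua_State *L, void *ud, size_t *size)
{
  (void)ud;                      /* all state lives on the Lua stack */
  lua_pushvalue(L, 1);           /* re-push the producer function */
  lua_call(L, 0, 1);             /* ask it for the next block */
  if (lua_isnil(L, -1)) {        /* nil signals end of input */
    lua_pop(L, 1);
    *size = 0;
    return NULL;
  }
  lua_replace(L, 2);             /* anchor the block in slot 2 so the
                                    string stays valid until next call */
  *size = lua_strlen(L, 2);
  return lua_tostring(L, 2);
}

static int load_from_producer (lua_State *L)
{
  /* assumes argument 1 is the producer function */
  lua_settop(L, 2);              /* reserve slot 2 for the current block */
  if (lua_load(L, producer_reader, NULL, "=(producer)") != 0)
    lua_error(L);                /* error message is on the stack top */
  return 1;                      /* compiled chunk is left on the stack */
}

One would register load_from_producer as a C function and hand it any Lua
function that returns successive string blocks (nil when done); the reserved
stack slot keeps each block alive until the lexer has consumed it.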
For my part, I suggest keeping the option of memory-efficient parsing of
documents that contain a mix of Lua code and huge datasets. Lua is indeed
_speed_ efficient at this.
I tried reading 1000k doubles as a Lua table and then copying them into a
C array. Concerning speed, this process is amazingly fast.
Concerning memory, it is quite ugly, as you keep the data three times
in memory: bytecode, Lua table, and C array.
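For reference, the copy step looks roughly like this (an illustrative sketch
only; the helper name and the idea of passing in the global's name are made
up, and I assume the usual table-access calls lua_getglobal, lua_rawgeti and
lua_tonumber):

#include <stdlib.h>
#include <lua.h>

/* Rough sketch of the copy step: a Lua table of n doubles, sitting in the
 * named global, is walked with lua_rawgeti and written into a freshly
 * malloc'd C array. */
static double *table_to_c_array (lua_State *L, const char *global, int n)
{
  double *a = (double *)malloc(n * sizeof(double));
  int i;
  if (a == NULL) return NULL;
  lua_getglobal(L, global);        /* push the table */
  for (i = 0; i < n; i++) {
    lua_rawgeti(L, -1, i + 1);     /* push t[i+1], no metamethods */
    a[i] = lua_tonumber(L, -1);
    lua_pop(L, 1);                 /* keep the stack balanced */
  }
  lua_pop(L, 1);                   /* pop the table */
  return a;
}

With n set to the 1000k elements this is a single linear pass over the table,
which is why the speed is fine; the ugly part is only that the doubles now
live a third time in memory.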
lua_load could possibly allow an efficient parser that handles meta tags
(e.g. XML) and chunks of mixed document content between them.
I am quite happy to see that the Fathers of Lua are obviously aware
of these issues and are looking for solutions. Thanks!
Juergen