- Subject: Re: [ANN] Lua 5.2.0 (work1) now available
- From: steve donovan <steve.j.donovan@...>
- Date: Sun, 10 Jan 2010 19:09:04 +0200
On Sun, Jan 10, 2010 at 4:05 AM, Luiz Henrique de Figueiredo wrote:
> Exactly. The token-filter patch was a proof-of-concept thing.
So the question now is, how can token filtering be moved beyond
proof-of-concept? The basic requirements would be that (1) there
should be minimal impact on compile speed if filters are not used
("don't pay for what you don't use") and (2) independent filters can
be installed via a public API. Finally, (3) the operation of filters
can be activated by context, so that, for instance, an application can
choose to apply a filter only to dynamically compiled code.
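Requirements (2) and (3) could be sketched as a filter registry where each
filter carries an optional context predicate; the lexer consults the registry
only when it is non-empty, so unfiltered compilation pays nothing. None of
these names (register_filter, active_filters) are a real Lua API; this is
just one possible shape.

```lua
-- Hypothetical public API sketch: filters registered with an optional
-- context predicate. All names here are illustrative, not part of Lua.

local registry = {}

local function register_filter(filter, context)
  registry[#registry + 1] = { filter = filter, context = context }
end

-- Called by the lexer only when the registry is non-empty
-- ("don't pay for what you don't use").
local function active_filters(ctx)
  local active = {}
  for _, entry in ipairs(registry) do
    if entry.context == nil or entry.context(ctx) then
      active[#active + 1] = entry.filter
    end
  end
  return active
end

-- A filter that should run only for dynamically compiled code.
register_filter(function(tokens) return tokens end,
                function(ctx) return ctx.dynamic end)

assert(#active_filters{ dynamic = true } == 1)
assert(#active_filters{ dynamic = false } == 0)
```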
One model is filter chaining, where each filter is invoked in turn
upon the token stream, each potentially modifying it. (Of course, one
should not assume a particular filter ordering.)
A useful design principle for such filters is then that they should
always return syntactically valid Lua token streams, so that they can
be combined safely.
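The chaining model might look like the following sketch, where a filter is
simply a function from a token list to a token list, and the "valid stream
in, valid stream out" rule is what makes composition safe. The token shape
and the example '!=' filter are assumptions, not anything Lua provides.

```lua
-- Sketch of filter chaining. Tokens are assumed to be tables like
-- { type = ..., value = ... }; 'chain' is a hypothetical helper.

local function chain(filters)
  return function(tokens)
    for _, f in ipairs(filters) do
      tokens = f(tokens)  -- each filter returns a valid token stream
    end
    return tokens
  end
end

-- Example filter: rewrite a (hypothetical) '!=' token into Lua's '~='.
local function neq_filter(tokens)
  local out = {}
  for _, t in ipairs(tokens) do
    if t.type == "op" and t.value == "!=" then
      out[#out + 1] = { type = "op", value = "~=" }
    else
      out[#out + 1] = t
    end
  end
  return out
end

local pipeline = chain{ neq_filter }
local result = pipeline{
  { type = "name", value = "a" },
  { type = "op",   value = "!=" },
  { type = "name", value = "b" },
}
assert(result[2].value == "~=")
```

Because every filter in the chain both accepts and produces valid Lua token
streams, filters written independently can be stacked without knowing about
one another.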
Another higher-level model is that filters provide functions that act
on patterns, such as token values or sets of token values.
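In the higher-level model, a filter would declare what it matches rather
than scan the raw stream itself; a generic driver applies the substitution.
The 'subst' driver below is an illustrative sketch of that idea, not an
existing API.

```lua
-- Sketch of a pattern-based filter: a substitution table keyed by token
-- value, applied by a generic driver. 'subst' is hypothetical.

local function subst(map)
  return function(tokens)
    local out = {}
    for _, t in ipairs(tokens) do
      out[#out + 1] = map[t.value] or t
    end
    return out
  end
end

-- Example: rewrite a (hypothetical) '=>' token into the keyword 'return'.
local arrow = subst{ ["=>"] = { type = "keyword", value = "return" } }
local result = arrow{
  { type = "op",     value = "=>" },
  { type = "number", value = "42" },
}
assert(result[1].value == "return")
```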
But the main question is: would this be a useful exercise? I intend
to continue developing LuaMacro (for instance), but it would be good
to have a firmer framework to build upon.