- Subject: token filters for Lua 5.1
- From: Luiz Henrique de Figueiredo <lhf@...>
- Date: Tue, 12 Sep 2006 13:55:10 -0300
I've cleaned up the code for the token filter hack. Get it at
http://www.tecgraf.puc-rio.br/~lhf/ftp/lua/#tokenf
I've only included two simple examples, but not Daniel's macro stuff.
I'll work more on that, but I wanted to release it "officially" so that
we can have a common basis for work and discussion. See the attached README.
The nice thing about this hack is that it depends only on a two-line
change in llex.c:
diff /tmp/lhf/lua-5.1/src/llex.c llex.c
138c138
< void luaX_setinput (lua_State *L, LexState *ls, ZIO *z, TString *source) {
---
> static void setinput (lua_State *L, LexState *ls, ZIO *z, TString *source) {
444a445
> #include "proxy.c"
Of course, the good stuff is in proxy.c.
To add token filtering to your application, simply compile the hacked llex.c
and replace llex.o in your Lua library with the new one.
Enjoy! All feedback is welcome.
--lhf
This is a token filter facility for Lua 5.1.
The motivation for this piece of code is that many requests for new
syntactic constructs can be easily coded by simple token manipulation.
For more detail, see the presentations at the Lua Workshops in 2005 and
2006 by Luiz Henrique de Figueiredo and Daniel Silverstone.
The token filter works by giving you the opportunity to inspect and
alter the stream of tokens coming from the lexer before they go to the
parser. You get to see only tokens and you are only allowed to generate
tokens -- you're not allowed to see the text coming to the lexer nor to
generate text to go into the lexer. Luckily, the lexer does not raise
errors on single-char tokens that it does not understand and so you
can use those for your own purposes, but you're still only allowed to
generate good Lua tokens.
To filter tokens, define a global function called FILTER. This function
will be called once before the parsing begins and then each time the
parser needs a token. The first time FILTER is called, it receives a
function to be used to get raw tokens. You should arrange to use this
function during the actual filtering. (So, in practice you redefine
FILTER the first time it is called.) One caveat is that you should not
try to get more tokens after the parse has finished (either successfully
or with an error) or you'll probably get a crash.
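A minimal sketch of this redefinition idiom (the exact token
representation is whatever proxy.c uses; here I simply assume that get()
returns the next token and that FILTER hands tokens back in the same
form):

  -- minimal pass-through filter: every raw token reaches the parser
  -- unchanged; assumes get() returns the next token in whatever form
  -- proxy.c delivers it, and that FILTER returns it in the same form
  function FILTER(get)          -- first call, before parsing begins
    FILTER = function ()        -- redefine FILTER for all later calls
      return get()              -- fetch one raw token per parser request
    end
  end

A real filter would examine what get() returns at this point and return
rewritten tokens instead of passing them through.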
That's it. You're on your own on how to filter tokens. I've found that
coroutines are a good tool for implementing sequences of token filters.
See examples in f*.lua, which affect the compilation of test.lua.
(test.lua is not a valid Lua program unless processed via fnext.lua.)
Try "make FILTER=fnext".
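To illustrate the coroutine approach mentioned above, here is a hedged
sketch. It is not one of the shipped f*.lua examples, and it assumes
that get() returns each token as a string (possibly with a semantic
value), with an unrecognized single character such as @ arriving as the
one-character string "@". It deletes every @ from the token stream, so
that @ becomes a harmless annotation marker:

  -- hypothetical sketch: strip every '@' token, assuming tokens are
  -- returned by get() (and expected back from FILTER) as strings,
  -- possibly accompanied by a semantic value
  function FILTER(get)
    FILTER = coroutine.wrap(function ()
      while true do
        local token, value = get()
        if token ~= "@" then              -- drop the marker itself,
          coroutine.yield(token, value)   -- forward everything else
        end
      end
    end)
  end

The coroutine stays suspended between parser requests, which makes it
easy to chain several such producers, one wrapping the next, to build a
sequence of filters.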