- Subject: Re: LPEG and token lists
- From: Michal Kottman <k0mpjut0r@...>
- Date: Tue, 16 Mar 2010 10:21:27 +0100
On Mon, 2010-03-15 at 21:55 -0700, Wesley Smith wrote:
> Let's say I have a description of the tokens a lexer generates and a
> grammar written in terms of the tokens. What I'd like to do is write
> some simple LPEG patterns to generate the tokens and then write the
> grammar in terms of those tokens without having to convert the token
> list into a giant string.
> ...
> Is there some way to seduce LPEG into doing something like this as is
> or am I out of luck?
Hi, I'm not sure if this is what you are looking for, but I have just
found this: http://github.com/mascarenhas/lpeg-list "Support for
matching lists with LPEG".
Taken from the README file:
local re = require "re"   -- the list-aware 're' module bundled with lpeg-list,
                          -- not the one that ships with stock LPEG

p = re.compile([[
  exp <- { "add", <exp>, <exp> } -> add
       / { "sub", <exp>, <exp> } -> sub
       / { "mul", <exp>, <exp> } -> mul
       / { "div", <exp>, <exp> } -> div
       / <.>
]], { add = function (x, y) return x + y end,
      sub = function (x, y) return x - y end,
      mul = function (x, y) return x * y end,
      div = function (x, y) return x / y end, })

-- add(div(8, 2), 3) == 4 + 3 == 7
assert(p:match{{ "add", { "div", 8, 2 }, 3 }} == 7)
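For the first stage of your pipeline (string to token list), stock LPEG table
captures should already be enough. Here is a minimal sketch (the add/div tag
names just mirror the README example, and nothing in it needs lpeg-list itself):

local lpeg = require "lpeg"
local R, S, C, Ct = lpeg.R, lpeg.S, lpeg.C, lpeg.Ct

local ws = S(" \t\r\n")^0              -- skip whitespace between tokens

-- Numbers become Lua numbers, operators become tag strings.
local number = (R("09")^1 / tonumber) * ws
local op     = (C(S("+-*/")) / { ["+"] = "add", ["-"] = "sub",
                                 ["*"] = "mul", ["/"] = "div" }) * ws

-- Ct collects every capture into one table: the token list.
local lexer = Ct(ws * (number + op)^0)

local tokens = lexer:match("8 / 2 + 3")
-- tokens is { 8, "div", 2, "add", 3 }

A list grammar like p above can then match such a table directly, with no
giant-string round trip (note the README example expects the nested prefix
form, so a flat infix stream like this one would need its own rules).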
And from the paper:
"Extending PEGs to match structured data make PEGs useful for a larger part of the
compilation pipeline. A PEG-based scannerless parser can construct an abstract syntax
tree..."
This looks a lot like what you want to achieve; I hope it helps.
Michal