lua-users home
lua-l archive


Hi, here's a biased comparison.

Token-filter-based solutions (luasub and raw token filters) are probably simpler to learn initially, if you have no significant experience with tree-manipulating macro systems such as Lisp dialects.

Metalua lets you view your code either as concrete syntax or as abstract syntax trees, and lets you switch freely between the two views. If you have already used multi-stage metaprogramming languages such as Template Haskell, MetaML or Converge, you'll feel immediately at home.
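To make the "abstract syntax" view concrete: Metalua encodes ASTs as ordinary Lua tables with a `tag` field, so they can be built and inspected from plain Lua. The sketch below is illustrative (the exact node shapes come from my reading of the Metalua docs, so treat the details as an assumption); the statement `print("hello")` corresponds roughly to:

```lua
-- A sketch of Metalua's table-based AST encoding, in plain Lua.
-- Assumed node shapes: `Call{ `Id "print", `String "hello" }
local ast = {
  tag = "Call",
  { tag = "Id",     "print" },   -- the function being called
  { tag = "String", "hello" },   -- its single argument
}

-- Metalua's quasi-quotes (+{ ... }) build such tables for you;
-- here we just inspect a hand-written one.
print(ast.tag, ast[1][1], ast[2][1])  --> Call  print  hello
```

Because these are plain tables, macros are just Lua functions from tables to tables.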

Another key difference is that Metalua handles syntax extensions through a parser combinator library. I would argue that this is more flexible and modular than PEGs: it lets you easily define functors, can be modified dynamically, and makes it (relatively) easy to produce proper error messages when incorrect input is entered. However, it might have a slightly steeper learning curve for simple tasks; at the very least, it looks more intimidating.
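For readers unfamiliar with the style, here is a minimal parser-combinator sketch in plain Lua. This is NOT Metalua's actual gg API, just an illustration of why such grammars are modular and dynamically modifiable: parsers are ordinary first-class functions, and combinators are functions that build new parsers out of old ones.

```lua
-- A parser takes (input, pos) and returns (result, new_pos), or nil on failure.

local function lit(s)                     -- match a literal string
  return function(input, pos)
    if input:sub(pos, pos + #s - 1) == s then return s, pos + #s end
  end
end

local function seq(...)                   -- run sub-parsers in sequence
  local ps = { ... }
  return function(input, pos)
    local results = {}
    for _, p in ipairs(ps) do
      local r
      r, pos = p(input, pos)
      if r == nil then return nil end     -- one failure fails the sequence
      results[#results + 1] = r
    end
    return results, pos
  end
end

local function alt(...)                   -- try alternatives in order
  local ps = { ... }
  return function(input, pos)
    for _, p in ipairs(ps) do
      local r, np = p(input, pos)
      if r ~= nil then return r, np end
    end
  end
end

-- Grammars compose like ordinary values, so they can be extended at runtime.
local bool = alt(lit("true"), lit("false"))
local pair = seq(bool, lit(","), bool)
local r, np = pair("true,false", 1)
print(r[1], r[3], np)  --> true  false  11
```

Error reporting falls out naturally: since each combinator knows its position and what it expected, a failing parser can report "expected X at position N" instead of PEG-style silent backtracking.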

This issue is acknowledged and I'm working on it: a DSL for defining and extending grammars is under development. An example, a JSON parser, is accessible here (look at the stuff between << and >>).

Beyond this, the main limitations are:
- macro hygiene is handled in a less-than-ideal way. I would love to adapt Clojure's brilliant approach to the issue, but it's difficult to do without breaking Lua compatibility (this is not relevant to the other systems, which work at a much lower level);
- installation of version 0.4 is awfully messy, largely because the compiler is partly written in itself and therefore requires a bootstrapping step. A reorganized, although unreleased, version is accessible here (click on the download button).
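To make the hygiene point above concrete: an unhygienic macro can introduce a temporary variable that accidentally captures a user variable of the same name. The classic workaround, which Clojure automates with auto-gensyms (`x#` inside syntax-quote), is to generate a fresh name for every macro-introduced identifier. A minimal gensym sketch in plain Lua (illustrative only, not Metalua's mechanism):

```lua
-- Generate identifiers that cannot collide with user-written names,
-- by appending a counter that increases on every call.
local counter = 0
local function gensym(prefix)
  counter = counter + 1
  return string.format("%s_%d", prefix or "g", counter)
end

print(gensym("tmp"), gensym("tmp"))  --> tmp_1  tmp_2
```

A macro that emits `local tmp_1 = ...` instead of `local tmp = ...` can no longer shadow a user's `tmp`; Clojure's insight is to make this the default rather than a discipline the macro author must remember.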

-- Fabien.