lua-l archive


On 12/4/06, Mark Hamburg <> wrote:
    Mutexes.withMutexDo( mutex, function()
    Mutexes.withMutex( mutex ) do
    Mutexes.withMutex mutex do

If you've got compile-time macros, whatever syntax sugar + content manipulation you need can be done through them. You should avoid using macros when unnecessary, but for things that alter the way you think about your whole program, such as concurrency, I think it's worth creating a couple of extensions, akin to Java's synchronized blocks. So introducing a macro reading like this seems right to me:

-- macro creating the "locking ... do ... end" syntax
-- (the builder must be defined before it's referenced):
-{ block:
   local function my_mutex_builder(x) ...[YOUR BACK-END CODE HERE]... end
   mlp.lexer.register "locking"
   mlp.stat.add{ "locking", mlp.expr, "do", mlp.block, "end",
                 builder = my_mutex_builder } }

my_mutex = ...
locking my_mutex do
   -- critical section here
end

a "do" that appears on the same line as the end of a function call

I'm not sure that introducing an on-the-same-line condition into the parsing does the user a favor :) (although I know there is already such an issue in Lua syntax with parentheses)
    ORB.entity "MyEntity" do
        field "myField"
        hasOne{ fieldName = "childName", childType = "ChildType" }

I'm not sure I get you here, but if I do, you're looking for something similar to Pascal's "with" statement, aren't you? That should be easy to hack with setfenv(), plus optionally a bit of sugar around it.
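A minimal sketch of such a setfenv()-based "with" (the `with` helper and the `point` table are hypothetical names, not part of any library; the setfenv branch assumes Lua 5.1, while the debug.setupvalue branch covers Lua 5.2+ where closures carry an _ENV upvalue instead):

```lua
-- Pascal-style "with": redirect global reads/writes inside `body` to `tbl`,
-- falling back to the real global table for everything else.
local function with(tbl, body)
  local env = setmetatable({}, {
    __index = function(_, k)
      local v = tbl[k]
      if v ~= nil then return v end
      return _G[k]
    end,
    __newindex = function(_, k, v) tbl[k] = v end,
  })
  if setfenv then                      -- Lua 5.1: swap the function's environment
    setfenv(body, env)
  else                                 -- Lua 5.2+: replace the _ENV upvalue
    debug.setupvalue(body, 1, env)
  end
  body()
end

local point = { x = 1, y = 2 }
with(point, function()
  x = x + 10    -- reads and writes go to `point`
  y = y * 2
end)
print(point.x, point.y)
```

The sugar on top would then just be a macro turning `with point do ... end` into `with(point, function() ... end)`.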
1. and  2.

I don't really understand what you mean

3. This is designed to look like a syntactic construct but note that it
changes the meaning of "return"

That's why you need AST manipulation. ASTs are first-class in Metalua: it's relatively easy to manipulate them with structural pattern matching (an extension provided in the samples), and I'm working on a real AST navigation library.
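To give an idea of the shape of the data, Metalua ASTs are plain Lua tables with a `tag` field and positional children, so even without the `match` extension you can dispatch on them by hand (the `describe` helper below is a made-up illustration, not Metalua API):

```lua
-- AST for the statement `print("hi")`, in Metalua's table encoding:
local node = { tag = "Call",
               { tag = "Id", "print" },
               { tag = "String", "hi" } }

-- hand-rolled dispatch; Metalua's `match` extension expresses the same
-- test as a backquoted structural pattern
local function describe(ast)
  if ast.tag == "Call" and ast[1].tag == "Id" then
    return "call to " .. ast[1][1]
  end
  return "other"
end

print(describe(node))
```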

garbage collection or some form of reference counting

AFAIK, reference counting has terrible performance... And even if closures tend to hurt performance, follow the no-premature-optimization path:
- write it the way you'd like to read it if it were someone else's code
- profile to find the bottlenecks
- optimize those bottlenecks, possibly by getting rid of nice-looking but inefficient idioms, possibly by rewriting those bits in C.

Finally, my guess is that this isn't something that can be done with token
filters unless one builds in support for the whole grammar.

That's more than syntax sugar indeed, but that's precisely what Metalua is designed for.