- Subject: Re: Inline Functions
- From: "Peter Hill" <corwin@...>
- Date: Sun, 5 Jan 2003 14:13:34 -0000
>> lexinit = dofile("lex.lua")
>> -- to load the lexical system
>> getlex = lexinit(z)
>> -- set a lexical analysis going on the pre-created stream 'z'.
>> t,x,l = getlex()
>> -- called multiple times to get the next lexical item.
Björn De Meyer:
> This is a very interesting functional programming approach.
> I think the way the lexer now works, always returning token,
> value and line number is great. Still, may I ask why you
> didn't go for a more classic OOP design? Not that that
> would have been much better or anything, but OOP is more
> popular these days than functional programming, so
> I'm just wondering.
I learnt programming pre-OO, so it isn't my first approach, and standard OOP
(still being in its youth) has some aspects that need to evolve before I'll
be fully happy with it. Additionally, Lua only loosely supports OOP, so I
thought I'd try using the new "lexical scope" instead.
And the current method isn't actually a functional approach since "getlex"
takes no arguments!
Indeed, these new lexical-scope functions (which can now carry a hidden
internal state) can act more OO than tables; i.e., functions become
objects in themselves. This is the case with the "getlex" function, which
(with its internal character lookahead and line count) is in essence an
object with just one (default) method! Weird!
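To illustrate the idea, here is a deliberately minimal sketch (the real lexer is more involved; the body below is invented for demonstration): the closure returned by "lexinit" keeps its position and line count as upvalues, so it behaves like an object with a single default method.

```lua
-- Hypothetical sketch: a getlex-style closure over a string.
-- 'pos' and 'line' are hidden internal state, like the real
-- lexer's character lookahead and line count.
local function lexinit(source)
  local pos, line = 1, 1
  return function()                     -- "getlex"
    while true do
      local c = string.sub(source, pos, pos)
      if c == "" then return "eof", nil, line end
      pos = pos + 1
      if c == "\n" then
        line = line + 1
      elseif c ~= " " then
        return "char", c, line          -- one-character "tokens" for brevity
      end
    end
  end
end

local getlex = lexinit("a\nb")
print(getlex())  -- char  a  1
print(getlex())  -- char  b  2
```

Each call advances the hidden state; no table, no `self`, yet the function carries its object-like identity with it.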
>> In a polymorphic multi-return-value language there is no real excuse for
>> not just reporting the error as a return value. As you see above I've now
>> done errors that way instead. Is that ok?
> It's good. That way, the parser can handle lex errors and
> parse errors more uniformly. Although, it may slightly complicate
> the design of the parser, as essentially you now have an
> extra token, namely the "error" token.
A good point. Passing back errors (rather than executing a trap) gives you
more control but takes more effort. I'm not sure how much this will affect
the parser but hopefully it won't be too much.
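A hypothetical sketch of the contrast (both "getlex" bodies below are invented stand-ins): with a trap, the caller must wrap the call in pcall; with a return value, the error is just another token the parser handles in its normal flow.

```lua
-- Trap style: the lexer raises, so the caller needs pcall.
local function getlex_raising()
  error("unterminated string at line 7")
end

-- Return-value style: token, value, line -- errors included.
local function getlex_returning()
  return "error", "unterminated string", 7
end

-- Trap style in use:
local ok, msg = pcall(getlex_raising)
if not ok then print("trapped: " .. msg) end

-- Return-value style in use:
local t, x, l = getlex_returning()
if t == "error" then
  -- the parser can report here, or try to resynchronise
  print("lex error at line " .. l .. ": " .. x)
end
```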
I'm thinking of trying to write the parser completely as an operator-based
(bottom up) system, in which case there should be no extra overhead.
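For what such an operator-based parser might look like, here is a small precedence-climbing sketch (entirely hypothetical: the token list, precedence table, and evaluation-on-the-fly are all simplifications for illustration). An "error" token would slot in as just another case at the start of `expr`.

```lua
-- Precedence table for the binary operators (higher binds tighter).
local prec = { ["+"] = 1, ["-"] = 1, ["*"] = 2, ["/"] = 2 }

-- Parse (and evaluate) a flat token list like {2, "+", 3, "*", 4}.
local function parse(tokens)
  local i = 0
  local function next_tok() i = i + 1; return tokens[i] end
  local tok = next_tok()

  local function expr(min_prec)
    local left = tok                  -- assume a number token
    tok = next_tok()
    while tok and prec[tok] and prec[tok] >= min_prec do
      local op = tok
      tok = next_tok()
      local right = expr(prec[op] + 1)   -- climb for tighter operators
      if op == "+" then left = left + right
      elseif op == "-" then left = left - right
      elseif op == "*" then left = left * right
      else left = left / right end
    end
    return left
  end

  return expr(1)
end

print(parse({2, "+", 3, "*", 4}))  -- 14
```

The loop-plus-recursion shape is what makes it bottom-up in spirit: operands are consumed as they arrive, and precedence alone decides how they combine.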
>> So should a longstring "[[CR LF blah ]]" fail to ignore the leading
>> newline because a CR character precedes it?
> Essentially, yes. But, if you can make the lexer recognise
> all three different line ending styles, then that may be easier
> on those who want to use Lua on a non-unix-like platform.
Hmm, that could be a bit of a bother, so I might skip it for the moment. A
simpler solution (from the lexical analyser's point of view :-) is to put
the conversion in the Stream function, so that the lexical analyser only ever
sees the "standard" newline representation. This might even already happen!