- Subject: Re: Inline Functions
- From: Björn De Meyer <bjorn.demeyer@...>
- Date: Sat, 04 Jan 2003 23:27:26 +0100
Peter Hill wrote:
> Björn De Meyer:
> > Hmmm... it looks pretty good. Some small comments:
> "dstring" & "sstring" are now merged.
> "longstring" & "skiplong" are a different matter. They perform different
> functions (one collects a string, one discards input) so to merge them would
> weaken the modularity by requiring "Control Coupling", which is not a good idea.
Well, I think I must agree with you here.
It would be easier to change the delimiters for the multi-line
comment if need be, because we have separate functions.
> Ok, I've added a line count and placed the getlex function in a wrapper that
> holds the internal state. This function is now returned by an initialise lex
> function, which is itself now returned by "lua.lex".
> Thus, to use the module, one would do:
> lexinit = dofile("lex.lua")
> -- to load the lexical system
> getlex = lexinit(z)
> -- set a lexical analysis going on the pre-created stream 'z'. The lexical
> -- analysis process is reentrant so one may have multiple lexical streams
> -- being analysed at once if one has a reason to want to.
> t,x,l = getlex()
> -- called multiple times to get the next lexical item.
> It normally returns: the token type (as a string, eg "le" for "<="), the
> extension data (if the token is a non-reserved word, a number or a string),
> and the current line number (defined after scanning the current token).
> If an error occurs it returns: nil, the error description (as a string), and
> the current line number (up to where parsing stopped).
This is a very interesting functional programming approach.
I think the way the lexer now works, always returning token,
value, and line number, is great. Still, may I ask why you
didn't go for a more classic OOP design? Not that it would
necessarily have been better, but OOP is more popular these
days than functional programming, so I'm just wondering.
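Just to make sure I understand the design, here is a minimal sketch of
the closure pattern as I read it. The single token rule and all names
inside are my own placeholders, not your actual code:

```lua
-- Sketch: lexinit captures the stream state in upvalues, and the
-- returned getlex function advances it on each call. The one token
-- rule (digit runs become numbers) is purely illustrative.
local function lexinit(z)
  local pos, line = 1, 1
  return function()
    -- skip whitespace, counting newlines for the line number
    while true do
      local c = string.sub(z, pos, pos)
      if c == "\n" then
        line = line + 1; pos = pos + 1
      elseif c == " " or c == "\t" then
        pos = pos + 1
      else
        break
      end
    end
    if pos > string.len(z) then return "eos", nil, line end
    -- illustrative rule: a run of digits is a number token
    local s, e, num = string.find(z, "^(%d+)", pos)
    if s then
      pos = e + 1
      return "number", tonumber(num), line
    end
    -- errors reported as return values, per your convention
    return nil, "unexpected character", line
  end
end

-- Each lexinit call yields an independent stream, so the whole
-- thing stays reentrant:
local getlex = lexinit("12 34\n56")
print(getlex())  -- number  12  1
```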
> In a polymorphic multi-return-value language there is no real excuse for not
> just reporting the error as a return value. As you see above I've now done
> errors that way instead. Is that ok?
It's good. That way, the parser can handle lex errors and
parse errors more uniformly. Although it may slightly complicate
the design of the parser, since you now essentially have an
extra token, namely the "error" token.
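For instance, a small parser helper could treat a lex failure like any
other failed token fetch. The name "nexttoken" and the error wording
are just illustrations of the convention, not part of your module:

```lua
-- Sketch: fold the nil-token convention into the parser's own
-- error path, so lex and parse errors flow through one mechanism.
local function nexttoken(getlex)
  local t, x, l = getlex()
  if t == nil then
    -- x holds the error description, l the line where lexing stopped
    error("lex error at line " .. l .. ": " .. x)
  end
  return t, x, l
end
```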
> So should a longstring "[[CR LF blah ]]" fail to ignore the leading newline
> because a CR character precedes it?
Essentially, yes. But, if you can make the lexer recognise
all three different line ending styles, then that may be easier
on those who want to use Lua on a non-unix-like platform.
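Something like this small helper (names are mine, purely illustrative)
would cover all three conventions by consuming one logical newline at
the current position:

```lua
-- Sketch: if a CR or LF sits at pos, consume one logical newline
-- (LF, CR LF, or lone CR) and return the position just past it;
-- otherwise return pos unchanged.
local function skipnewline(z, pos)
  local c = string.sub(z, pos, pos)
  if c == "\n" then
    return pos + 1            -- unix: LF
  elseif c == "\r" then
    if string.sub(z, pos + 1, pos + 1) == "\n" then
      return pos + 2          -- dos/windows: CR LF
    end
    return pos + 1            -- old mac: lone CR
  end
  return pos                  -- not at a newline
end
```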
> > I'm looking forward to your lua-in-lua parser.
> Once you're happy with the "lex" I'll give it a go.
Well, apart from some more bug fixes, and maybe the lf/crlf/cr
line-ending issue, the lexer looks quite good. I'll let you
know more once I've done some more thorough testing.
"No one knows true heroes, for they speak not of their greatness." --
Björn De Meyer