- Subject: Re: Inline Functions
- From: "Peter Hill" <corwin@...>
- Date: Sat, 4 Jan 2003 19:31:08 -0000
Björn De Meyer:
> Hmmm... it looks pretty good. Some small comments:
Thanks for the comments :-).
> 1) The functions longstring() and skiplong() could be folded
> into one function as their functionality is almost completely
> identical (it is done like this in C Lua). The same goes for
> scan.dstring() and scan.sstring(z,c). Again, their functionality
> is similar, so one flexible function for both would perhaps be
> more economical.
"dstring" & "sstring" are now merged. They should have been in the first
place but I failed to notice they already had convenient access to the
terminator characters (" or '). Hey, it was late *grin*.
"longstring" & "skiplong" are a different matter. They perform different
functions (one collects a string, one discards input) so to merge them would
weaken the modularity by requiring "Control Coupling" which is not a good
Also, "longstring" / "skiplong" are only coincidentally similar, basically
because someone decided to model long comments on long strings. Nevertheless,
the two syntaxes are not fundamentally tied together, and some future revision
might choose to diverge them further. Contrast this with the "dstring" /
"sstring" pair, which are essentially the same conceptual operation, differing
only in the termination character.
Because of this, and since "longstring" and "skiplong" are both so short
anyway, I've chosen to keep them separate.
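To make the distinction concrete, here is a hypothetical sketch of the pair (names match the discussion, but the bodies and the assumed z:get() reader interface are illustrative, not the module's actual code). The scan loops look alike, yet one builds a result and the other only consumes input:

```lua
-- Collects input up to the closing "]]" and returns it as a string.
local function longstring(z)
  local buf = {}
  local c = z:get()
  while c do
    if c == "]" then
      local c2 = z:get()
      if c2 == "]" then return table.concat(buf) end
      buf[#buf + 1] = c
      c = c2
    else
      buf[#buf + 1] = c
      c = z:get()
    end
  end
  return nil, "unfinished long string"
end

-- Discards input up to the closing "]]" (for long comments).
local function skiplong(z)
  local c = z:get()
  while c do
    if c == "]" then
      c = z:get()
      if c == "]" then return true end
    else
      c = z:get()
    end
  end
  return nil, "unfinished long comment"
end
```

Merging these would need a "collect or discard?" flag threaded through the loop, which is exactly the control coupling mentioned above.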
> 2) The lexer needs to keep the line count somehow. This is needed
> for signaling errors (also towards the parser). I think, therefore,
> it would be more convenient if the lexer were a stateful
> object that remembers its own line count and lookahead character.
> Then you'd have lexer::get(), lexer::lineno(), lexer::lookahead(),
Ok, I've added a line count and placed the getlex function in a wrapper that
holds the internal state. This function is now returned by a lexer
initialisation function, which is itself now returned by "lua.lex".
Thus, to use the module, one would do:
lexinit = dofile("lex.lua")
-- to load the lexical system
getlex = lexinit(z)
-- set a lexical analysis going on the pre-created stream 'z'. The lexical
-- analysis process is reentrant so one may have multiple lexical streams
-- being analysed at once if one has a reason to want to.
t,x,l = getlex()
-- called multiple times to get the next lexical item.
It normally returns: the token type (as a string, e.g. "le" for "<="), the
extension data (if the token is a non-reserved word, a number, or a string),
and the current line number (defined after scanning the current token).
If an error occurs it returns: nil, the error description (as a string), and
the current line number (up to where scanning stopped).
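A caller might drive that protocol with a loop like the sketch below. The helper name "drain" and the "eos" end-of-stream token name are my assumptions for illustration; the (token, extra, lineno) / (nil, errmsg, lineno) return convention is the one described above:

```lua
-- Collects all tokens from a getlex-style function into a table,
-- or returns nil plus a formatted message on a lexical error.
local function drain(getlex)
  local tokens = {}
  while true do
    local t, x, l = getlex()
    if t == nil then
      return nil, ("line %d: %s"):format(l, x)
    end
    if t == "eos" then return tokens end  -- assumed end-of-stream token
    tokens[#tokens + 1] = { type = t, extra = x, line = l }
  end
end
```

Since getlex is a closure over its own stream state, several such loops can run over different streams at once, as the reentrancy note above says.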
> 3) The error reporting is a bit rudimentary. For that, I think
> you could use a user-redefinable error function lexer:error()
> that defaults to the lua standard error() function. Or such.
In a polymorphic multi-return-value language there is no real excuse for not
just reporting the error as a return value. As you see above I've now done
errors that way instead. Is that ok?
> 4) In Lua, line ends are linefeeds, not carriage return-line
> feeds. It's considerably harder to support all three possible
> line endings, namely cr, cr lf, and lf. Because of its
> historic background, Lua uses only lf.
So should a longstring "[[CR LF blah ]]" fail to ignore the leading newline
because a CR character precedes it?
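For what the lf-only question means in practice, here is a sketch of the more tolerant alternative: a line counter that treats lf, cr, and cr lf each as a single line end (plain Lua, as noted above, historically recognises only lf). The function name is illustrative:

```lua
-- Counts line ends in a string, accepting "\n", "\r", and "\r\n"
-- each as exactly one newline.
local function count_lines(s)
  local n = 0
  local i = 1
  while i <= #s do
    local c = s:sub(i, i)
    if c == "\n" then
      n = n + 1
    elseif c == "\r" then
      n = n + 1
      if s:sub(i + 1, i + 1) == "\n" then i = i + 1 end  -- swallow the lf of cr lf
    end
    i = i + 1
  end
  return n
end
```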
> All in all, it looks well written. I'll test it a bit more thoroughly
> later on.
I tested, changed, and debugged it somewhat. I'll post you the new version.
If anyone else is interested just ask.
> I'm looking forward to your lua-in-lua parser.
Once you're happy with the "lex" I'll give it a go.