On Mon, Dec 17, 2018 at 2:43 PM Egor Skriptunoff
<egor.skriptunoff@gmail.com> wrote:
>
> On Mon, Dec 17, 2018 at 8:36 AM Coda Highland <chighland@gmail.com> wrote:
>>
>> I disagree with introducing special meaning to consecutive such
>> symbols. If they're supposed to be "ignore me" symbols then they
>> should be, y'know, ignored. Which means the second one should be
>> ignored the same as the first one.
>
>
> I don't see the point of your objection.
> Following your logic, if the "minus" symbol means "unary negation", then two consecutive minuses must mean "unary negation and then one more unary negation"?
> Right?  :-)
> Is there a big difference between "minus" and "backtick"?  :-)
> Do you mean to say that some composite lexemes ( --  <<  >>  ::  .. ) are confusing because their meanings don't match the meanings of the symbols they are composed of?
>

No, I'm not making that argument. In fact, I find the comparison
somewhat disingenuous.

My complaint is ONLY relevant because the explicit meaning of this
particular lexeme is proposed to be "ignore me". The intended purpose
is for that lexeme to be filtered out of the byte stream before the
code is even tokenized. But by enabling its use in a compound lexeme,
suddenly the lexer has to determine if the character is an "ignore me"
non-token or if it's part of a "don't ignore me, just don't execute
me" annotation token. And not only does the lexer have to do this, so
does any human reading the code. "How many of these in a row are there
here?" when the contract of a single one is supposed to be "it doesn't
matter if I'm here or not, I don't do anything."
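
To make the complaint concrete, here is a minimal sketch (my own
illustration, not anything from the proposal) of the branch a lexer
suddenly needs: instead of unconditionally skipping the "ignore me"
character, it has to look ahead and count before it knows what it is
looking at. The backtick and the two category names are placeholders
for whatever the proposal actually specifies.

  -- Hypothetical sketch only: IGNORE_CHAR stands in for the proposed
  -- "ignore me" symbol, and the two string results stand in for the two
  -- categories the lexer would now have to distinguish.
  local IGNORE_CHAR = "`"

  local function classify(source, i)
    if source:sub(i, i) ~= IGNORE_CHAR then
      return "ordinary", i + 1
    end
    -- Can no longer just drop it: count the run to find out what it means.
    local j = i
    while source:sub(j + 1, j + 1) == IGNORE_CHAR do
      j = j + 1
    end
    if j == i then
      return "ignored", j + 1        -- a lone one really is ignored
    else
      return "annotation", j + 1     -- a run is a token with meaning
    end
  end

  print(classify("x = `1", 5))     --> ignored     6
  print(classify("x = ``f()", 5))  --> annotation  7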

That said, consecutive minuses actually do have this issue as well.
The other examples you give don't, because they're all binary
operators, but negation and predecrement are both unary prefix
operators. There's an ambiguity introduced in the grammar, and the only
way to resolve that ambiguity is by fiat. The C language specification
resolves it by instructing the lexer to take the longest sequence of
characters that forms a valid token ("maximal munch"), even if a
different tokenization would have resulted in a legal parse: for
example, "2--3" is tokenized as "2 -- 3" and not "2 - -3", even though
the former is illegal and the latter evaluates to 5. The only reason we
put up with this in modern programming is historical precedent. That's
not an excuse to make the problem worse; rather, we should take the
opportunity to learn the lesson.
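
For what it's worth, Lua's own lexer already resolves the two-minus
sequence by a fiat of its own: "--" always begins a comment, so the same
pair of characters silently means something different than two unary
minuses would. A quick illustration (plain Lua, using load(), which
accepts a string chunk in 5.2 and later; nothing here is specific to the
proposal):

  print(load("return 2 - -3")())  --> 5  (binary minus, then unary minus)
  print(load("return 2--3")())    --> 2  ("--3" is swallowed as a comment)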

/s/ Adam