
On Fri, Aug 12, 2005 at 06:51:55PM +0300, Boyko Bantchev wrote:
> Frankly, I think that :cr: has zero chances to be interpreted as a
> carriage return in the specified context,

Not if I thought about it, but that's what would spring into my head
upon seeing it for the first time. And it would take a fair amount of
inspection for me to figure out the actual meaning. Even if I'd never
seen Lua before, coroutine.resume would leave little to guess at.
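
To make the contrast concrete, here is an ordinary coroutine fragment
in plain Lua (my own example, not taken from the proposal); even a
reader who has never seen Lua can guess what coroutine.resume does,
which is exactly the point:

    local co = coroutine.create(function (a, b)
      print("started with", a, b)
      local c = coroutine.yield(a + b)   -- hand back a+b, wait to be resumed
      print("resumed with", c)
    end)

    print(coroutine.resume(co, 1, 2))    -- prints "started with 1 2", then: true 3
    print(coroutine.resume(co, 10))      -- prints "resumed with 10", then: true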

> Yes, in general, it surely takes memorizing, but one usually does
> not spend her life doing that.

At the moment, I regularly program in 5 languages (C, C++, Java,
Scala, and Lua). I have used probably another 30 to varying degrees in
the past. Mostly, the symbols and terminology are sufficiently similar
that I can switch between them with only a little trouble. Some of
them have different symbols for the same thing (<> != /= ~=...), and
the same symbol sometimes means different things in different
languages. It's a pain, but most language designers try to be
reasonably conservative when inventing operators, so it's manageable.
Imagine if all of those languages replaced their keywords and
common functions with more symbols, each according to the designer's
taste, and each trying to keep to short sequences of characters.
Could your brain cope with that? Mine couldn't.

> Check other languages, such as
> APL, J, K, and you will see that people actually deal with _much_
> larger symbol sets.

Do you have an APL keyboard? Is it a coincidence that those languages
are now almost extinct, despite their undeniable elegance in some
respects? One could perform incredibly complex calculations in a
single line of code, but it would later take an expert to figure out
what the thing meant.

> Or, for that matter, check Perl, or even C.

Perl is great for throw-away hacks, because it is so concise, but Perl
code is widely recognised as being unmaintainable unless one follows
style guidelines that exclude many of the features of the language,
including much of the line-noise.

I do not consider C to be an excessively symbol-heavy language (there
are some I would drop, but not many).

> So what if memorizing is needed?  Does that prove that a wordy
> syntax is better readable?  Because you claim so.
> (Ah, and what if a person does not know English well, but still
> wants to program?)

Which is better to learn: words in a language used almost universally
by programmers throughout the world, or the set of symbols someone
cooked up for this week's task?

> When I read a wordy program, I feel I tend to do it linearly.
> If there is more varied, symbolic syntax, my sight is easily
> caught by the symbols I am chasing, so it easily hops right
> there, skimming over some portion of the text.

I think most people find that syntax-highlighting editors do a better
job of that.

> In my understanding, writing programs is like writing formulae.
> Correct and unambiguous.

Fair enough. But properly defined words do that quite well.

> And concise.  Mathematicians have
> spent centuries in inventing and developing notation(s) to write
> formulae.  They do not do that to keep outsiders out; they do it
> because they feel how important this is.

Mathematical notation was largely invented before the age of
calculators, spreadsheets, Mathematica, etc. Mathematicians expected
to have to do pages of working on paper to solve problems. That they
developed a concise notation for that is no surprise at all, and I
don't believe it has anything to do with communication. There is
nothing about a symbol that makes it intrinsically more meaningful
than a word. It's just a convenient shorthand.

> this is much more
> difficult for us because programming is such a demanding and
> challenging new discipline.

Yes. We actually have to work in teams. We have to write code that
others can understand. That's something that mathematicians over the
ages appear not to have been too bothered about.

> Correct, but that is resolved by learning and using different contexts
> as needed (which a programmer does anyway).

How many contexts do you want to have to deal with? Wouldn't you like
to take as much as possible from one to the next?

> After all, operator
> overloading is indeed used, without much complaints.

Not entirely true...

> Note that,
> if you were right, we would have to abandon &s in logic as well,
> on the same ground that it `introduces confusion'.

No more than we have to abandon '+'. Operators clearly serve a
purpose, and I'm not arguing against those that /have/ become widely
accepted. Just against introducing new ones specific to a particular
language on a whim.

> I do not recall to have assumed that my notation is standard or
> readily understood by anyone.  It was just a proposal, and
> something to provoke discussion, suggestions, and pointers
> to similar things.  You are putting words in my mouth.

My point is that if you released code using this notation, and anyone
else had to maintain it, you would be assuming that they knew it, or
wanted to sit down and learn it. A great strength of Lua is that one
can understand many Lua programs quite readily without ever having
programmed in it.

> Isn't function composition always associative?  Because when
> it is, the order is irrelevant.

I'm talking about order, not associativity. I.e. does (f o g)(x) mean
f(g(x)) or g(f(x))?
(Where 'o' is the function composition operator... :-)
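
Both readings are trivially definable, sketched here in plain Lua
(the names are mine), and they give different answers:

    -- reading 1: (f o g)(x) == f(g(x))
    local function compose_fg(f, g)
      return function (x) return f(g(x)) end
    end

    -- reading 2: (f o g)(x) == g(f(x))
    local function compose_gf(f, g)
      return function (x) return g(f(x)) end
    end

    local double = function (x) return 2 * x end
    local inc    = function (x) return x + 1 end

    print(compose_fg(double, inc)(3))  -- 8, i.e. double(inc(3))
    print(compose_gf(double, inc)(3))  -- 7, i.e. inc(double(3))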

Regardless, it's not terribly relevant.

> > So, as long as you are the only person who has to deal with your code,
> > abbreviate to your heart's content. But don't claim that the result is
> > more readable since no-one else will find it so. Because if they were
> > the ones inventing the symbols, they'd come up with a different set.
> 
> The statement of the last sentence is probably true, but the word `because'
> at its beginning is out of place.  There is no logical implication linking this
> sentence to the previous one.  And that previous sentence contains an
> unjustified statement: how do you know about `no-one else'?

Of course there's an implication. By introducing symbols that have
only the meanings you assign to them, you introduce a learning curve
that was not there previously. That's a problem. Becoming fluent in a
language takes a long time.

You're an academic, right? The great thing about being an academic is
that if someone else doesn't understand your work, you are free to
believe that they're just not smart enough, or should try harder. In
the real world, they have to be able to understand it anyway,
preferably without too much hand-holding.

You really should listen to the advice given by numerous people in
this thread: think more of the people who don't have an insight into
the way your mind works.

-- Jamie Webb