
Jamie Webb,

On 8/12/05, Jamie Webb <j@jmawebb.cjb.net> wrote:
> Your central argument seems to be 'symbols are easier to read than
> words'. I think this is simply false. It's not even a case of personal
> preference.  ....

Yes, that is what I believe, and it explains what I am doing.
I know perfectly well that there are people with other opinions --
I said as much at the beginning of my original posting in this thread.
But I do not believe I offend anybody by holding, and sometimes
expressing, my own opinion.  Explaining what I am doing does not
mean that I am trying to proclaim universal truths.

> You have created a set of symbols that are mnemonic for you. Because
> you invented them. Anyone else reading your code /even another
> symbol-lover/ will have to spend the time memorising e.g. that :cr:
> means coroutine.resume, and not carriage return. Likewise, people
> writing in your style will have to memorise that math.max translates
> to :/\:, and not :>?:, :^:, or any number of equally suggestive
> possibilities.

Frankly, I think that :cr: has zero chance of being interpreted as a
carriage return in the specified context, and I already said elsewhere
how the names inside :: are formed, but you know better...
Yes, in general it surely takes memorizing, but one does not
spend her whole life doing that.  Check other languages, such as
APL, J, and K, and you will see that people actually deal with _much_
larger symbol sets.  Or, for that matter, check Perl, or even C.
So what if memorizing is needed?  Does that prove that a wordy
syntax is more readable?  Because that is what you claim.
(Ah, and what of a person who does not know English well, but still
wants to program?)
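
To make the flavour concrete, here is a minimal sketch in plain Lua;
the alias names below are my illustration only (the :cr: syntax itself
was a proposal, not valid Lua):

    -- Short aliases bound to long library names.  Each programmer
    -- (or project) would pick their own set, which is exactly the
    -- memorization trade-off under discussion.
    local cr  = coroutine.resume    -- :cr: -> coroutine.resume
    local cc  = coroutine.create
    local max = math.max            -- the proposal wrote this as :/\:

    local co = cc(function(x) return max(x, 0) end)
    print(cr(co, -5))               --> true    0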

When I read a wordy program, I find I tend to do it linearly.
With a more varied, symbolic syntax, my eye is easily caught
by the symbols I am chasing, so it hops right to them,
skimming over some portion of the text.

In my understanding, writing programs is like writing formulae:
correct and unambiguous.  And concise.  Mathematicians have
spent centuries inventing and developing notation(s) for writing
formulae.  They do not do that to keep outsiders out; they do it
because they feel how important it is.  And I believe we
programmers should do the same -- only it is much more
difficult for us, because programming is such a demanding and
challenging new discipline.

> A similar argument against operator overloading I came across in a
> C++ book: suppose we have a class representing sets, and we write 'x &
> y'. Does it mean union or intersection? If you follow the mathematical
> logic it means intersection, and if you follow the typical English
> meaning it sounds more like union. So, even though it seems quite
> reasonable to use a symbol for this common set operation, it only
> introduces confusion just because people's minds work in different
> ways.

Correct, but that is resolved by learning and using different contexts
as needed (which a programmer does anyway).  After all, operator
overloading is indeed used, without much complaint.  Note that,
if you were right, we would have to abandon & in logic as well,
on the same grounds that it `introduces confusion'.
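
For concreteness, here is a minimal Lua sketch -- my own illustration,
not anything from the thread -- of such a context one simply learns: a
hypothetical Set type where '*' means intersection and '+' means union,
purely by convention:

    -- Hypothetical Set type.  The choice of '*' for intersection and
    -- '+' for union is a convention the reader must learn -- which is
    -- the point under discussion.
    local Set = {}
    Set.__index = Set

    function Set.new(items)
      local s = setmetatable({ elems = {} }, Set)
      for _, v in ipairs(items) do s.elems[v] = true end
      return s
    end

    Set.__mul = function(a, b)          -- intersection
      local r = Set.new{}
      for v in pairs(a.elems) do
        if b.elems[v] then r.elems[v] = true end
      end
      return r
    end

    Set.__add = function(a, b)          -- union
      local r = Set.new{}
      for v in pairs(a.elems) do r.elems[v] = true end
      for v in pairs(b.elems) do r.elems[v] = true end
      return r
    end

    local a, b = Set.new{1, 2, 3}, Set.new{2, 3, 4}
    local both   = a * b                -- {2, 3}
    local either = a + b                -- {1, 2, 3, 4}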
 
> Likewise, I've read numerous articles pleading with academics to
> define their notation when writing theoretical papers, so I'm clearly
> not alone in finding it extremely frustrating when authors just assume
> that everyone else understands their made-up notation, or the notation
> that they assume is standard because their teachers used it.

I do not recall assuming that my notation is standard or
readily understood by anyone.  It was just a proposal, and
something to provoke discussion, suggestions, and pointers
to similar things.  You are putting words in my mouth.

> Even
> something as trivial as the order of function composition isn't
> universal!

Isn't function composition always associative?  And when it is,
the order of grouping is irrelevant.
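
In LaTeX notation, associativity says the two groupings agree:

    $((f \circ g) \circ h)(x) = (f \circ (g \circ h))(x) = f(g(h(x)))$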

> So, as long as you are the only person who has to deal with your code,
> abbreviate to your heart's content. But don't claim that the result is
> more readable since no-one else will find it so. Because if they were
> the ones inventing the symbols, they'd come up with a different set.

The last sentence is probably true, but the word `because' at its
beginning is out of place: there is no logical implication linking it
to the previous sentence.  And that previous sentence contains an
unjustified claim: how do you know about `no-one else'?

Have fun,
    Boyko