On Wed, Apr 26, 2017 at 2:08 PM, Hisham <h@hisham.hm> wrote:
> On 25 April 2017 at 10:48, Roberto Ierusalimschy <roberto@inf.puc-rio.br> wrote:
>>> BASIC on the Apple II was in many ways the ideal beginner's language.
>>
>> Dijkstra famously said:
>>
>>     It is practically impossible to teach good programming to students
>>     that have had a prior exposure to BASIC: as potential programmers
>>     they are mentally mutilated beyond hope of regeneration.
>>
>> I am fully convinced that this citation did (and still does) a huge
>> disservice to the teaching of programming.
>
> I don't dispute that, but as someone who started programming with
> BASIC myself (on the Apple II, no less!) I can attest that this
> citation has at least some grain of truth. To this day I think very
> concretely — but take into consideration that I learned that x=2 is a
> stateful variable assignment _years_ before I learned that x=2 is a
> mathematical equation (yes, I learned programming before elementary
> school algebra). In general I was able to overcome it — I was still an
> A-student in math in elementary school and I don't recall getting
> confused, but to this day I have a major problem with the shortcuts of
> more advanced mathematical notation. I realized that I parse math like
> a computer, so to me a lot of the stuff that math teachers write on
> their blackboards is full of (what I now know to call) type
> errors. In fact, my mind thinks so mechanically that I remember one
> instance where I turned in an assignment to you in Semantics class and
> you asked me if it was done using proof-assistant software, because
> every single step was spelled out no matter how trivial it was. :)

I learned how to deal with those type errors in physics class:
dimensional analysis! Once you've annotated everything with
sufficiently descriptive typing, it DOES end up working out!
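
A rough sketch of what I mean, in Lua since this is lua-l; the tiny
three-unit "dimension" table and the helper names are made up purely
for illustration, not taken from any real library:

    -- Carry a value together with its dimensions, e.g. {m = 1, s = -1}.
    local UNITS = { "m", "kg", "s" }

    local function quantity(value, dims)
      return { value = value, dims = dims }
    end

    local function add(a, b)
      for _, u in ipairs(UNITS) do
        -- mismatched dimensions are exactly the "type error"
        assert((a.dims[u] or 0) == (b.dims[u] or 0), "dimension mismatch")
      end
      return quantity(a.value + b.value, a.dims)
    end

    local function mul(a, b)
      local dims = {}
      for _, u in ipairs(UNITS) do
        dims[u] = (a.dims[u] or 0) + (b.dims[u] or 0)
      end
      return quantity(a.value * b.value, dims)
    end

    local speed = quantity(3, { m = 1, s = -1 })   -- 3 m/s
    local time  = quantity(2, { s = 1 })           -- 2 s
    local dist  = mul(speed, time)                 -- 6 m; the units work out
    print(dist.value, dist.dims.m, dist.dims.s)    --> 6   1   0
    -- add(dist, time) would trip the assert: metres plus seconds.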

> I do remember one occasion in which this
> mechanical-imperative-interpretation kind of thinking caused me
> trouble, though. The first database I learned was dBASE during high
> school, in which you always had to iterate tables using WHILE loops.
> When I got to the Databases class in college I had the _hardest_ time
> learning SQL, because my brain simply could not accept the magic that
> was going on in queries. Since I couldn't produce a mental model for
> executing those queries efficiently myself, my brain couldn't produce
> solutions that involved nested projections, selections and joins
> without being mind-boggled by all the gigantic intermediate tables
> that conceptually exist in the naive explanation of the model. I kept
> approaching problems in that class as "which table should I loop over
> first and where do I store this data". Since then, I got better at
> abstraction, but I think imperative interpretation will always be my
> most natural way of thinking.

I had a very different experience. My dad is a database professional,
so I learned SQL as a child because I was curious about what he was
doing and he was happy to teach me. I learned the theories and best
practices of database work long before I knew the formalisms that
drove them. (Learning tuple relational calculus in college was very
enlightening.)

In the end, as my own career progressed, I had to learn to do exactly
what you had to teach yourself to stop doing. Thinking about
databases purely declaratively gave me trouble when I started dealing
with nontrivial datasets. I had to learn how the query compiler
transformed my declarative statements into sequential actions so that
I could make sure it did things in the most efficient order. (To wit:
exclude as much data as possible as early as possible so the joins
have less work to do.)
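
Roughly, the planner's whole job is to turn the declarative query into
the loops Hisham describes, and "exclude early" is the difference
between the two sketches below (toy Lua with made-up tables, just to
show the shape of the work):

    -- Hypothetical data: orders reference customers by id.
    local customers = { {id = 1, country = "BR"}, {id = 2, country = "US"} }
    local orders    = { {cust = 1, total = 10}, {cust = 2, total = 99},
                        {cust = 1, total = 25} }

    -- Naive plan: walk every (customer, order) pair, filter at the end.
    local function join_then_filter()
      local out = {}
      for _, c in ipairs(customers) do
        for _, o in ipairs(orders) do
          if c.id == o.cust and c.country == "BR" then
            out[#out + 1] = { country = c.country, total = o.total }
          end
        end
      end
      return out
    end

    -- Better plan: apply the selection first and index the survivors,
    -- so each order does one cheap lookup instead of joining rows that
    -- were going to be discarded anyway.
    local function filter_then_join()
      local keep = {}
      for _, c in ipairs(customers) do
        if c.country == "BR" then keep[c.id] = c end
      end
      local out = {}
      for _, o in ipairs(orders) do
        local c = keep[o.cust]
        if c then out[#out + 1] = { country = c.country, total = o.total } end
      end
      return out
    end

Same result either way; the second just never materializes work for
customers the query was going to throw away.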

/s/ Adam