- Subject: Re: Lua on Reddit again
- From: Henk Boom <henk@...>
- Date: Wed, 2 Feb 2011 13:54:46 -0500
On 2 February 2011 02:10, Dirk Laurie <email@example.com> wrote:
> Andrew Lentvorski <firstname.lastname@example.org> wrote:
>> > I guess it's because I think in terms of ranges. Things like micrometers
>> > and calipers were always natural to me while other people had a horrible
>> > time with them. Shrug.
> Even micrometers and calipers, if you are measuring 10 units,
> mark the place where you stop with "10", not with "9".
> Tony Finch wrote:
>> I agree with you and Dijkstra :-)
> I think of ranges in terms of first element, last element. First
> element is not always at the index origin. The moment you need
> to specify it explicitly it is no longer important whether it is 0
> or 1 or whatever.
> But that particular quote shows Dijkstra shooting himself in the foot.
> He starts off: “To denote the subsequence of natural numbers 2, 3, ...,
> 12 without the pernicious three dots, four conventions are open to us”,
> these being the four ways of combining one of < or <= with another.
> In Pascal one can write 2..12 — but dots are pernicious, and besides
> the notation was invented by Wirth!! So Dijkstra does not even consider
I often have to convert between continuous and discrete ranges when
doing things such as bucketing for collision detection. When working
with [bottom, top) floating-point ranges where bottom and top are
integers, mapping them to [bottom, top) integer ranges means that you
are working with a unified convention, so you can always know that
top-bottom is the length of the range, for example. However, systems
with one-based arrays favour a [first, last] approach, so you end up
being encouraged to use [bottom, top-1] as your integer range, which
means you're using two different conventions and two different sets of
operations in one code-base. It seems this problem fundamentally comes
from the fact that, while counting from 1 works well for integers,
when dealing with (pseudo-)continuous numbers, you almost never want
to start with 1.
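To make that concrete, here's a minimal sketch of the kind of bucketing I mean (the function names are mine, not from any library). Mapping a continuous coordinate onto integer bucket indices with math.floor naturally produces a zero-based, half-open range; keeping one-based tables means tacking a +1 on at the boundary:

```lua
-- Map a continuous coordinate x in [bottom, top) onto discrete buckets
-- of the given width. With zero-based indices the half-open convention
-- is uniform: the result lies in [0, n) where n = (top - bottom) / width.
local function bucket_index(x, bottom, width)
  return math.floor((x - bottom) / width)      -- zero-based: [0, n)
end

-- With Lua's one-based tables you add 1 at the boundary, so the discrete
-- range becomes the closed [1, n] while the continuous one stays [bottom, top):
local function bucket_index_1(x, bottom, width)
  return math.floor((x - bottom) / width) + 1  -- one-based: [1, n]
end

print(bucket_index(2.5, 0, 1))    -- 2
print(bucket_index_1(2.5, 0, 1))  -- 3
```

The +1 itself is trivial; the cost is that length and membership checks now follow two different conventions in the same code.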
I'd be the first to say that one-based arrays shouldn't stop you from
learning Lua, as in much code it makes little difference once you're
used to it. From a language design point of view, though, I can't make
a case for one-based arrays. I often run into cases where one-based
arrays lead me to more complicated code, as they're less suited to
continuous<->discrete mappings, but I have never run into an algorithm
which is _easier_ to implement with one-based arrays, other than the
array-based heap-sort algorithm.
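For what it's worth, the heap exception comes down to index arithmetic. A quick sketch (helper names are mine) of the parent/child formulas in both conventions:

```lua
-- One-based arrays (Lua's convention): the root is at 1, and the
-- formulas are just a halving and a doubling.
local function parent1(i) return math.floor(i / 2) end
local function left1(i)   return 2 * i end
local function right1(i)  return 2 * i + 1 end

-- Zero-based equivalents need an extra -1/+1 in each formula.
local function parent0(i) return math.floor((i - 1) / 2) end
local function left0(i)   return 2 * i + 1 end
local function right0(i)  return 2 * i + 2 end

print(left1(3), right1(3))  -- 6  7  (children of node 3, one-based)
print(left0(2), right0(2))  -- 5  6  (children of node 2, zero-based)
```

That's about the only place I've found where the one-based arithmetic is genuinely cleaner.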