
On Tue, May 08, 2007 at 04:01:52PM -0300, Roberto Ierusalimschy wrote:
> > In this case, I think the argument for having #t return the real table
> > size isn't that it adds a feature, but that it removes an annoying
> > special case, leading to a more regular and predictable language.

> I don't think so. For instance, many programs use tables with both
> "keys" and "indices", such as this:
>   polyline = { color = "red", thickness = 4;
>                {0, 0}, {3.5, 4.3}, ...
>              }

> When they iterate over the indices, they want the size of the array part,
> not of the entire table.

Sometimes they might, sometimes they might not:


  for _, point in ipairs(polyline) do
	  drawto(point, polyline.color, polyline.thickness)
  end

And is that more common than something like this?

	students = {
		["Sam"] = {grade="C+"},
		["Roberto"] = {grade="A"},
	}

	print("class size", ... )

I really don't see why a collection whose keys happen to run in numerical
order from 1 to n with no holes deserves a membership count any more than
a collection keyed in other ways. Both counts are useful.
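Today, getting that count for a keyed table means walking it yourself. A
minimal sketch (the helper name `count` is mine, not anything in the
standard library):

```lua
-- count entries in any table by walking every key with pairs;
-- this is O(n), unlike #t on a hole-free array
local function count(t)
	local n = 0
	for _ in pairs(t) do n = n + 1 end
	return n
end

local students = {
	["Sam"] = {grade="C+"},
	["Roberto"] = {grade="A"},
}
print("class size", count(students))  -- prints: class size	2
```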

> Lots of people complain that #t does not handle holes in an array.
> Returning the "real table size" (that is, the number of elements in the
> table) would certainly not improve this situation.

I think it would. Instead of #t becoming unusable:

   If the array has "holes" (that is, nil values between other non-nil
   values), then #t may be any of the indices that directly precedes a
   nil value (that is, it may consider any such nil value as the end of
   the array).
     - 5.1 refman

it would become a predictable value, whether you have holes or not.
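The unpredictability the manual describes is easy to see. In the sketch
below, both 2 and 4 are valid results for #t under the 5.1 rules, since
each directly precedes a nil (which border you actually get depends on
the implementation):

```lua
local t = {1, 2, nil, 4}
-- 2 and 4 are both "borders": t[2] ~= nil, t[3] == nil; t[4] ~= nil, t[5] == nil
print(#t)  -- may print 2 or 4; the refman allows either
```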