Much of the Lua interpreter has really impressed me. I was also impressed
by how easily I was able to add a little bit of extra functionality.

On the other hand, perusing the code does lead me to ask whether there is a
space efficiency issue with Lua. It looks to me like one is faced with the
following sizes:

TObject: 16 bytes. A type tag followed by a double; because of 8-byte
alignment, the effective size is forced up to 16 bytes.
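
For reference, here is roughly what I believe the relevant definitions
look like in lobject.h (paraphrased from memory, so field names may be
slightly off; it also assumes the double really does get 8-byte
alignment, as the figures here do):

    typedef union {
        void *p;        /* light userdata */
        double n;       /* lua_Number */
        int b;          /* boolean */
        /* ... plus a pointer for collectable objects ... */
    } Value;

    typedef struct TObject {
        int tt;         /* type tag: 4 bytes */
        Value value;    /* 8 bytes, aligned to 8, so 4 bytes of padding before it */
    } TObject;          /* sizeof == 16 */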

Node: 40 bytes. Two TObjects and then a pointer (32 + 4 = 36 bytes); padding
out to an 8-byte boundary takes us to 40 bytes.
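
And the hash-part entry itself, again paraphrased from memory:

    typedef struct Node {
        TObject i_key;      /* 16 bytes */
        TObject i_val;      /* 16 bytes */
        struct Node *next;  /* 4 bytes on 32-bit; struct padded to a multiple of 8 */
    } Node;                 /* 36 bytes rounded up to 40 */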

How much data is there really in a Node? Assuming 32-bit pointers, there is
essentially 1 + 8 + 1 + 8 + 4 = 22 bytes of information, or 26 bytes with
64-bit pointers. That's a waste of roughly 35-45% of the space.
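
If anyone wants to check the numbers on their own platform, a throwaway
program along these lines (compiled with the Lua source directory on the
include path) should do it:

    #include <stdio.h>
    #include "lobject.h"    /* TObject, Node */

    int main(void) {
        /* compare the padded struct sizes against the hand-counted
           information content (22 bytes with 32-bit pointers) */
        printf("sizeof(TObject) = %u\n", (unsigned)sizeof(TObject));
        printf("sizeof(Node)    = %u\n", (unsigned)sizeof(Node));
        return 0;
    }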

Has this arisen as an issue in practice, or am I just seeing a theoretical
problem? Cutting memory consumption is a performance win from the standpoint
of data caches, and potentially from the standpoint of garbage-collection
frequency.

Has a more densely packed Node ever been considered, or would that just lose
any gains by complicating the code that deals with Nodes?

For example, a Node could be represented as:

    typedef struct Node {
        Value k_value;      /* key payload */
        Value v_value;      /* value payload */
        struct Node *next;  /* chain link for hash collisions */
        lu_byte k_tt;       /* key type tag */
        lu_byte v_tt;       /* value type tag */
    } Node;

On a 32-bit architecture, that consumes 24 bytes (8 + 8 + 4 + 1 + 1 = 22,
padded to 24), which is close to optimal. On a 64-bit architecture it
consumes 32 bytes (26 padded to 32) and hence is less of a win.
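
The complication I can see is that ltable.c currently hands around pointers
to whole TObjects for keys and values (through the gkey()/gval() macros, if
I remember the names right). With the split layout those would have to
become paired value/tag accessors, something like the following (names
invented purely for illustration):

    /* hypothetical accessors for the packed Node */
    #define gkeyval(n)   ((n)->k_value)
    #define gkeytt(n)    ((n)->k_tt)
    #define gvalval(n)   ((n)->v_value)
    #define gvaltt(n)    ((n)->v_tt)

    /* e.g. copying a TObject into a node's value slot */
    #define setnodeval(n, o) \
        { (n)->v_value = (o)->value; (n)->v_tt = (o)->tt; }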

Is this worth playing with?

Mark