Dear All,

I've built a Lua script interpreter on an embedded system. It works like a
charm, but I have an annoying problem. What I run is actually two
scripts. The first is loaded and run only once; it declares a global 2D
array of some userdata (allocated in C). The second script is loaded and
run to completion every time, in a repeated cycle. This second script
doesn't use the 2D array, but the array is "visible" in its registry
(i.e. I could use it if I wanted to).
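
For reference, the first script does roughly this (newitem stands in for
the real C-registered constructor; the actual name doesn't matter here):

    -- First script (run once): build the global 20x10 grid of userdata.
    grid = {}
    for col = 1, 20 do
      grid[col] = {}
      for row = 1, 10 do
        grid[col][row] = newitem()  -- one full userdata, allocated in C
      end
    end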

The problem I have is this:

- The 2D array is 20 columns x 10 rows (200 elements).
- If I fill it with userdata, a program cycle typically takes 30 to
40 ms, but every n iterations (where n can be anywhere from 15 to 25)
it "stops" for 1700 to 2500 ms!
- If I build the same 2D array but, instead of allocating userdata,
simply write integer numbers into each table entry, the cycle time
drops to a constant 20 ms and I never see the 1700 to 2500 ms delays
(see the sketch below).
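
Concretely, the only difference between the two runs is the inner
assignment (newitem again standing in for the real constructor):

    -- Variant A: 30-40 ms cycles, but 1700-2500 ms stalls every
    -- 15 to 25 iterations.
    grid[col][row] = newitem()  -- one collectable userdata per cell

    -- Variant B: constant 20 ms cycles, no stalls.
    grid[col][row] = 42         -- plain number, not a collectable object

Numbers are not collectable objects in Lua, so variant B leaves the
collector with 200 fewer live objects to traverse.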

After extensive testing, everything appears to be functionally correct:
no memory leaks, no out-of-bounds accesses, and the like. It appears
that the garbage collector somehow has something to do with it, although
I don't know why, since the 2D array is allocated only once (and never
referenced again) for the lifetime of the program. I also never observe
__gc events on the 2D array except, of course, when I quit the program.
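
Two experiments I can still run to pin this down (assuming Lua 5.1's
collectgarbage options):

    -- Once per cycle: a slow ramp in heap size followed by a sharp
    -- drop right at a slow iteration would point at a full collection.
    print(string.format("heap: %.1f KB", collectgarbage("count")))

    -- Separately: if the stalls disappear while the collector is
    -- stopped (memory will grow instead), the GC is the cause.
    collectgarbage("stop")
    -- ... run a few hundred cycles ...
    collectgarbage("restart")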

I want to understand this "random delay" behaviour. Is the GC involved,
and if so, why? When and where is it invoked in the Lua source code,
given that I only call lua_resume()? I can worsen or lighten the delay
effect by varying the number of columns, making the delay anywhere from
hardly noticeable to so long that the program appears to hang forever.
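
If the collector is the culprit, I suppose I could spread its work
across the cycles instead of taking one big hit at unpredictable times,
something like this (again assuming Lua 5.1):

    -- Tuning calls; these only need to run once.
    collectgarbage("setpause", 110)    -- start the next GC cycle almost at once
    collectgarbage("setstepmul", 200)  -- each step does more work

    -- At a convenient point in each program cycle: do a small,
    -- bounded amount of GC work, then return to normal processing.
    collectgarbage("step", 8)

But before tuning blindly, I'd like to understand the mechanism.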

Any help is appreciated.

Max