Just a quick query to satisfy my curiosity about the new incremental
garbage collector: if I have a state with a significant number of
collectable objects, does collecting incrementally with lua_gc("step")
guarantee that they will eventually be collected?
The reason I ask is that the behaviour I'm seeing from the Lua VM is
that the GCthreshold remains constant at some high value, even though
the garbage collector has been through several sweeps. If I call
lua_gc and make it do a full (non-incremental) collect, it manages to
clear up all of the collectable data. If I don't do a full collect,
the data is never collected. No combination of parameters to SETPACE,
SETINCMODE or STEP seems to make a difference.
I would guess that I'm just misusing the system, but all I'm doing is
starting the incremental collection at load time (GCRESTART), then
calling SETPACE, SETINCMODE and then STEP (in that order) once every
tick.
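In outline the per-tick code looks something like the sketch below,
written here against the stock lua_gc constants (LUA_GCRESTART,
LUA_GCSETPAUSE, LUA_GCSETSTEPMUL, LUA_GCSTEP); the work-version
SETPACE and SETINCMODE calls may not map onto these exactly:

    #include <lua.h>

    /* once, at load time */
    void gc_start(lua_State *L) {
        lua_gc(L, LUA_GCRESTART, 0);       /* restart the collector */
    }

    /* once every tick */
    void gc_tick(lua_State *L) {
        lua_gc(L, LUA_GCSETPAUSE, 200);    /* pause between cycles (%) */
        lua_gc(L, LUA_GCSETSTEPMUL, 200);  /* collector speed relative to allocation (%) */
        lua_gc(L, LUA_GCSTEP, 0);          /* do one incremental step */
    }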
Any pointers to my obvious mistake?
You probably didn't make one. Tuning an incremental
gc is a difficult feedback problem. It is unlikely, and
unnecessary, that it can guarantee to eventually collect
all garbage.
What one would like to guarantee is that the amount
of garbage will converge to some reasonable number
when the program is in an equilibrium state, and
that there will be some reasonable upper bound on
total garbage at all times.
There has been discussion on this recently.
It seems the collector feedback 'circuits', and not
just the control parameter values, still need to be fiddled
with to achieve reasonable performance -- something which
is quite hard to define in the first place.
Roughly speaking, the collector needs to collect at least
as much garbage as is generated over a period of time, which
means that when you start allocating a lot of memory, it needs
to ramp up automatically. However, to be self-tuning it also
needs to know how much garbage exists and how much it is
collecting -- and the first piece of information is not
available except after a full collection.
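A crude sketch of that "ramp up" idea, using only the stock API
(this is a hypothetical driver loop, not what the collector itself
does internally):

    #include <lua.h>

    /* Hypothetical driver: do more GC work per tick when the heap grew
       more since the last tick.  Not how Lua's collector tunes itself. */
    static int last_kb = 0;

    void gc_tick_adaptive(lua_State *L) {
        int now_kb   = lua_gc(L, LUA_GCCOUNT, 0);   /* heap size in KB */
        int grown_kb = now_kb - last_kb;            /* growth since last tick */
        last_kb = now_kb;

        /* step at least a little, harder if allocation outpaces collection */
        lua_gc(L, LUA_GCSTEP, grown_kb > 0 ? 2 * grown_kb : 1);
    }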
To put this another way, you have a source generating
both signal and noise and want to filter out the noise,
but unfortunately you don't know what the signal
looks like until after the noise is removed. :)
This can work using prediction based on prior performance,
but it isn't easy to get right, particularly since computer
programs don't obey any simple physical laws :)
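For instance, a toy predictor might smooth the observed allocation
rate with an exponential moving average and size the next step from
that (purely illustrative; it is not the feedback logic the collector
actually uses):

    #include <lua.h>

    /* Toy predictor: exponential moving average of per-tick heap growth,
       used to size the next incremental step.  Purely illustrative. */
    static double predicted_kb = 0.0;
    static int    prev_kb      = 0;

    void gc_tick_predictive(lua_State *L) {
        int now_kb = lua_gc(L, LUA_GCCOUNT, 0);
        predicted_kb = 0.8 * predicted_kb + 0.2 * (double)(now_kb - prev_kb);
        prev_kb = now_kb;

        lua_gc(L, LUA_GCSTEP, (int)predicted_kb + 1);  /* collect at least a little */
    }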
--
John Skaller, mailto:skaller@users.sf.net
voice: 061-2-9660-0850,
snail: PO BOX 401 Glebe NSW 2037 Australia
Checkout the Felix programming language http://felix.sf.net