[Date Prev][Date Next][Thread Prev][Thread Next]
- Subject: Explicit Delete vs Garbage Collection. Was: Re: About lua_newthread
- From: "Daniel Collins" <daniel.collins@...>
- Date: Fri, 9 Jun 2006 10:08:15 +0930
> No no, .NET's Dispose() is just their standardization (sp?) of the
> interface for releasing resources owned by classes. It isn't directly
> related to the lifetime of the object.
This is correct. Dispose lets objects release system resources such as
files while they wait for garbage collection. Dispose does not make
actual collection of the disposed object happen sooner, but it does make
the disposed object unusable.
> A fairly common approach in games and GUI systems is to mark
> objects as "dead" (i.e. wants to be deleted). Everyone who references
> check for this during regular operations, and eliminate their own
> reference if the object is flagged as dead. Simple and lightweight.
This is certainly common in games I have worked on. I have two
occurrences of systems like this so far. For the first, I have a number
of update lists. Objects register a function and that function gets
called once per game update. Objects can also register a timed callback,
specifying the number of milliseconds delay. At some point the callback
function is no longer required so objects can unregister the function.
This does not remove the function right away but sets a flag requesting
removal. The callback manager then iterates the list in a post-update
step and removes all flagged functions.
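That deferred-removal scheme could be sketched like this (a rough Lua
sketch; all the names are hypothetical, not the real engine code):

```lua
-- Hypothetical callback manager using deferred removal.
local manager = { callbacks = {} }

function manager.register(fn)
    manager.callbacks[fn] = { dead = false }
end

function manager.unregister(fn)
    -- Don't remove immediately; just flag for removal so the list is
    -- never mutated while it is being iterated.
    local entry = manager.callbacks[fn]
    if entry then entry.dead = true end
end

function manager.update(dt)
    for fn, entry in pairs(manager.callbacks) do
        if not entry.dead then fn(dt) end
    end
end

function manager.postUpdate()
    -- Sweep step: actually remove everything flagged during update.
    -- (Clearing an existing key to nil during pairs() is legal in Lua.)
    for fn, entry in pairs(manager.callbacks) do
        if entry.dead then manager.callbacks[fn] = nil end
    end
end
```

Flagging instead of removing means a callback can safely unregister
itself (or another callback) from inside update().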
The second instance is in the GUI system. Menu screens have an explicit
delete method which can be called after that screen exits on screens
that will never run again (or that are very unlikely to run again). This
is because the GUI system holds a reference to all screens inside the C
API, so screens will never be collected normally. I haven't made this
table have weak values because I wanted to enable constructs like the
following for simple menu screens where the return values of all the
factory methods are not stored permanently in any lua variables:
local screen = menuscreen.new(menu.createScreen(parent,
    menu.createButton(screen, 20, 30, "Do something" ....)))
The delete method causes the UI system to iterate all children of the
screen object, delete each of those, remove all callback functions, and
drop the C-side reference to the screen so it can finally be collected.
However, writing this out now makes me think it is a bad design.
If I had the requirement that the lua code should maintain a reference
to the menu screen until it is no longer needed (there is a table of all
menu screens already), then I could just add a __gc metamethod to the
screen userdata, make the C-side table weak and move all the cleanup
from the delete method into there. This would make things much more
Lua-like.
Except for one thing.
My menu screen objects are not the menu screen userdata. They are a
table that holds a reference to the screen userdata in a field. This is
so I can add an extra layer of functionality that is not provided by the
C UI system.
Will collection of a table automatically trigger the collection of all
objects referenced by that table (assuming they are the only
references)? If so, and I think this is the case, then I can make my UI
system much more Lua-like by removing the delete method.
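That is indeed how it works: once the wrapper table is unreachable, the
userdata it held becomes unreachable too (assuming nothing else
references it), so its __gc will eventually run, though possibly only in
a later collection cycle. A rough sketch of the proposed design, with
invented names for the C-side pieces:

```lua
-- Weak-valued registry on the C side: it no longer keeps screens
-- alive, so a screen userdata becomes garbage as soon as no other
-- Lua reference holds it (e.g. when its wrapper table is collected).
local screens = setmetatable({}, { __mode = "v" })

-- In Lua 5.1, __gc only fires for userdata and must be attached from C
-- (lua_setmetatable); conceptually the finalizer would do:
local function screen_gc(ud)
    destroyChildren(ud)   -- hypothetical: tear down child widgets in C
    removeCallbacks(ud)   -- hypothetical: unhook update/timer callbacks
end
```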
There is still a flaw in this system though. The UI is hierarchical. The
screen userdata has all the widgets as child objects, and children
cannot be orphaned. If the parent screen is deleted, all children *must*
be deleted. This is all on the C side and Lua doesn't know about this.
So if any lua objects other than the screen table held references to the
children of the screen userdata, things could be left in an inconsistent
state when the screen userdata is collected. The only way I can see to
manage this is through self-imposed rules about how these objects are
used. The other option is to never collect any of the menu screens, but
there just isn't enough memory to do that.
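One such self-imposed rule could borrow the "dead flag" idea quoted
above: when a screen is deleted, invalidate every child so a stale Lua
reference fails loudly instead of touching a freed C object. A sketch
with hypothetical names:

```lua
-- Hypothetical guard: every widget method checks a 'dead' flag before
-- calling into the C UI system.
local Widget = {}
Widget.__index = Widget

function Widget:setText(text)
    assert(not self.dead, "widget used after its screen was deleted")
    -- c_ui.setText(self.handle, text)  -- would call into the C API here
end

-- Called by the screen's cleanup for each child:
function Widget:invalidate()
    self.dead = true
    self.handle = nil   -- drop the pointer to the freed C object
end
```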
- DC (the long winded)