lua-l archive


We hit unbounded memory growth in ldo.cpp under certain circumstances in Lua 4.x. The problematic code raised the GC threshold on every call into the Lua parser: L->GCThreshold += (L->nblocks - old_blocks);. Under some workloads this line caused the GC never to run at all, so we had to remove it. Another parser problem is a crash in the parse_file function (the next one down in the same file). The call to lua_pop() there carries a comment assuming no GC happens during parsing. That assumption didn't hold under certain circumstances either: the GC would collect the filename string, leading to a crash. The call had to be moved two lines down in the source (alternatively, a reference could be used to anchor the string).
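The crash pattern behind the second fix is worth spelling out: in the Lua C API, a const char * obtained for a string on the stack is only guaranteed valid while that string stays reachable (e.g. still on the stack). An illustrative fragment of the corrected ordering (everything except lua_tostring/lua_pop is hypothetical, not the actual ldo.cpp code):

```
/* Illustrative fragment, not the actual Lua 4.x source. */
const char *filename = lua_tostring(L, -1);

/* WRONG here: lua_pop(L, 1);
   The parser below may allocate, trigger a GC, and collect the
   string while `filename` still points at it. */

parse_with_name(L, filename);   /* hypothetical parser call */

lua_pop(L, 1);  /* safe: pop only after the last use of filename */
```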


 -----Original Message-----
From: 	Thatcher Ulrich [] 
Sent:	Tuesday, January 07, 2003 6:41 PM
To:	Multiple recipients of list
Subject:	Re: The memory usage in LUA intensive calling appz

On Jan 08, 2003 at 02:42 +0100, Zdenek Stangl wrote:
> Hello,
> I'm new to this list. I've read a few posts on memory leaking and garbage
> collecting. Now I have just one question: is it possible to get growing
> memory usage in an application that frequently calls Lua script functions,
> pushing new tables with strings and cfunctions? Frequently = up to 50 times
> per second.

Yes, absolutely, if you generate new data that remains accessible.
For example:


history = {}

function handle_request(req)
  -- req is a request to be served

  -- save req in the history buffer
  tinsert(history, req)
end

If handle_request() gets called repeatedly, the history table will keep
growing and consuming memory, which can't be collected because it's
still reachable.  If instead you periodically do "history = {}" to
make the old data unreachable, or you don't use the history table at
all and don't otherwise keep links to old data, the garbage collector
should reclaim that memory.
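For instance, the periodic-clearing idea could be sketched like this (the counter and the threshold of 1000 are made up for illustration; tinsert is the Lua 4.x name, aliased here so the snippet also runs on Lua 5.x):

```lua
-- Sketch: drop old history periodically so the GC can reclaim it.
tinsert = tinsert or table.insert  -- Lua 4.x name vs. Lua 5.x

history = {}
requests_handled = 0

function handle_request(req)
  tinsert(history, req)
  requests_handled = requests_handled + 1
  if requests_handled >= 1000 then
    history = {}        -- old requests become unreachable ...
    collectgarbage()    -- ... and the GC can reclaim them now
    requests_handled = 0
  end
end
```

Whether you clear on a request count, a timer, or a memory measurement is a tuning choice; the essential point is that the links to old data must go away before the collector can do anything.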

The symptoms you describe sound a lot like you're somehow keeping
links to generated data.

Thatcher Ulrich