- Subject: Re: GC and userdata objects
- From: Dibyendu Majumdar <mobile@...>
- Date: Thu, 12 Apr 2018 06:31:49 +0100
On 11 April 2018 at 22:45, Roberto Ierusalimschy <roberto@inf.puc-rio.br> wrote:
> So, keeping some huge userdata is not particularly relevant to the
> GC. What is relevant is the allocation of a huge userdata. You can
> communicate this to the GC by calling collectgarbage("step", x) with 'x'
> proportional to the size of the userdata.
>
Hi. I have tried using the GC step before, but did not quite succeed in
resolving similar issues; that may have been due to my lack of
understanding. Is there an example of how it should be used?
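For context, my userdata objects are created from the C API, so my reading
of your suggestion is roughly the sketch below. Only lua_newuserdata and
lua_gc are real API here; the function, the size estimate, and the
assumption that the step argument is in kilobytes (matching
collectgarbage("step", x)) are mine:

#include <stddef.h>
#include "lua.h"
#include "lauxlib.h"

static int parse_to_ast(lua_State *L) {
    size_t len;
    const char *chunk = luaL_checklstring(L, 1, &len);

    /* rough estimate of the AST's size; in my code this would be the
       actual size of the userdata being allocated */
    size_t nbytes = 64 * len;
    void *ast = lua_newuserdata(L, nbytes);
    (void)chunk; (void)ast;   /* ... parse 'chunk' and build the AST ... */

    /* tell the collector about the allocation: do GC work proportional
       to the size of the new userdata */
    lua_gc(L, LUA_GCSTEP, (int)(nbytes / 1024));
    return 1;                 /* return the userdata */
}
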
The scenario is this: I am creating an AST generator for Lua, with the AST
captured in a userdata. An AST is generated every time some code is
parsed, so this can be quite frequent. In a test program, many code chunks
are parsed, resulting in many userdata objects of varying sizes. I noticed
that the GC does not collect these objects fast enough. In this scenario,
how do you suggest I use the GC step parameter? Should I keep setting it
to a different value based on the size of each userdata, or keep track of
total userdata memory and set the step size accordingly (a sketch of what
I mean by the latter follows below)? Obviously the rate of allocations
varies over time, and the sizes vary as well.
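For the second option, what I have in mind is something like this (again
only a sketch; the bookkeeping variable and the threshold are mine, not
part of the Lua API):

#include <stddef.h>
#include "lua.h"

static size_t pending_bytes = 0;

/* called after each userdata allocation with its size */
static void note_allocation(lua_State *L, size_t nbytes) {
    pending_bytes += nbytes;
    if (pending_bytes >= 256 * 1024) {   /* arbitrary threshold, 256 KB */
        /* do GC work proportional to what has accumulated so far */
        lua_gc(L, LUA_GCSTEP, (int)(pending_bytes / 1024));
        pending_bytes = 0;
    }
}
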
Thanks and Regards
Dibyendu