- Subject: Re: The removal of function environments: An opportunity for optimization?
- From: Edgar Toernig <froese@...>
- Date: Sun, 23 May 2010 23:37:44 +0200
Mark Hamburg wrote:
> Does this mean that the compiler could now detect functions without
> upvalue references and treat them as constants so that we don't
> reconstruct them over and over again?
That would be really nice, especially if no memory allocation
would take place at all.
A further optimization could be a common closure object for all
instances of a function with the same upvalues:
    function foo(label, a)
        for _,v in pairs(a) do
            bar(v, function(x) print(x) end)
            baz(v, function(x) print(label, x) end)
        end
    end
The closure passed to bar could always be the same one, even
on different invocations of foo.
The closure passed to baz could be the same for all iterations
of the loop.
[Unfortunately, the iteration variables are reinstated for
each iteration so nested functions using them would always
have unique closures.]
I'm thinking about a single-entry cache in the funcproto
pointing to its last created closure (for the GC a weak ref).
When creating a new closure, one checks first if there's a
cached one and if it has the same upvalues. If so, it can
be reused; otherwise a new closure is created.
What I'm not sure about is whether these checks are cheap
enough. Closure creation stays O(nupvalues) but the constant
factor gets bigger. And the GC has a little more work to do.
One case where the optimization will not happen is if, e.g.,
baz calls foo recursively. The single-entry cache would then
only optimize the most deeply nested call.
On the other hand, with the new environment I think a lot of
functions will have an upvalue for _ENV.