lua-l archive


I have been working this week on my Lua project (running on a fairly puny CPU). I hit a couple of issues that surprised me, and I am struggling to understand the internals well enough to explain the effects I am seeing. Any thoughts or explanations would be greatly appreciated.
Issue 1)
A few weeks back, my Lua source code was a single file of 120 kbytes or so. This took 2 seconds to load and compile to bytecode (the time to execute luaL_loadfile(L_user, filename)).
As the source file was getting too big, I split it into a main source file of 60 kbytes or so plus several require files. The source code has expanded a fair bit at the same time, so when I next measured the time to execute luaL_loadfile() I was expecting well over 2 seconds. What I actually measured was 1 second.
This wasn't anticipated at all, as I assumed the luaL_loadfile() phase read in all the require files and compiled the whole program to bytecode inside that function.
The timings seem to suggest that is not the way it works? I am completely guessing here now, but the other possibility is that when the subsequent lua_pcall() runs, it loads the require files and compiles them to bytecode at that point, before running the bytecode. If that is the case, when are the require files loaded: always right at the beginning of pcall(), or does it depend on where the require calls are placed in the Lua source?
Am I way off the mark here, or roughly on the right track?
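To show what I mean, here is how I am now imagining it works — require being an ordinary function call that only runs when execution reaches it (the module names below are made up):

```lua
-- main.lua: luaL_loadfile() compiles only THIS file to bytecode.
-- Nothing below actually runs until lua_pcall().

local utils = require("utils")      -- utils.lua is read and compiled here,
                                    -- i.e. during lua_pcall(), not loadfile()

local function rarely_used()
  -- if require really is just a call, this module would only ever be
  -- loaded if rarely_used() is actually invoked
  local heavy = require("heavy_module")
  return heavy.work()
end
```

If that picture is right, it would explain why luaL_loadfile() got faster after the split: it now only compiles the 60 kbyte main file.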
Issue 2)
While I was timing some functions I also measured the time to create the Lua state and open the standard Lua libs plus my custom library bindings. This took about 200 ms, which is roughly what I would have guessed and no big deal. I then also measured the time to execute the lua_close(L) line. Slightly to my horror, this took 8 seconds! In my system I need to stop and start the Lua task quite often, so this very sluggish close time is a nasty problem.
I was anticipating that the garbage collector destroying objects would be quite complex and slow, but I didn't think it would be that bad. From a peruse of the code it looks like it traverses a linked list of objects and frees them one at a time back to the system memory pool. I put some debug code in to try and understand what is going on; it would seem that the lua_close() function triggers the destruction of over 4000 objects (so it is taking approx 2 ms per object to free).
If I measure the amount of object freeing for a single line print("Hello") Lua program I get
objectFrees[LUA_TPROTO]=2                // Does anyone know what a TPROTO is? I can't guess that one.

Total objectFrees=1,294     (this is about 400 for a standard Lua build with no custom libraries)
So that is pretty big too; closing the Lua state for a do-nothing program would probably take 2.6 seconds or so.
Any thoughts on these numbers? Do the object counts look plausible? Any workaround ideas to speed up lua_close() or minimise the object counts?
I have a nasty feeling this is just something I am going to have to live with.
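One idea I am toying with (untested, so it may do nothing for the close time): force full collections while the system is idle, so that lua_close() has less left to free when it finally runs:

```lua
-- run while the Lua task is idle, before shutting it down
collectgarbage("collect")
collectgarbage("collect")  -- second pass catches objects a finalizer resurrected
```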
Regards Geoff