- Subject: RE: Table performance
- From: "Kevin Baca" <lualist@...>
- Date: Tue, 4 May 2004 22:11:43 -0700
I agree with everything Ashwin said. I also use Lua for UI layout and
rendering, plus much more, including some physics, resource management,
event and message passing - the list goes on.
The only spike that ever shows up in a profile is Lua's GC, which points
to the need for careful object management above all else.
I will note, however, that the incremental GC implemented in 5.1-work0
has already reduced that GC spike in my profiles.
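To give a rough idea of what I mean by careful object management, here is
a small sketch (all names made up) that recycles tables through a pool so
per-frame temporaries stop feeding the collector:
-- Tiny table pool: acquire() hands out a recycled table,
-- release() wipes it and keeps it around for reuse.
local pool = {}

local function acquire()
    return table.remove(pool) or {}
end

local function release(t)
    for k in pairs(t) do t[k] = nil end  -- clear stale fields
    table.insert(pool, t)
end

-- Usage: grab a scratch table, fill it, hand it back.
local msg = acquire()
msg.type, msg.x, msg.y = "move", 10, 20
-- ... dispatch msg ...
release(msg)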
Additionally, the same sorts of peephole optimizations that programmers
apply to C/C++ code also help Lua code. For example, instead of doing
this:
begin loop
    this.map.tileList:draw( x * this.map.cellSize.x,
                            y * this.map.cellSize.y,
                            this.map(x,y) )
end loop
Do this:
local map = this.map
local tileList = map.tileList
local sizex = map.cellSize.x
local sizey = map.cellSize.y
begin loop
    tileList:draw( x * sizex, y * sizey, map(x,y) )
end loop
This reduces the table accesses per loop iteration from ten (every field
lookup, plus the method lookup for draw) down to one (just the draw lookup).
It not only makes the code faster, but also more readable.
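If you want to see the difference for yourself, a quick-and-dirty timing
sketch like the one below will do. The draw stub, cell sizes and iteration
count are made up; substitute your own:
-- Compare repeated nested lookups against hoisted locals.
local this = {
    map = { tileList = { draw = function() end },  -- stand-in for the real draw
            cellSize = { x = 32, y = 32 } }
}
setmetatable(this.map, { __call = function() return 0 end })  -- so this.map(x,y) works

local N = 1000000

local t0 = os.clock()
for i = 1, N do
    this.map.tileList:draw( i * this.map.cellSize.x,
                            i * this.map.cellSize.y, this.map(i, i) )
end
print("nested lookups:", os.clock() - t0)

local map      = this.map
local tileList = map.tileList
local sizex    = map.cellSize.x
local sizey    = map.cellSize.y

t0 = os.clock()
for i = 1, N do
    tileList:draw( i * sizex, i * sizey, map(i, i) )
end
print("hoisted locals:", os.clock() - t0)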
-KB
>
> > What I'm looking at the possibility of is a game creation
> > system that runs Lua entirely. This means some performance
> > critical drawing loops that typically would be in C++ could
> > end up in Lua.
>
> I hear what you're saying. My 2 cents worth: I think Lua is
> excellent for creating/rendering/driving graphical user
> interfaces. Here's why:
>
> For an information management system I've created a Lua-based
> runtime that handles (amongst other things) all the graphical
> UI rendering. So, it does not rely on any Windows controls
> for rendering whatsoever. Everything you see [buttons,
> listviews, fields, tables, scrollbars, &c.] is controlled and
> rendered through Lua.
>
> Now, I don't mean to toot my own horn, but... this setup
> works *really* splendidly [toot-toot! ;-)]. Even fairly
> complex UIs run & render very smoothly. And that's on the
> 400MHz machine I mentioned in my previous post.
>
> Obviously, you do need to know what you're doing. So, I think
> it's an excellent idea to run the tests you're running right
> now. It's pretty much the way I set out, way back somewhere
> in 2002. Do keep in mind that it might be much more
> worthwhile to run more 'practical' tests, though. Maybe you
> could wrap a few of your rendering primitives and see how
> well it goes?
>
> Just to be clear: creating the runtime I (partly) described
> was not accomplished in a mere few days. In fact, quite a lot
> of water went under the bridge before I was satisfied with
> the results. But the effort was worth it: the end result is
> an extremely thin and very flexible runtime, that's a dream
> to program.
>
> For example, I can create a new UI in a few lines of Lua
> script, or write 1 line to load a form definition (written in
> Lua table syntax) instead. And if I ever need to tweak basic
> widget behaviour [a very rare occurrence, these days] I can
> simply open its script in Vim, make the changes, close my
> application, start it again, and start testing.
>
> That's right: no compiles, no links, and... no crashes. If
> anything is amiss, my runtime shows a messagebox with the Lua
> alert, including the stack trace. Once you have reached this
> point, development goes very quickly. And it's a very
> pleasant way to work as well. I cannot help but smile when I
> think back to those heady days when I did all my front-end
> work in C++... Trust me when I say: I am never going back!
>
> > I really like being able to write 10-20 lines of Lua instead
> > of 50-100 lines of C++. I just want to make sure I don't run
> > out of CPU cycles too quickly.
>
> Well, I think your test/benchmark approach is sound. But,
> if possible, just try out a few actual scenarios to get a
> really good feel for what's happening. Replacing countless
> lines of C++ with a couple of lines of Lua is definitely
> achievable. I know, because I have done it.
>
> Life with Lua is good!
>
> I hope this helps.
>
> Ashwin.
> --
> no signature is a signature
>