lua-l archive



> (…) Well, having deterministic finalization is really cool.

I work on Mac & iOS applications, I use RC every day, and I do appreciate RC and autorelease pools too. They give full control over memory allocation (and deallocation, of course).

Indeed RC requires more care than GC, but Cocoa has nice and clean rules about that: you are responsible for objects returned from +[alloc] and -[copy…] methods, or retained via -[retain]. I've just read some news on MacStories that iOS apps crash more often than Android ones, and I believe this may be true. But iOS apps crash, while Android apps silently leak memory and get sluggish. IMHO, as a developer I prefer my app to crash, as that makes the problem easier to spot. (BTW, there is also Automatic Reference Counting (ARC) in the latest iOS.)

However, I have absolutely no experience with real-time code (games) using RC for short-lived objects. I have always done that in C/C++ with manual memory management or the stack.

I can only presume that both regular GC and RC are fine for objects with longer lifetimes, but neither is good for small temporary objects, such as vectors & matrices that, in my case, exist only within the current scope as temporary buffers for a calculation and disappear after being sent to the GPU. That's why I think the GCC & Clang compilers go further and provide vector extensions that treat vectors as primitive types kept on the stack. Otherwise the CPU would spend most of its time managing memory, copying registers and moving data, and the least time doing the actual calculation.

Reading Mike's plans and http://codespeak.net/svn/pypy/extradoc/talk/pepm2011/bolz-allocation-removal.pdf, it seems that allocation sinking is the way to go for vector/matrix handling, while I think the remaining long-lived complex objects will continue to work fine with regular GC.

Regards,
-- 
Adam