lua-users home
lua-l archive


> In our game we managed to reduce fragmentation to 0%.

By your own admission, the fragmentation is not 0%: ''we "waste"
200/300k of memory every 100Mb allocated.'' :-)

> What we do is manage small allocations (smaller than 513
> bytes) and big allocations in different ways.
> For every allocation size between 1 and 512 bytes we allocate
> a 4K page where we store only objects of that one size;
> for example, one page with only 4-byte chunks, another page
> with only 32-byte chunks. We round allocation sizes up to a multiple of 4.
> Every chunk bigger than 512 bytes is allocated with a
> normal general-purpose allocator using a "best fit" algorithm.
> We "waste" 200/300k of memory every 100Mb allocated.
> With this solution we have an allocator that costs almost no CPU
> for small alloc/free and is relatively fast for big alloc/free,
> because the number of chunks managed by "best fit" is not so large.
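The small-object path described above can be sketched as follows. This is a hypothetical illustration, not the poster's actual code: sizes 1..512 are rounded up to a multiple of 4 and served from 4K pages that each hold chunks of a single size, while anything larger falls through to malloc (standing in for the "best fit" allocator). The free path and thread safety are deliberately omitted.

```c
#include <stddef.h>
#include <stdlib.h>

#define SMALL_MAX   512
#define PAGE_SIZE   4096
#define ALIGN       4
#define NUM_CLASSES (SMALL_MAX / ALIGN)   /* 128 size classes: 4, 8, ..., 512 */

typedef struct Page {
    struct Page *next;   /* other pages of the same size class */
    void *free_list;     /* singly linked list of free chunks in this page */
    unsigned char data[PAGE_SIZE];
} Page;

static Page *classes[NUM_CLASSES];

static size_t round_up(size_t n)
{
    return (n + ALIGN - 1) & ~(size_t)(ALIGN - 1);
}

/* Carve a fresh page into chunks of one size, all threaded on a free list. */
static Page *new_page(size_t chunk)
{
    Page *p = malloc(sizeof(Page));
    size_t i, nchunks = PAGE_SIZE / chunk;
    p->next = NULL;
    p->free_list = NULL;
    for (i = 0; i < nchunks; i++) {
        void *c = p->data + i * chunk;
        *(void **)c = p->free_list;   /* push chunk onto the free list */
        p->free_list = c;
    }
    return p;
}

void *pool_alloc(size_t n)
{
    size_t chunk, idx;
    Page *p;
    void *c;
    if (n == 0 || n > SMALL_MAX)
        return malloc(n);             /* "big" path: general-purpose allocator */
    chunk = round_up(n);
    idx = chunk / ALIGN - 1;
    /* Check existing pages of this class for space before making a new one. */
    for (p = classes[idx]; p; p = p->next)
        if (p->free_list)
            break;
    if (!p) {
        p = new_page(chunk);
        p->next = classes[idx];
        classes[idx] = p;
    }
    c = p->free_list;                 /* pop one chunk: O(1) */
    p->free_list = *(void **)c;
    return c;
}
```

A real implementation would also need a pool_free that maps a chunk back to its page (commonly done by aligning pages and masking the pointer) and returns it to that page's free list.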

The scheme you suggest would actually introduce some internal
fragmentation, due to unfilled 4k pages. But such a recursive allocator
is a good way of localising fragmentation while keeping the benefits of
fast pool allocation. I assume each pool size has a most-recently-used
stack, and you only allocate a new pool once you've checked the
existing pools for space?
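The most-recently-used lookup suggested here might look like this. It is a speculative sketch (names are illustrative, not from the original post): pools of one size class sit in a linked list, and whichever pool satisfies an allocation is moved to the head, so the common case finds space on the first probe and a fresh pool is only made when every existing one is full.

```c
#include <stddef.h>

typedef struct Pool {
    struct Pool *next;
    int free_chunks;   /* chunks this pool can still hand out */
} Pool;

/* Take one chunk's worth of space from the first pool that has any,
   moving that pool to the head of the list (MRU order). Returns NULL
   if every existing pool is full, signalling the caller to allocate
   a new pool. */
Pool *take_from_mru(Pool **head)
{
    Pool **link, *p;
    for (link = head; (p = *link) != NULL; link = &p->next) {
        if (p->free_chunks > 0) {
            p->free_chunks--;
            *link = p->next;   /* unlink from its current position... */
            p->next = *head;   /* ...and push it to the front */
            *head = p;
            return p;
        }
    }
    return NULL;
}
```

With allocations tending to cluster in time, the head pool usually has space, so the scan almost never walks past the first entry.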