lua-users home
lua-l archive



There are libraries, such as LZO (unfortunately, its license is not easy for everyone), that can decompress in place.

I've used it for Ultimate Spider-Man on the PlayStation 2 (PS2), where it was decompressing 128 KB compressed chunks into a 256 KB (max) decompression buffer on the IOP chip (that chip is actually the PlayStation 1 CPU, used only for I/O, sound and networking). The chip was running at about 40 MHz, and I was able to get 5 MB/s of decompressed output, which was better than the average CD reading speed (whether constant angular velocity or variable).

All I had to do was patch the byte-by-byte writes into unaligned 32-bit integer writes, using a MIPS instruction that I've since forgotten.

Recently we have tried zlib (and bzip2 is even worse), but such libraries are very slow on consoles, and might now be slow even on systems with SSDs. (For us the point is to beat the read time, so decompression has to be faster than reading.)

But then again, zlib is ubiquitous and well known, and security-wise it seems to be better taken care of.

Other alternatives: LZX, LZF, QuickLZ - and just by googling them you will find many more like them. Most of these are only a page or two of code.

On 12/5/2011 8:41 AM, Axel Kittenberger wrote:
It will double the memory footprint, however, unless you unmap the
compressed source after you have uncompressed it. If it is declared
constant and sits in its own memory page, the Linux kernel will unload
it after a while when it is unused and memory is needed (since it can
always reload it from the binary on disc).

On Mon, Dec 5, 2011 at 5:35 PM, Patrick Rapin <toupie300@gmail.com> wrote:
  Are there other tools?

For all my Lua applications, I compress embedded sources using ZLib
(or BZip2 in one of them).
At application start, the code is uncompressed and fed to luaL_loadstring.
This decreases the size of the executable and hides
(slightly) the source code.