Mike Pall wrote:

If you don't care much about loading time then this might work:
- Compress your (compiled) data files with gzip or bzip2.
- Write a chunkreader that autodetects a compressed file and uncompresses
 on-the-fly. The zlib/libbz2 manuals should have an example that does
 exactly that.
- Hook the chunkreader into your own dofile().
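For reference, a minimal untested sketch of such a reader (assuming the Lua 5.1
C API and zlib; gzopen/gzread read plain files transparently, so the
autodetection of compressed vs. uncompressed files comes for free):

#include <zlib.h>
#include <lua.h>
#include <lauxlib.h>

#define GZ_BUFSIZE 8192

typedef struct {
  gzFile f;
  char buf[GZ_BUFSIZE];
} GzReaderState;

/* lua_Reader callback: hand lua_load the next block of (decompressed) bytes. */
static const char *gz_reader(lua_State *L, void *ud, size_t *size) {
  GzReaderState *s = (GzReaderState *)ud;
  int n = gzread(s->f, s->buf, GZ_BUFSIZE);
  (void)L;
  if (n <= 0) {            /* end of file or read error */
    *size = 0;
    return NULL;
  }
  *size = (size_t)n;
  return s->buf;
}

/* Load and run a possibly gzip-compressed chunk, dofile-style.
   The same call works for compressed and uncompressed files. */
static int do_gzfile(lua_State *L, const char *filename) {
  GzReaderState s;
  int status;
  s.f = gzopen(filename, "rb");
  if (s.f == NULL) {
    lua_pushfstring(L, "cannot open %s", filename);
    return LUA_ERRFILE;
  }
  status = lua_load(L, gz_reader, &s, filename);
  gzclose(s.f);
  if (status == 0)
    status = lua_pcall(L, 0, LUA_MULTRET, 0);
  return status;
}

Error handling is kept minimal here; a real version would at least distinguish
a gzread error from end of file.
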
I don't consider file size itself a big problem; what bugs me is that bigger files take longer to load from disk. So this solution could actually help reduce loading time: halving the resource file size through compression, say, should save more time reading from disk than the decompression overhead it introduces. It all depends on how well the data compresses, of course. Since most of, say, an image file would be IEEE doubles holding integer values in the range 0-255 (which means most of the mantissa bits are zero and the exponents fall in a narrow range), the files should have low enough 'entropy' to compress well.

I tried compressing two model Lua files, one using strings (essentially zero overhead, like a normal binary file) and one using tables of Lua numbers (doubles). Uncompressed, the files are 16.6k and 58.12k respectively. gzip reduces them to 9.4k and 19.2k, while bzip2 gives 10.2k and 18.8k. I find this rather encouraging, and I'm definitely going to do it since it can be integrated so cleanly into dofile. Still, saving space in the Lua bytecode itself could lead to even smaller files...
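
As a side note on the low-entropy claim: dumping the raw bytes of a few doubles
holding small integer values shows that most of each 8-byte pattern is zeros,
which is exactly the kind of redundancy gzip/bzip2 exploit. A quick stand-alone
C check (nothing Lua-specific assumed):

#include <stdio.h>
#include <string.h>

/* Print the raw bytes of doubles holding small integer values,
   to see how much of each 8-byte pattern is zero or repeated. */
int main(void) {
  double values[] = { 0.0, 1.0, 127.0, 255.0 };
  size_t i, j;
  for (i = 0; i < sizeof(values) / sizeof(values[0]); i++) {
    unsigned char b[sizeof(double)];
    memcpy(b, &values[i], sizeof(double));
    printf("%6.1f:", values[i]);
    for (j = 0; j < sizeof(double); j++)
      printf(" %02x", b[j]);
    printf("\n");
  }
  return 0;
}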

Dimitris