>I was just wondering whether anyone knows or has tried compiling lua
>script to byte code on a windows or linux platform, where the ints are
>32 bits in size, and then trying to execute this compiled lua chunk on a
>lua engine where the ints are natively 16-bit.

Sorry, it can't be done. To quote the luac man page:

  The binary files created by luac are portable to all architectures
  with the same word size. This means that binary files created on a
  32-bit platform (such as Intel) can be read without change in another
  32-bit platform (such as Sparc), even if the byte order (``endianness'')
  is different. On the other hand, binary files created on a 16-bit platform
  cannot be read in a 32-bit platform, nor vice-versa.

More precisely, the following sizes must match: int, size_t, lua_Number,
Instruction, and number of bits in each operand. See LoadHeader in lundump.c.
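
For illustration, the check amounts to something like the sketch below
(the helper and its parameter names are made up here; the real test is
LoadHeader in lundump.c):

  #include "lua.h"        /* lua_Number */
  #include "llimits.h"    /* Instruction (internal header) */

  /* Sketch only: the sizes recorded in the chunk header must equal the
     sizes on the loading platform, otherwise the chunk is rejected. */
  static int SizesMatch (int h_int, int h_size_t,
                         int h_Instruction, int h_Number)
  {
    return h_int         == (int) sizeof(int)
        && h_size_t      == (int) sizeof(size_t)
        && h_Instruction == (int) sizeof(Instruction)
        && h_Number      == (int) sizeof(lua_Number);
  }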

On the other hand, if you really must do this, then you can change LoadInt in
lundump.c to read 32-bit integers and truncate them to 16 bits, as long as
lua_Number and Instruction still match. You'll also need to change LoadSize
and LoadLines (the latter currently reads a whole vector of ints in one go).
Those are simple changes if you know the endianness of both platforms.
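
A minimal sketch of the loading-side change, assuming the chunk was written
little-endian and reading from a plain FILE* instead of the ZIO stream used
inside lundump.c:

  #include <limits.h>
  #include <stdio.h>
  #include <stdlib.h>

  /* Sketch: read a 32-bit little-endian integer from a precompiled chunk
     and truncate it to the native 16-bit int, refusing values that do
     not fit. */
  static int LoadInt32As16 (FILE *f)
  {
    unsigned long x = 0;
    int i;
    for (i = 0; i < 4; i++)                    /* assemble 4 bytes, LSB first */
      x |= (unsigned long) (fgetc(f) & 0xFF) << (8*i);
    if (x > (unsigned long) INT_MAX) {         /* overflows a 16-bit int */
      fprintf(stderr, "integer too large in precompiled chunk\n");
      exit(EXIT_FAILURE);
    }
    return (int) x;
  }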

Another possibility is to modify ldump.c to generate code for a 16-bit
platform; the changes are similar to the ones described above, but are
perhaps simpler to get right.
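
The dumping-side equivalent would look roughly like this, again only a
sketch for a little-endian 16-bit target and writing to a FILE* rather than
through the chunk writer that ldump.c actually uses:

  #include <stdio.h>
  #include <stdlib.h>

  /* Sketch: write a native int as the 2-byte little-endian integer a
     16-bit target expects, refusing values that will not fit. */
  static void DumpIntAs16 (int x, FILE *f)
  {
    if (x < 0 || x > 0x7FFF) {          /* would overflow a 16-bit int */
      fprintf(stderr, "integer too large for 16-bit target\n");
      exit(EXIT_FAILURE);
    }
    fputc(x & 0xFF, f);                 /* low byte first */
    fputc((x >> 8) & 0xFF, f);
  }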

Perhaps having such a modified ldump.c would be a good thing for people
working on embedded 16-bit systems. If there's demand, perhaps we can provide
official support for this.
--lhf