Peter Shook wrote:
Andrew Teirney wrote:

I was just wondering whether anyone knows of, or has tried, compiling a Lua script to bytecode on a Windows or Linux platform, where ints are 32 bits wide, and then executing the compiled Lua chunk on a Lua engine where ints are natively 16-bit.

If you run luac on a 32-bit platform, its output is guaranteed to run on any other 32-bit platform; Lua can handle the endian switch. But the output will not run on a 16-bit platform. You'll have to somehow "fix" luac to output 16-bit stuff.
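The mismatch is visible right at the front of the dump: luac stamps each precompiled chunk with a header recording the byte order and the native sizes of int, size_t, Instruction, and lua_Number. Below is a minimal sketch of reading that header from Lua itself, assuming the Lua 5.1 layout (a 4-byte "\27Lua" signature, then one byte each for version, format, endianness, the four sizes, and an integral flag); other versions lay the header out differently.

-- Inspect the header of a precompiled Lua 5.1 chunk.
-- (Byte layout assumed from Lua 5.1; other versions differ.)
local function describe_header(chunk)
  assert(chunk:sub(1, 4) == "\27Lua", "not a precompiled chunk")
  local version, format, endian,
        int_size, sizet_size, instr_size,
        number_size, integral = chunk:byte(5, 12)
  print(string.format("version 0x%02x, format %d, %s-endian",
                      version, format, endian == 1 and "little" or "big"))
  print(string.format("sizeof(int)=%d  sizeof(size_t)=%d",
                      int_size, sizet_size))
  print(string.format("sizeof(Instruction)=%d  sizeof(lua_Number)=%d%s",
                      instr_size, number_size,
                      integral == 1 and " (integral)" or ""))
end

-- The header the running VM itself produces, for comparison:
describe_header(string.dump(function() end))

On a stock 32-bit build this prints sizeof(int)=4; a 16-bit target would stamp 2 there, which is why the two dumps cannot be interchanged without rewriting the chunk.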

Would this problem be cured by making llimits.h's Instruction type the
same size on all platforms?  Or if it isn't the Instruction type,
what specifically about the resulting bytecode forces the compiler
and the interpreter to run with the same system integer size?
(And could this be patched up easily at dostring()-time when the
string is a bytecode chunk, since Lua doesn't run code 'in-place'
anyway?)

I had plans to deploy the same bytecode archives on 32-bit
and 64-bit platforms, so this potential interoperability
problem now worries me a bit.
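For reference, in Lua 5.1 at least, the undumper does not fix fields up individually: it compares the whole 12-byte header against the one the running VM would emit and rejects the chunk on any difference, so matching the Instruction size alone would not be enough. Here is a quick compatibility probe from the Lua side under that same assumption, with "archive.luac" standing in as a hypothetical file name.

-- Does a precompiled chunk match the running VM's platform?
-- Mirrors the byte-for-byte header comparison the undumper
-- performs on load; the 12-byte size is Lua 5.1's.
local HEADER_SIZE = 12

local function chunk_compatible(chunk)
  local ours = string.dump(function() end):sub(1, HEADER_SIZE)
  return chunk:sub(1, HEADER_SIZE) == ours
end

local f = assert(io.open("archive.luac", "rb"))  -- hypothetical path
local chunk = f:read("*a")
f:close()
print(chunk_compatible(chunk) and "loadable here"
                               or "built for a different platform")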

Regards,
--Adam
--
Adam D. Moss   . ,,^^   adam@gimp.org   http://www.foxbox.org/   co:3
Bereaved relatives are not amused
As on their dear departed I feverishly consume