It was thus said that the Great Philippe Verdy once stated:
> One "bad" thing about Lua, is that the value range and precision of Lua
> "numbers" is not strictly defined (this is the same problem as for "int",
> "flaot" and "double" datatypes in C/C++, which are implementation
> dependant: that's why POSIC defined a standard library of common
> constrained numeric types), so there's a portability problem. Such thing
> still varability about "strict" computing modes and rounding modes which
> can be tweaked locally, by some hints, which are not always mandatory...
> except in Java and Javascrip,t where "strict" numeric evaluation is
> enforcable by the language itself).
Where does it say this? I can accept it for Java, but I'm dubious about
JavaScript, which only has double-precision floating point for its numeric
operations (like Lua 5.2 and earlier).
> Lua also does not formally define the
> allowed values for infinities, NaNs, denormals, or signedness of zeroes
Um ... what? IEEE-754 (which is what most modern floating point is based
on) does define infinities (two, positive and negative), NaNs (only to say
there are two types---quiet and signaling; it says nothing else about them
except that NaN != NaN), supports denormals and gives two zeros (0 and -0).
If you mean for Lua to state "must support IEEE-754" then say so. But Lua
is based upon what C gives, and what C gives is not IEEE-754 (or at least,
does not mandate its use).
You're repeating exactly what I said: this is not defined formally in Lua, but by the implementation (for *example*, what C gives; but C is not a requirement of the language for implementing it).
> I would really suggest that Lua is extended to define a standard library of
> strictly defined numeric types (strict value ranges and precision, strict
> evaluation, rounding modes) to allow creating portable programs (even if
> this is at the price of speed). Lua for now is (like C/C++) tweaked in
> favor of performance, not necessarily portability (and semantics): each
> implementation then defines its own instance of the language, they are not
> really /stricto sensu/ the same language if each instance has its own
> limits. Unfortunately there's no simple way for a Lua program to safely
> detect these limits.
I have not noticed any differences between PUC Lua (5.1) and LuaJIT. I
have also not noticed any differences between the x86 32-bit version of PUC Lua,
the x86 64-bit version of PUC Lua, the SPARC 32-bit version of PUC Lua, and the
SPARC 64-bit version of PUC Lua. All have behaved (and currently are behaving) the
same.
That's not true: you give examples where Lua is used as a standalone application or embedded via the C library. There are other use cases where the C "ABI" is not used at all and Lua is integrated with something else.
> This has a consequence: it's not so simple to emulate in Lua another VM like
> WebAssembly, which allows running a full Linux OS inside a 100% pure
If WebAssembly can run Linux, then it certainly can run Lua, since it's
written in C, much like Linux.
You consider the exact reverse problem: yes, WebAssembly can certainly run Lua, but I doubt that Lua can safely (and portably) run WebAssembly (this is what I wrote) without breaking, except with specific implementations of Lua.
> you need to specify the supported platforms (i.e.
> specific instances of Lua, i.e. the set of supported Lua VMs).
> I bet that Lua will later define such a standard library, at which time it
> languages like Python, PHP, C/C++ and various kinds of "emulators") and it
> will be easier to port programs so that they'll run unchanged on other
> architectures (making use of GPUs/APUs/VPUs, or working over a computing
> grid over a network with Lua objects distributed on various hosts; but Lua
> will need other concepts, notably "security contexts" like in Java, or in
I'm not sure what you mean. I have myself written a computer emulator in
both C and Lua, and both work the same.
I've also written several emulators, interpreters and compilers over the years.
My first complete one was done in 1989, then extended over a period up to 2000, and it was really used in production and sold. It started as a data-reporting tool, then became a printer emulator, then a PostScript emulator engine, then was ported as a printer driver embedded in the application. It was later used to emulate terminal protocols as well, with fully customizable appearances driven by user scripts: scripts designed by users were compiled, stored, and reused to generate various reports on very voluminous data sets (also customizable, coming from multiple RDBMS systems, database file formats, or live data sources, with data queries/filters integrated in the user scripts as well). These were used, for example, in billing, accounting, financial analysis, commercial analysis, resource management, statistical analysis, scientific data aggregation, and measurements.

Not only were the languages developed for processing incoming data flows, but also for adapting results to various kinds of outputs (not just displays, but also printers, or databases to be used as a new data source that could be queried again). The language was thus entirely dedicated to data transforms. It was very fast, avoided writing many specific programs, and was ported to many systems (mainframes, MVS, VMS, various flavors of Unix, and even old versions of Windows; it also ran on MS-DOS or similar, notably in industrial environments, and later rapidly on NT and OS/2). It did not require any multiprocessing OS: it was written in C only (not a single piece of C++), and care was taken to make it as portable as possible, using basic POSIX capabilities and, when needed, emulating them when they were missing in the native OS. It was also integrable in several RDBMS (e.g. via Oracle PL/SQL, Sybase/MSSQL stored procedures, or Informix procedures, and much later on MySQL) and even emulated several SQL variants (rewriting SQL queries when needed).
The language also had a visual programming interface, not requiring users to write code: the visual designer generated the code itself from a schema drawn like in PowerPoint, with boxes and arrows freely movable with a mouse on a blank design sheet. (I did not develop this visual interface myself, but adapted the language to cover some of its needs by adding some syntactic features.) The whole tool contained several language parsers and generators: some dedicated to processing input, some to describing these inputs and filtering them, others dedicated to processing outputs, filtering them, or giving them a presentation. They were all based on a set of declarations (avoiding imperative ordered instructions), each focusing on specific parts of the input dataset, evaluating conditions that triggered an execution, and then acting as filters to produce one or more outputs. The "ordered instructions" were deliberately very limited (there was no need to define any branch or control of execution, not even a single loop). This is why it was possible to create a visual designer where time/sequencing was irrelevant: everything was descriptive and modifiable locally, and it was the tool that itself determined the sequencing of actions and produced most of the transforms needed, evaluating all the described conditions.
The complete tool also included a customizable scheduler (based on a database scheduler and/or a native OS scheduler). My colleagues used it to design a facet-oriented OO language, and I helped them define a compiler for it, which could compile to native C++ standalone applications working in the X11 environment and later on Windows (at that time there was still no standard graphic environment with feature-rich sets of visual components for the UI).
No, I did not say your last statement. Here again you consider the inverse problem. So this is only what "the Great Sean Conner" said...