- Subject: Re: What's up with token filters (Re: New operators?)
- From: "Jérôme VUARAND" <jerome.vuarand@...>
- Date: Sat, 14 Apr 2007 04:33:04 -0400
2007/4/13, Rici Lake <firstname.lastname@example.org>:
> But these are not Lua, and they shouldn't pretend to be. If one wants
> a language to grow a large and useful array of third-party compatible
> libraries -- and I believe that you share that goal for Lua -- then
> it is vital that there be a standard Lua in which you can write those
> libraries. If you are using a source-transformation system to code
> those libraries in, then by all means provide the pre-transformed
> code as well, but make sure you distribute something which can just
> be require()'d by a casual Lua user.
Lua is not only a language, it's also one of the best virtual machines:
very portable, well tested, and well supported. As such it's very
powerful, yet still very fast. Not every VM can support features like
closures, proper tail calls, and coroutines.
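For instance, closures and coroutines compose in a few lines of standard Lua (a minimal illustration, not tied to any particular patch):

```lua
-- A closure capturing an upvalue...
local function counter()
  local n = 0
  return function() n = n + 1; return n end
end

-- ...driven from a coroutine, which the VM supports natively
local next_value = counter()
local co = coroutine.create(function()
  coroutine.yield(next_value())
  coroutine.yield(next_value())
end)

local _, a = coroutine.resume(co)
local _, b = coroutine.resume(co)
assert(a == 1 and b == 2)
```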
You can view Lua as a combination of two products: a backend, the VM,
and a frontend, the language (with the associated compiler targeting
that VM). You have two levels of interaction with that pair: you can
provide source code to the compiler, or you can provide bytecode
directly to the VM.
Both offer the same possibilities, but different constraints. Source
form is very tolerant of low-level modifications to the VM, such as
endianness, number format, or memory management. Bytecode is very
tolerant of high-level modifications, such as custom semantics or
syntax changes through token filters.
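Both paths can be exercised from standard Lua itself; a minimal sketch (assuming Lua 5.2+, where `load` accepts both source and binary chunks):

```lua
-- Path 1: hand source text to the compiler (frontend + backend)
local f = assert(load("return 1 + 2"))

-- Path 2: dump the compiled chunk to a bytecode string and hand it
-- directly to the VM, bypassing the compiler entirely
local bytecode = string.dump(f)
local g = assert(load(bytecode))

assert(f() == 3 and g() == 3)
```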
Many fear that token filters will introduce a lot of variants of Lua,
but there are already a lot of Lua variants. Token filters would
introduce variations in the frontend part of Lua, but the backend part
is already creating incompatible variations (bytecode that is not
portable across hardware architectures is a symptom of that).
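The token-filter proposal operates on the lexer's token stream, but the idea can be sketched at source level in plain Lua. This is a crude, hypothetical `filtered_load` helper (not the patch's actual interface) that rewrites C-style `!=` into Lua's `~=` before the compiler sees it:

```lua
-- Hypothetical source-level analogue of a token filter:
-- rewrite "!=" into Lua's "~=" before compiling.
local function filtered_load(src)
  -- gsub returns (string, count); extra parentheses keep only the string
  return load((src:gsub("!=", "~=")))
end

local f = assert(filtered_load("return 1 != 2"))
assert(f() == true)  -- compiled as: return 1 ~= 2
```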
So adding token filters wouldn't turn a standard language into a
multitude of languages. It would turn an existing multitude of
languages into an even bigger multitude of languages. While not easy,
I believe it's possible to make token filters (or any equivalent
solution) universal (for example by creating a portable bytecode
format, and changing the default distribution form of Lua programs
from source to bytecode). Just like the module system can *completely*
change the semantics of the language, an equivalent mechanism could do
the same for the syntax.
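As an illustration of how far a plain module can already bend semantics, here is a minimal sketch in the style of the well-known strict.lua (assuming Lua 5.2+ and a hypothetical `strictify` helper): it turns "undefined globals are nil" into "undefined globals are an error".

```lua
-- Hypothetical helper: make reads of undefined variables raise an
-- error, changing the language's usual nil-for-undefined semantics.
local function strictify(env)
  return setmetatable(env, {
    __index = function(_, k)
      error("undefined variable: " .. tostring(k), 2)
    end
  })
end

-- Compile a chunk against the strict environment (Lua 5.2+ 'load')
local env = strictify({ print = print })
local chunk = assert(load("return undefined_name", "demo", "t", env))
local ok = pcall(chunk)
assert(ok == false)  -- the read now raises instead of yielding nil
```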
Finally I'd like to mention that this has very large implications. Lua
may become much more than a scripting language. At the moment, when a
software project needs a scripting language, someone looks at what
exists, and either picks one or creates a new one. The reason for
creating a new scripting language is to have customized syntax and
functionality. Lua has already proven able to handle any
functionality. Customizable syntax would make Lua the perfect
replacement for *any* custom or domain-specific language. It would
make Lua the default solution to many programming problems.
(Sorry for being so verbose, but I think the question is important,
and everything needs to be said (multiple times if necessary) before
any move is decided.)