- Subject: Re: More about packaging (fwd)
- From: Mike Pall <mikelu-0406@...>
- Date: Mon, 7 Jun 2004 16:33:23 +0200
Edgar Toernig wrote:
> > - An extensive framework to support automatic unloading is very difficult
> > to get right and will not benefit most developers.
> Sorry? 2-5 lines of code in the Lua core plus the dlclose handling in
> the loader library. That's all!
Ok, after thinking about this thoroughly, you convinced me. The overhead
is negligible and the benefit is added orthogonality. I can see the value
in this, even though the average app won't need it right now. But since
dynamic frameworks (aka 'pluggable' apps) are becoming more and more common,
it will pay off sooner or later.
The real question is how to get this into the core (or else we can forget
about it). Reading the 3/2002 lua-l thread, it looks like the idea was not
adopted back then? :-/
Ok, back to the original topic:
I like your way for unifying static and dynamic C libraries. Comparable
approaches (Linux kernel modules, Python loadable modules ...) have proven
their value in practice, so please let's do it! I have been bitten by API
incompatibilities with plugins in other projects, so I can appreciate the
value of stringent and safe conformance checks on loadable modules.
And about the other open issues:
- C vs. Lua libraries: Diego seems to lean towards providing two different
  functions for loading them. Edgar would like to integrate both into a
  single function.
My opinion: the importing code should NOT need to know what kind of
library it is loading. I'm all in favour of having one and ONLY ONE
function to do all of this (whatever it is named or wherever the code
for it is located).
- Namespaces: I think there is a consensus that the inconsistencies in the
  current model (globals, _LOADED, return from require) should be removed
  (static C libraries, dynamic C libraries and Lua libraries all behave
  differently).
  Diego's proposal (local foo = require "foo" and not setting globals) has
  some merit, because it requires consistency from the coder. Otherwise you
  tend to forget to import a library and get away with it just because
  another library loaded earlier imported it (until someone changes the
  load order ...).
Edgar is right that a 'set globals only' model simplifies many things.
Libraries offering multiple namespaces have no straightforward equivalent
in the other model.
My opinion: maybe we can get the benefits of both:
- Imported libraries set one or more globals, the primary name being one
of them. The return value does not need to be cached (i.e. _LOADED
could be dropped except for backwards compatibility).
- require(name) checks whether _G[name] is set and either just returns
this value or loads the library and then returns _G[name].
- Lazy coders can use 'require "foo"' and just use the provided globals.
Some coders may prefer to add a bunch of calls to require during
initialization and then do not need to worry about it anymore.
- Less lazy coders can use 'local foo = require "foo"' and get the
speed benefit of GETUPVAL vs. GETGLOBAL.
- Coders that want a clean namespace and stringent checks can use a
special function at the top of their files (probably called 'module'
or 'package'). This function sets up a new environment table for the
caller, populated only with the require function. This forces the
use of 'local foo = require "foo"' for all libraries (even for "io",
"string" and so on).
- So everyone is happy, and it even mixes well, because the choice is up to
  the coder of the importing file (and not the coder of the imported
  library, the language used, or the linking model).
Oh, and I'm not picky about the name of the one grand unified import
function (I chose 'require' in the examples -- 'import' may be
another popular choice).