I've had a sudden burst of inspiration to design an expression-oriented language. I had never looked into Lua prior to this, but after doing some research I came across it and have subsequently fallen in love. Most programming languages are designed by engineers, but a language is essentially a UI and should be designed as such -- Lua feels like a very clever convergence between the two. I have a few questions/assertions about the language to see if I understand its design. I don't have experience with language or compiler design, so some of these questions are general. Let me first note that I haven't coded anything in the language yet, which in many ways is rude of me, but I'm operating under a compulsion to understand and discuss it.

I've read that a Lua program IS a table. This would suggest that it would be trivially simple to save the state of a Lua program or subroutine by writing out said table, which has a lot of intrinsically awesome properties if my assumption is correct. How does this work, conceptually, in relation to the program as a running procedure? Are we just iterating through a table of expressions, or is there a special way in which we step through the program? It seems like we're stepping through a table of expressions and throwing things onto a stack until we have an evaluation, which may or may not bubble up in scope to modify values in some other tables. Is it as LISP-like as it seems?
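To make my assumption concrete, here's the kind of naive table dump I have in mind (untested, since as I said I haven't written any Lua yet, and it surely ignores cycles, functions, and plenty of other details):

-- naive sketch: write a table back out as a Lua table constructor
-- (assumes no cycles, no functions/userdata, and stringifies every key)
local function dump(t)
  local out = {"{"}
  for k, v in pairs(t) do
    local val
    if type(v) == "table" then
      val = dump(v)                         -- recurse into subtables
    elseif type(v) == "string" then
      val = string.format("%q", v)
    else
      val = tostring(v)                     -- numbers and booleans
    end
    out[#out + 1] = string.format("[%q] = %s,", tostring(k), val)
  end
  out[#out + 1] = "}"
  return table.concat(out, " ")
end

-- prints the table as one line of Lua source (key order is whatever pairs() gives)
print(dump({x = 10, msg = "hello", nested = {flag = true}}))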

It seems like all data is or could be perfectly well-managed. Why is there a need for a GC? When we leave scope all local variables get freed, and whenever a table is nil'ed we're implicitly freeing that memory. I don't understand GCs very well -- in C, we malloc and free memory whenever we need to. It seems like Lua's routine for running a program could implicitly determine when and how much memory ever needs to be allocated/freed based upon nil assignments, scope, and the evaluation stack. To clarify: suppose we define a table and assign it to a variable. Does Lua put that table into memory and have the variable point to it a la conventional paradigms? My LISP senses tell me that this shouldn't be the case. Since the program is a table, it seems like Lua could skip allocating memory until the value was necessary to evaluate some other expression. Instead of putting the table in memory, we can just have the variable point to the location of its defining expression in the table. In this sense, it seems like we should just be caching and pointing to expressions until they're needed on an evaluation stack. Is that kind of how Lua works? If so, couldn't Lua implicitly manage memory without explicit garbage collection?
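To pin down the part I'm unsure about: if I've read the manual correctly, variables hold references to tables rather than the tables themselves, so several variables can keep the same table alive, which is where my "just free it when it leaves scope" intuition seems to break down:

local a = {value = 10}
local b = a          -- no copy is made; b refers to the same table as a
a = nil              -- the table is still reachable through b
print(b.value)       --> 10
b = nil              -- only now is the table unreachable, and the GC may reclaim it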

This leads me to another point of conjecture with regard to memory management. This is more of a general question about language design. How are static values stored in memory? Suppose I have an expression,
X = 10
In C, we have a zillion different integer types to wrap memory allocation expectations into the definition. That all makes sense, but in the above expression, we don't need more than 4 bits. In C, we can malloc a few bytes of space, assign it to a variable, and just treat it like an integer -- afaik, the type declarations in plain old C are not much more than syntactic sugar. However, suppose we have:
X, Y = 10, 10
Y = X + Y
X and Y, in my mind, should both initially point to the same static 4-bit value of 10. Then they're put on an evaluation stack, the result is cached, and then Y points to a new static 5-bit value of 20. Now let's do,
X = 4
Well, we already have a static representation of 4 in memory if we point to the low three bits of the 5-bit representation of 20. It seems like we shouldn't have to allocate any more memory, but rather push the new static value through a bitmask of already existing static values -- if it exists, we return the memory address; if it doesn't, we allocate. Is something like this just too slow? Something about it seems more appealing than naively assigning a bunch of memory to a variable that's never going to use it all. Since the expression is evaluated before being assigned to Y (in the Y = X + Y example), is overwriting the memory that Y points to any more efficient than allocating new memory and freeing the old? This is something I never quite understood about simple data types and memory management. The issue with arrays in C is obvious, but if I'm tossing around integers, why can't we allocate based on the stack?

It seems like all data in Lua are tables -- that there aren't actually any simple data types, but rather syntactic sugar to make them feel simple. I.e., a number "type" is cloned or generated from an expression that outlines the metatable necessary for a number to function in the way we expect. Is this how Lua does it? If it is, very cool -- I can see how that would streamline the engineering side of the language. Do all of Lua's primitives function in this manner? Or am I just completely off on this assertion? With the above memory concerns, it seems like Lua could easily manage memory without a GC, implicitly calling malloc/free in the appropriate places -- or is this what it is essentially doing?
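What I'm picturing is something like building a number out of a plain table plus a metatable that supplies the arithmetic -- my own sketch of the conjecture, not something I've confirmed about Lua's internals:

-- conjecture: a "number" made of a table and a metatable (not Lua's actual representation)
local NumberMeta = {}
NumberMeta.__add = function(a, b)
  return setmetatable({value = a.value + b.value}, NumberMeta)
end
NumberMeta.__tostring = function(n) return tostring(n.value) end

local function num(v) return setmetatable({value = v}, NumberMeta) end

local x, y = num(10), num(10)
print(x + y)   --> 20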

My last bit of curiosity deals with syntax, metatables, and the Lua standard functions. I feel like the way in which we build a metatable is logically sound but syntactically convoluted. The built-in helper functions feel very unpythonic, er, un-Lua-like. Special cases with a lot of extra verbosity seem to work against the simplicity, especially with functions. If a function is a table of expressions, as I suspect, then the parameter is just a key (all arguments are passed as an immutable, no?) and a function has a metatable for __index or __call. This doesn't seem consistent with the Lua paradigm (though I know I don't quite get it). I'll try and explain why.
aFunc(10) = 4
A function could be described as a mapping of parameters to return values. It seems like we should be able to overwrite potential keys or parameters of the function table in the above manner. I can already see how trivially easy it is to implement memoization, so applying this idea is simple, but it seems like it should be the normal way things work. In which case there shouldn't be a function data type at all. This is where I get confused: on the one hand, everything is a table, but then there are special rules for functions, yet tables also get those rules through the metatable with __call, which isn't intrinsically different from __index -- so... why the redundancy? Does this mean that userdata is also special since only __gc works for it? Or is userdata just given a special flag so that the GC knows to call the __gc metamethod?
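For example, the memoization I have in mind would look roughly like this (an untested sketch): a table whose __index computes and caches missing keys, where an explicit assignment like the aFunc(10) = 4 above simply overrides the computed result:

-- memoization via a table: a missing key is computed on demand and cached,
-- and an explicit assignment just overrides the cached "return value"
local function memoize(f)
  return setmetatable({}, {
    __index = function(t, k)
      local v = f(k)
      rawset(t, k, v)   -- cache the result under the argument/key
      return v
    end
  })
end

local square = memoize(function(n) return n * n end)
print(square[10])   --> 100 (computed, then cached)
square[10] = 4      -- overriding the "return value" for this argument
print(square[10])   --> 4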

Lua has the potential to work as a probabilistic knowledge base as the language is almost expressive enough to write working FOL expressions without any special functionality. This is very elegant, but the syntax makes me feel like it wouldn't work this way. If I write something like,
isJoyful(X) = true
Without defining isJoyful or X, will Lua interpret this properly? It seems like it should -- though I know the __call metamethod won't like this.
isJoyful[X] = true
What about this? Is it stupid of me to think that Lua should realize that this is table-referencing syntax and just make the table if it doesn't exist, and then see that the parameter is undefined and create a pointer X that doesn't point to anything yet? If all types are tables, does Lua interpret undefined variables as pointers to some table (in regards to type)? If not, why not? Lastly, when we pass a pointer to a table as a key, does the value of the table become the key, or the address of the table?
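On that last question, my reading is that the key is the table's identity (its address, more or less), not its contents, so two tables with the same contents would be distinct keys:

local k1 = {1, 2, 3}
local k2 = {1, 2, 3}   -- same contents, but a different table
local t = {}
t[k1] = "first"
t[k2] = "second"
print(t[k1], t[k2])    --> first    second (keys are compared by identity)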

This gets me thinking about some other things. Are there more syntactically intuitive ways to express the idea of the metatable? The type system, GC, and metatable system all feel like they're patchwork (in syntax!) solutions to make Lua appear C-like. It seems like there could be some prettier ways to solve these problems or express these ideas. For example: if we remove the idea of functions as a special type and just consider all types to be tables, then __index essentially lets us overwrite the default return value of nil when a key doesn't exist. That property is analogous to calling a function with a parameter. I.e., if we could define a table whose nil value is essentially replaced with 4,
aTable = 4
Since all parameters are essentially packed into a table, there isn't anything special about,
function aTable(a,b,c) return c end
The three parameters go in as the table represented by {a,b,c} and are unpacked into variables with those names. This is ultimately just sugar, right? Since __index has access to the key passed (right? if it doesn't I will shoot myself right now), how is this different from __call? I mean, syntactically, why should they ever be distinct? I can see how they could be used for different ideas, but I'd rather not be fighting parentheses and brackets for similar ideas in the same table (or are we really just overloading the () and [] operators?). If I assign an expression to a variable, then that variable, which is a table, defaults to that expression for any key/parameter passed, including nil/none. Wouldn't something like this grammatically or syntactically unify the meaning of all types clearly and nicely into the table paradigm? Clearly,
aTable = 4
Should just be sugar for a number-style table declaration, but you could do something like,
aTable[] = 4
aTable[nil] = 4
Where 4 can of course be an expression. In this way we could create default expressions that govern what sorts of data a table will accept, how it interprets them, possibly caches the evaluation, and then returns the evaluation. In other words, a table could initially be assigned either an expression of legal values or a literal set of values. The former is just a function whose return value can be mapped/overridden with an explicit assignment (aTable[4] = "roar"); the latter is just a functional enumeration of values. (I take a stab at spelling this with today's metatables at the end of this paragraph.) Additionally, you could use an existing table as a meta-prototype/archetype/type with some more sugar,
aTable someTable; --or
someTable isa aTable
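If I understand the manual, the existing spelling of that "isa" idea is setmetatable with __index pointing at the prototype, something like:

-- "someTable isa aTable": missing keys in someTable are looked up in the prototype
local aTable = {greet = function(self) return "hello, " .. self.name end}
local someTable = setmetatable({name = "Marco"}, {__index = aTable})
print(someTable:greet())   --> hello, Marco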
I have a preference toward languages that feel more natural, and Lua, in a way, lets you create new grammar in the way that you define your tables. Has anybody messed with genetic programming in Lua? It seems like it could be possible with a unified type system.
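And coming back to the aTable[nil] = 4 idea above, the closest I can get with the metatable machinery that actually exists seems to be something like this (again an untested sketch on my part):

-- a table with a "default expression" for missing keys, a guard on what it accepts,
-- and () routed through the same lookup as []
local aTable = setmetatable({}, {
  __index = function(t, k) return 4 end,      -- the default expression
  __call = function(t, k) return t[k] end,    -- aTable(k) answers like aTable[k]
  __newindex = function(t, k, v)
    assert(type(v) == "string", "this table only accepts strings")
    rawset(t, k, v)                           -- store the accepted value
  end
})

print(aTable.anything)        --> 4 (no such key, so the default expression answers)
print(aTable(99))             --> 4 (the () form goes through the same lookup)
aTable[4] = "roar"            -- passes the guard and overrides the default for key 4
print(aTable[4], aTable(4))   --> roar    roar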

So, I don't know; it seems like with nil assignment and scoping, and the powerful ways in which tables work, memory management should never need a formal GC. Obviously I am wrong on this, but I would like to know why. From a UI standpoint, I feel like the way in which the metatables are exposed and the way in which the type system works fight against accessibility to the language. With that in mind, Lua is easily the most accessible language I've seen.

Thanks a ton for even reading this mess; I hope I don't horribly annoy or offend anyone with my meager attempt to understand this beautiful and clever language.

~Marco