I didn't create it, and I don't have access to it, but one of the LuaConf talks was about stored procedures.
The DB has a "decimal" datatype. Say you have some decimals and you want to put them in a table like so:
```lua
local t = {} -- assume that decimal() returns interned values
t[decimal("1")] = "c"
t[decimal("2")] = "b"
t[decimal("3")] = "a"
```
So now you pass the table t to code you don't control:
```lua
sort(t)
```
The code doesn't know about decimals, so calling sort on it just hands you back what you started with. I assume this can happen in a DB, however unlikely that is.
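For concreteness, here is what that generic code sees when it looks at t. This is a hedged sketch that assumes decimal() returns userdata values, as it would in the DB:

```lua
-- What generic code sees in t: three opaque userdata keys, not 1, 2, 3.
for k, v in pairs(t) do
  print(type(k), v)   --> userdata  c / userdata  b / userdata  a (in some order)
end

-- The standard table.sort only reorders t[1] .. t[#t]; with nothing but
-- userdata keys the array part is empty, so sorting changes nothing.
print(#t)             --> 0
```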
If decimal() returned integers or floats depending on the value, then you could get weird things like:
```lua
local part = decimal("10")/decimal("100") -- decimal("0.1") would return a proper decimal
print(part+part+part+part+part+part+part+part+part+part == 1.0) --> false
```
If you were using decimals to store monetary values, this would be very bad.
You can patch the VM to make this work right, and that's what the DB guys did. However, that locks them onto a single patched VM version, and upgrading or changing versions means writing a new patch. The __key proposal would instead provide a stable-ish API, so you could switch Lua/VM versions without too much trouble: most other extensions can be implemented trivially through preprocessing (by replacing `load`), e.g. turning `x := (expr)`, the syntax for type-converting assignment, into `x = typeconv(x, (expr))`. The only reason this __key thing can't be done in preprocessing is that it would mean turning every single `t[x]` into `t[tokey(x)]`, which is (in theory) significantly slower than just checking for __key in the VM.
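The thread never pins down the exact semantics, but here is a rough plain-Lua emulation of the idea, assuming __key is a metamethod on the key's metatable that returns the value to actually index with; keyed() is a purely illustrative proxy helper:

```lua
-- Emulation of the proposed __key behaviour in plain Lua.
-- tokey() maps a value to the key actually used for storage: if the
-- value's metatable defines __key, use its result, otherwise use the
-- value itself.
local function tokey(v)
  local mt = getmetatable(v)
  local handler = mt and mt.__key
  if handler then return handler(v) end
  return v
end

-- A proxy that routes every index through tokey(). This is what
-- "turning every t[x] into t[tokey(x)]" amounts to, and why doing it
-- outside the VM costs an extra call on every single access.
local function keyed(t)
  return setmetatable({}, {
    __index    = function(_, k) return t[tokey(k)] end,
    __newindex = function(_, k, v) t[tokey(k)] = v end,
  })
end
```

With a decimal whose __key returned its numeric value, `u[decimal("1")]` and `u[1]` on such a proxy would refer to the same slot, which is what would let generic code like sort see ordinary keys.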
This isn't a "this would be nice to have" proposal, this is an "I wanna help these people" proposal. They have a use case for it, and it would be useful for many of us as well. But I really just wanna help these people.
So really this is about creating first-class types in Lua. For example, creating a bigint or decimal type that appears to the end-user to be as integrated into the language as string or integer. I will note in passing that C++ has been trying to do that for 20+ years with only limited success and only by the creation of a language that rivals PL/1 in complexity (note: C++ fans need not flame me).
And of course in Lua the way you do this is with a metatable on the userdata. So your new type can have addition, subtraction etc. And if you supply __lt then sort() will work just as expected.
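A minimal sketch of the kind of type Tim describes, with illustrative names (Decimal, decimal(units, scale)) and a plain table standing in for the userdata:

```lua
-- Values are stored as scaled integers: decimal(units, scale) means
-- units * 10^-scale, so decimal(1, 1) is 0.1.
local Decimal = {}
Decimal.__index = Decimal

local function decimal(units, scale)
  return setmetatable({ units = units, scale = scale or 0 }, Decimal)
end

function Decimal.__add(a, b)
  assert(a.scale == b.scale, "sketch handles equal scales only")
  return decimal(a.units + b.units, a.scale)
end

function Decimal.__eq(a, b)
  return a.scale == b.scale and a.units == b.units
end

function Decimal.__lt(a, b)        -- this is all table.sort needs
  assert(a.scale == b.scale, "sketch handles equal scales only")
  return a.units < b.units
end

-- Ten 0.1s really do add up to 1 in this representation:
local part = decimal(1, 1)
local sum = part
for _ = 1, 9 do sum = sum + part end
print(sum == decimal(10, 1))       --> true
```

A real implementation would rescale operands and live behind a userdata on the C side, but the metamethod wiring would be the same.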
So why the need for __key? It seems to me, based on your posts, that this is an attempt to borrow the behavior of an existing type. What you seem to be trying to do is say “look, this type XXX is different, but it’s so like an integer (or float or whatever) that for many things you can treat it like one”. So the table *thinks* it’s an integer (in fact, it IS an integer as far as the table is concerned) and you don’t have to write all those tedious metamethods.
But in fact you DO have to write all those tedious metamethods, because a decimal ISN'T a float or an integer, and a bigint ISN'T an integer either. Trying to make it kinda-sorta look like one is perilous; you are going to lull your users into thinking that it IS one, and then they are in trouble. What happens when they store a bigint that is too big for an integer?
Thus this all comes down to two things: (a) writing the necessary metamethods, and (b) using interning to ensure that, for your custom type, if two values X and Y are equal, then they will always have the same userdata. At this point your userdata becomes first-class (or as near as it can be in Lua) and the need for __key evaporates.
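A hedged sketch of the interning in (b), again with a table standing in for the userdata and a weak-valued cache as an illustrative detail:

```lua
-- Interning: equal values always map to the very same object, so ==,
-- table-key identity and value equality all line up.
local interned = setmetatable({}, { __mode = "v" })  -- weak values

local function decimal(s)
  -- A real implementation would canonicalise s first, so that
  -- "1.0" and "1.00" intern to the same object.
  local d = interned[s]
  if not d then
    d = { repr = s }      -- stand-in for the userdata the DB would create
    interned[s] = d
  end
  return d
end

print(decimal("1") == decimal("1"))  --> true: literally the same object
local t = {}
t[decimal("1")] = "c"
print(t[decimal("1")])               --> c
```

The weak-valued mode is there so that decimals nobody holds onto can still be collected; without it the cache would pin every value ever interned.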
(Given the noise level in this discussion, this is all I will have to say on this matter.)
—Tim