As I continue to daydream about this -- I'm becoming convinced that, if we did allow a _PRECISION upvalue, we'd want the current type of a number to be almost entirely opaque. Under "float64", 2*3 would yield 6.0, while with "mixed64", 2*3 would be 6. And this is undeniably strange. Ideally, the difference between 6.0 and 6 should only matter for the purposes of overflow handling, but that ideal is easily broken, as people can use a subtype inspection function to write code that behaves differently when given 6 rather than 6.0.
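To make the "leak" concrete, here's a sketch of what I mean -- note that _PRECISION is the hypothetical upvalue we're discussing, and math.type is just an assumed name for the subtype inspection function, neither of which exists today:

```lua
-- Hypothetical: _PRECISION and math.type are assumed names, not real Lua.
_PRECISION = "float64"
print(2 * 3)   --> 6.0  (all arithmetic produces floats)

_PRECISION = "mixed64"
print(2 * 3)   --> 6    (integer operands yield an integer result)

-- The problem: once subtypes are inspectable, 6 and 6.0 diverge
-- beyond mere overflow behavior.
local function describe(n)
  if math.type(n) == "integer" then
    return "exact"
  else
    return "approximate"
  end
end
```

Any function like describe() means the 6-vs-6.0 distinction is observable in ordinary code, not just at overflow boundaries.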
As Roberto has said, a subtype inspection function probably ought to be part of the debug library; standard Lua probably only needs functions that force values to one subtype or another. And, in fact, if we expand the range of _PRECISION values, allowing "int64" and "uint64", I don't think the language would even need subtype conversion functions. If you're working with file offsets, you just set _PRECISION="int64" -- the internal representation of the number shouldn't matter until you operate on it, and at that point, all you need to know is what type to cast it to.
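So the file-offset case would look something like this -- again purely a sketch, assuming _PRECISION accepts "int64" and that some forcing function (here I've invented math.toint) exists for the boundary where representation finally matters:

```lua
-- Hypothetical: "int64" precision and math.toint are assumptions.
_PRECISION = "int64"

local f = assert(io.open("data.bin", "rb"))
local offset = f:seek("end")     -- representation is opaque to us here
offset = offset - 4096           -- arithmetic follows "int64" rules

-- Only at the boundary -- handing the value to something that cares --
-- do we force a subtype:
some_ffi_call(math.toint(offset))
```

The point being: between seek() and the cast, the program never needs to ask what the number "is".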
This would also imply that we wouldn't need a special integer division operation -- '//' would simply be what happens when you divide numbers under an "int64" precision setting.
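That is, plain '/' would absorb the role of '//' -- a sketch, under the same hypothetical _PRECISION semantics as above:

```lua
-- Hypothetical: under "int64", ordinary division is integer division.
_PRECISION = "int64"
print(7 / 2)   --> 3    (what '//' would otherwise give you)

_PRECISION = "float64"
print(7 / 2)   --> 3.5  (ordinary float division)
```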
-Sven