You’re almost right! But I avoided using the word ‘decimal’; I only
used the word ‘digit’.
(BTW, ‘always’ was in reference to ‘writing’, so I hope no one else will pop
up to say that before that there was no concept of writing, and before that, no
concept of numbers.)
Although your intent was a bit different, you’re in fact helping make my case
clearer.
A hexadecimal number is made up of digits 0 through 9, and A through
F. We call them collectively hex digits, don’t we? We do not call
0-9 hex digits, and A-F hex letters. (Although I wouldn’t be surprised if
some did.)
The same goes for Roman numerals, or even the ancient Greek numbering system
(which was decimal, BTW), which borrowed its digits from the alphabet (until
the so-called ‘Arabic’ system became the norm).
But these ‘alphabetic numeral’ strings that (obviously) denoted numbers
were pronounced as numbers, not as the sounds of the letters that composed
them. Even today, the ancient Greek numbering system survives to some
extent, and, ‘illiterates’ aside, these numerals are pronounced as
numbers. For example, IB’ is pronounced 12th, not ‘iv’.
Regarding the length of a number being different depending on which
numbering system is used: this is correct and expected behavior. Would you
want the length of hex ABCD to be anything other than four? Or the length of
the equivalent decimal, 43981, to be anything other than five? I don’t think
so.
To help you understand my point better, think of a very simple and common
application. If you want to center a number within some fixed width (say,
a terminal screen), you need the length of the number (in whatever
representation it is to be shown) so you can subtract it from the total
available width and add half the difference as spaces in front of the string
to center it. Simple?
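That centering recipe can be sketched in a few lines of Lua (the helper name `center` is mine, purely for illustration):

```lua
-- Hypothetical helper: center the printed form of a number
-- within a field of the given width by padding with spaces.
local function center(n, width)
  local s = tostring(n)
  local pad = math.floor((width - #s) / 2)
  return string.rep(" ", pad) .. s .. string.rep(" ", width - #s - pad)
end

print("[" .. center(43981, 11) .. "]")  -- [   43981   ]
```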
Regarding the length of non-terminating numbers (e.g., 1/3, which is
rational, strictly speaking, just non-terminating in decimal), this is
obviously the same as the length of their representation. Now, if you intend
to print the infinite series of digits, then I guess in that case the length
would be infinity. But do you?
Why do you have to have an absolute number as the length of any given
number? The length is relative to the representation. A binary ten
is four digits long (1010), a decimal ten is two digits long (10), and a hex ten
is one digit long (A).
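A throwaway converter (the name `tobase` is mine, not a library function) makes those three cases concrete:

```lua
-- Sketch of a minimal base converter: repeatedly take the
-- remainder as the next digit, then divide down.
local DIGITS = "0123456789ABCDEF"
local function tobase(n, base)
  local s = ""
  repeat
    local d = n % base
    s = DIGITS:sub(d + 1, d + 1) .. s
    n = math.floor(n / base)
  until n == 0
  return s
end

print(tobase(10, 2), tobase(10, 10), tobase(10, 16))     -- 1010  10  A
print(#tobase(10, 2), #tobase(10, 10), #tobase(10, 16))  -- 4  2  1
```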
But isn’t the same true for words in a language? Any word’s length is
based on its representation in a given language, or even within the same
language when multiple spellings are valid. In English, for example, some
write potatoe and some write potato (which form you consider correct is
irrelevant). Same length?
Obviously, for decimal numbers (rational or irrational) the length has to
include the decimal point. It also has to include a possible sign (an
explicit +, yes; an implicit +, obviously not).
In simple terms, the length of a number is the length of how that number
would be represented if you were to print it out. No complicated rules. No
super science. If I can convert the number to a string and get its length,
then that’s the length I’m looking for.
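In Lua terms, the whole proposal fits in one line (the name `numlen` is my own, not part of Lua):

```lua
-- A number's length is the length of its printed form.
local function numlen(n)
  return #tostring(n)
end

-- The decimal point and an explicit minus sign count naturally:
print(numlen(43981))  -- 5
print(numlen(-0.5))   -- 4  ("-0.5")
```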
Thanks.

From: Axel Kittenberger
Sent: Sunday, November 23, 2014 2:03 PM
To: Lua mailing list
Subject: Re: ## operator (error)
Since you said "always", it certainly was not always a string of
decimal digits. There were 12 and 60 based systems, for those people felt more
natural. The romans had their unique name to see numbers as strings of letters.
For a roman the length of #IX would be II. Substract one and the length of #VIII
would be III. More.
And to go back to decimal system, this reasoning gets very problematic when
entering non integer rational numbers. So the length #(10/3) would be infinite?
Or Fifty-something depending on the floating point arithmetic implementation of
the current processor?
PS: For computers a number is just a fixed size string of binary digits.
With a floating decimal point and some encodings in case of floating
points.