"From a human point of view numbers are (and have always been) nothing more than strings of digits."
Since you said "always": it certainly was not always a string of decimal digits. There were base-12 and base-60 systems that felt more natural to the people who used them. The Romans had their own way of seeing numbers, as strings of letters. For a Roman the length of #IX would be II; subtract one, and the length of #VIII would be IV. More, not less.
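To see the point concretely, here is a quick Python sketch (Python chosen purely for illustration; the # length operator from the discussion is written as len here):

    >>> len("IX")    # nine, written the Roman way: two letters
    2
    >>> len("VIII")  # eight, one less, takes four letters
    4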
And coming back to the decimal system, this reasoning gets very problematic for non-integer rational numbers. Would the length #(10/3) be infinite? Or fifty-something, depending on the floating-point arithmetic of the current processor?
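The same kind of sketch shows how arbitrary the "length" of 10/3 is: the answer depends entirely on which representation you ask (again Python, purely illustrative; Fraction and Decimal are standard-library types):

    >>> len(str(10/3))             # a 64-bit float prints a finite string
    18
    >>> from fractions import Fraction
    >>> len(str(Fraction(10, 3)))  # as an exact rational it is just "10/3"
    4
    >>> from decimal import Decimal, getcontext
    >>> getcontext().prec = 50     # ask for 50 significant digits
    >>> len(str(Decimal(10) / 3))  # fifty-something, as predicted
    51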
PS: For computers a number is just a fixed-size string of binary digits, with a floating binary point and some extra encoding in the case of floating-point numbers.
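One can even look at that fixed-size binary string directly; as a sketch, struct.pack yields the raw IEEE 754 bytes of a Python float:

    >>> import struct
    >>> b = struct.pack('>d', 10/3)  # an IEEE 754 double: always 8 bytes
    >>> len(b) * 8                   # i.e. a fixed 64 binary digits
    64
    >>> format(struct.unpack('>Q', b)[0], '064b')  # sign, exponent, mantissa
    '0100000000001010101010101010101010101010101010101010101010101011'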