
int a, b, c;
a = 10000;
b = 5000;
c = a * b;   /* 50,000,000 -- overflows (undefined behavior) if int is 16 bits */

This will generate different results on different platforms, depending on the width of int. For those of us who started when CPUs were 16 bits, you could NOT use code like this without overflow. In fact, all those silly dialog boxes with impossible numbers ("The copy will complete in -150009999 seconds") are mostly traceable to invalid assumptions like this: code that is most certainly standard C and yet did not port well.
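
For comparison, here is a minimal sketch of a width-independent version, assuming a C99 compiler with <stdint.h> (the cast is what forces 64-bit arithmetic):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    int32_t a = 10000;
    int32_t b = 5000;
    /* The cast forces the multiplication to be done in 64 bits,
       so the result is the same no matter how wide plain int is. */
    int64_t c = (int64_t)a * b;
    printf("%lld\n", (long long)c);   /* always 50000000 */
    return 0;
}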

I don't think the runtime argument is odd. You stated that writing to standard C would make code portable; I merely said that while standard C was a requirement, it was not the ONLY requirement.

Also, C is NOT defined in terms of a machine (real or virtual) the way Java is, which is why there is far more variability in C than in Java. Java has its faults (legion), but in general Java code (and, to a lesser extent, C#) is more portable than C for this reason. Spend any time looking at the Lua header files and you will see the incredible lengths Roberto's team had to go to in order to make Lua as portable as it is; they certainly did NOT just "write it in ANSI C".
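
To give a flavor of the compile-time probing that takes, here is a rough sketch in the spirit of what a portable header has to do (illustrative only, not Lua's actual luaconf.h, and the lua_like_int32 name is made up), assuming only <limits.h>:

#include <limits.h>

/* Choose a type guaranteed to hold at least 32 bits without
   assuming the width of plain int. The standard requires long
   to be at least 32 bits, so the fallback always works. */
#if INT_MAX >= 2147483647L
typedef int lua_like_int32;
#else
typedef long lua_like_int32;
#endif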

--Tim

On Apr 30, 2013, at 7:47 PM, William Ahern <william@25thandClement.com> wrote:

> On Tue, Apr 30, 2013 at 06:51:23PM -0700, Tim Hill wrote:
>> 
>> But the point is that the C "standard" leaves some things so open that
>> it's very difficult to write code without making some guesses. Width of an
>> integer?
> 
> If you could provide concrete examples, then it would be easier to discuss.
> 
> 
>> Behavior of C runtime functions? All have very subtle hidden assumptions.
>> For example, when you call "malloc()" you assume that it won't take 5
>> seconds to allocate memory, but the standard is silent on such issues as
>> API overhead. Writing in standard C is a REQUIREMENT for portability, but
>> not a GUARANTEE.
> 
> That's an odd argument to make. No general purpose language makes that
> guarantee. Runtime limits are nearly impossible to standardize, although C
> makes a passable attempt. Anyhow, on systems with pageable virtual memory,
> even a block of memory already allocated can stall the whole execution
> context. In any event, C, just like Java or C#, is defined in terms of an
> abstract "virtual" machine.