lua-users home
lua-l archive


Sam Roberts wrote:
> 	% cat
> 	extern void x(char*);
> 	void test(int i) {
> 			char* p = "hi";
> 			x("hi");
> 			x(i ? "hi" : "hi");
> 			x(i ? p : "hi");
> 	}
> 	% g++ -Wall -c
> In function ‘void test(int)’:
> error: invalid conversion from ‘const char*’ to ‘char*’
> error:   initializing argument 1 of ‘void x(char*)’
> Literal strings being promoted to const char*, I can accept that, but
> why would (i ? "hi" : "hi") be char*, but (i ? p : "hi") suddenly becomes
> const char*, when the only thing new is a non-const ptr p?

Because, IIRC, in order for the compiler to figure out the type of (b ? X :
Y), where X and Y are different types, it has to come up with a unified type
which X and Y can both be implicitly cast to. Since the type of a literal
string is usually[1] const char*, and since a char* can be implicitly cast to
a const char*, then the only type that fits is a const char*. However, it then
can't implicitly cast the const char* back into a char* in order to be used as
an argument for x(), so you get an error message.

You ought to get much the same sort of behaviour with any two types which
unify oddly. So:

typeof(true ? 0 : 1) == int
typeof(true ? 0L : 1) == long


And yes, the rules *are* different in C++. It's worth remembering that C++ is
not a set of extensions to C; it's a different language that's *partially*
compatible with C...

[1] Notice that 'usually'? This is the reason why x("hi") works. In an
incredibly brain-damaged fit of, well, something, the C++ designers decided
that it would be convenient if string literals could be implicitly converted
to char* (a holdover for C compatibility; it was deprecated from the start and
finally removed in C++11). But this only happens when the literal itself is
*directly* converted to char*. If you do an operation on it first, the
const char* type is used instead. So:

x("hi"); // works.
x("hi"+1); // fails!

C++ also implicitly casts the literal 0 to any pointer type in much the same way.

> Btw, I've never really understood the lua idiom of doing a "return
> luaL_error();", it just obfuscates the fact that the function is not returning,
> IMO.  Is it a hold-over from an ancient lua that didn't use longjmp()?

This is probably because the compiler needs to be told that statements after
that line are unreachable; otherwise its reachable-code analysis will get
confused and you'll get spurious warnings. gcc allows you to mark a function
as not returning with __attribute__ ((__noreturn__)), but that's hideously
unportable.

Besides, can luaL_error() fail?

┌── ─── ───────────────────
│ "Parents let children ride bicycles on the street. But parents do not
│ allow children to hear vulgar words. Therefore we can deduce that cursing
│ is more dangerous than being hit by a car." --- Scott Adams
