- Subject: Re: Lua: compilation failure on TinyCC, probably cause: Lua non compliant usage of sizeof
- From: Lorenzo Donati <lorenzodonatibz@...>
- Date: Fri, 22 May 2020 00:41:43 +0200
On 21/05/2020 23:08, Massimo Sala wrote:
> Hi
>
> At the moment I am testing Lua with two compilers: gcc and TinyCC
> (https://bellard.org/tcc/)
>
> tcc bails out with "error: function pointer expected" on this line:
>
> #if defined(PTRDIFF_MAX) && (PTRDIFF_MAX < MAX_SIZE)
>
> It took me a while to catch the root cause. In the Lua file llimits.h
> there is:
>
> #define MAX_SIZE (sizeof(size_t) < sizeof(lua_Integer) ? MAX_SIZET : (size_t)(LUA_MAXINTEGER))
>
> It seems to me C99 doesn't require support for "sizeof" in
> preprocessor conditionals.
It seems you are right, but I think the problem is not "sizeof" itself but
the cast (during preprocessing, sizeof is just an identifier, not an operator).
The C99 draft standard N1256, in section "6.10.1 Conditional inclusion",
states under "Constraints":

"The expression that controls conditional inclusion shall be an integer
constant expression except that: it shall not contain a cast; identifiers
(including those lexically identical to keywords) are interpreted as
described below;"
and then, under "Semantics", paragraph 4:
"Prior to evaluation, macro invocations in the list of preprocessing
tokens that will become the controlling constant expression are replaced
(except for those macro names modified by the defined unary operator),
just as in normal text. If the token defined is generated as a result of
this replacement process or use of the defined unary operator does not
match one of the two specified forms prior to macro replacement, the
behavior is undefined. After all replacements due to macro expansion and
the defined unary operator have been performed, all remaining
identifiers (including those lexically identical to keywords) are
replaced with the pp-number 0, and then each preprocessing token is
converted into a token. The resulting tokens compose the controlling
constant expression which is evaluated according to the rules of 6.6.
For the purposes of this token conversion and evaluation, all signed
integer types and all unsigned integer types act as if they have the
same representation as, respectively, the types intmax_t and
uintmax_t defined in the header <stdint.h>. [...]"
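To see that "identifiers are replaced with the pp-number 0" rule in
isolation, here is a tiny self-contained test of mine (nothing to do with
the Lua sources); it compiles cleanly because the directive reduces to the
valid, false test 0 > 0:

#include <stdio.h>

/* During preprocessing "sizeof" is not an operator, just an identifier;
   after macro expansion it is replaced with 0, so the controlling
   expression below is simply 0 > 0 and the branch is skipped. */
#if sizeof > 0
#error "unreachable: a conforming preprocessor evaluates this as 0 > 0"
#endif

int main(void) {
    puts("preprocessed and compiled fine");
    return 0;
}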
Hence:

PTRDIFF_MAX < MAX_SIZE

is expanded to:

PTRDIFF_MAX < (sizeof(size_t) < sizeof(lua_Integer) ? MAX_SIZET : (size_t)(LUA_MAXINTEGER))

which will be expanded again until all macro names are gone; in the end
the expression will still contain a cast, which violates the constraints
section, so the compiler is required to issue a diagnostic.
Still, the message is funny, since no function pointer type seems to be
involved, only integer types.

Probably the expansion led to something like:

N1 < (0 (unsigned long) < 0 (unsigned long) ? N2 : (unsigned long)(N3))

where N1, N2 and N3 are integer constants and those zeros are what the
"sizeof" identifiers were replaced with.
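If that guess is right, the odd wording is probably just tcc reusing its
ordinary expression evaluator for #if: a constant followed by "(" looks
exactly like a call through something that is not a function pointer. This
is only my speculation about tcc's internals, but the same shape in normal
code draws a comparable complaint from any compiler:

/* The offending call is kept behind #if 0 so this file still compiles;
   flip it to #if 1 to see your compiler's "calling a non-function"
   diagnostic (tcc's wording for it is, I suspect, the very message
   Massimo reported). */
int main(void) {
#if 0
    return 0(1);    /* a constant used as if it were a function */
#endif
    return 0;
}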
As a test, I just tried to compile a .c file containing this directive
with GCC 9.2.0:

#if 1111 < (sizeof (unsigned long) < sizeof (unsigned long) ? 2222 : (unsigned long)(3333))

and this one:

#if 1111 < (0 (unsigned long) < 0 (unsigned long) ? 2222 : (unsigned long)(3333))

and in both cases it barfed with the following error:

error: missing binary operator before token "("
I hope my analysis makes sense :-)
> BTW, tcc compiles fine if I replace that line with
>
> #define MAX_SIZE LUA_MAXINTEGER
>
> Best regards, M
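That replacement silences the error, but note that it also changes the
value of MAX_SIZE on platforms where size_t is narrower than lua_Integer.
If one wanted to keep the original meaning while staying within the #if
constraints, comparing the maxima instead of the sizes might work. A rough,
untested sketch of mine (MAX_SIZE_PP is a made-up name, and I am assuming
the usual correspondence between the *_MAX macros and the type sizes):

#include <stdint.h>   /* SIZE_MAX, PTRDIFF_MAX (C99) */
#include "lua.h"      /* LUA_MAXINTEGER, via luaconf.h */

/* Plain integer constants are fine in #if: no sizeof, no casts. */
#if LUA_MAXINTEGER <= SIZE_MAX
#define MAX_SIZE_PP  LUA_MAXINTEGER   /* lua_Integer fits in size_t */
#else
#define MAX_SIZE_PP  SIZE_MAX         /* size_t is the narrower type */
#endif

/* ...so the guard that tripped tcc could be written as: */
#if defined(PTRDIFF_MAX) && (PTRDIFF_MAX < MAX_SIZE_PP)
/* whatever the original directive was protecting */
#endif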
Cheers!
-- Lorenzo