> Can you share why it was not done?
>> Adding __attribute__((noreturn)) to the one function seems better than
>> adding forgotten returns to all the invocations. The only thing I am
>> concerned about is a phrase from the manual:
>>> it is an idiom to use it in C functions as return luaL_error(args).
>> which this code does not follow.
Well, my wording might be misleading; what I mean is that I do understand why I would not do it that way :)
Note that the twice-underlying "luaG_errormsg" actually does carry the "l_noret" specifier. The reason "luaL_error" does not carry it is that, to make it work, you would have to ship a compiler-dependent definition of "l_noret" (or a similar construct) alongside "lua.h". And even if you did that, there could always be a compiler that was not invited to the party; for it, "l_noret" would simply be ignored and the desired behaviour would not be achieved. This holds both for the surrounding code and for the platform code that uses "luaL_error". That is why the idiom you mentioned is used: it breaks the data flow on the current control path just as "__attribute__((noreturn))" does.
Also, note how this idiom is omitted in "lua_error" :)
One possible option is to add something like
> #ifndef LUA_NORET
> #define LUA_NORET void
> #endif
>
> LUA_NORET luaL_error(...
and allow the surrounding code to re-define LUA_NORET according to its needs. But if you do that, you will have to worry about the linker possibly going insane when the Lua library is compiled by gcc and the client code by something else, because I highly doubt that all compilers treat noreturn-annotated symbols identically. The same point applies to my initial idea of pre-shipping an "l_noret" token, whenever Lua and the client are compiled with different compilers.
All in all, this idiom is the most harmless design solution I see so far. So yes, it works for me.
P.S.: Also, creating an internal "l_noret" counterpart of "luaL_error" would solve the initial issue.