> IMHO it makes sense to apply the above mentioned changes in
> loadlib.c. Unconditionally that is, because the *W variants
> cannot ever work, given the surrounding code.
>
> What about Win's proposal of undef UNICODE?
>
> -- Roberto

This would definitely break my code, which contains:
#ifndef UNICODE
#  error .....
#endif

My observations are that most of the games shipped in 2006
required Windows 2000. Even the applications that run on the
9x line of Windows prefer to use UNICOWS.dll from Microsoft.
This DLL converts *W functions to *A calls, adjusts for bugs
in the 9x code and calls the OS. In short, the application
works as if it were running under an NT-based OS.
This is used by the .NET Framework, WinAmp5 and more.
I think that undefining UNICODE in a public header is
impossible at this moment but doable in loadlib.c:

/* make sure <windows.h> selects the *A (char-based) API */
#if defined(UNICODE)
#  undef UNICODE
#endif
#if defined(_UNICODE)
#  undef _UNICODE
#endif
#include <windows.h>

This should be backwards compatible with the Win 3.1 API, but
I can't verify this.
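To illustrate why this works (the function below is my own
sketch, not the actual loadlib.c code): with UNICODE undefined
before <windows.h> is included, the generic names expand to
the *A entry points, which take and return plain char strings,
so the char buffers Lua already uses match what the API expects:

/* sketch only: with UNICODE undefined, LoadLibrary and
   FormatMessage expand to LoadLibraryA and FormatMessageA,
   both of which work on plain char strings */
#include <windows.h>

static HMODULE load_sketch (const char *path) {
  HMODULE lib = LoadLibrary(path);   /* same as LoadLibraryA(path) */
  /* on failure, GetLastError() can be formatted into a char
     message with FormatMessageA; no wide-char conversion needed */
  return lib;
}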

The main problem with the current code is that it misreports
the size of the buffers: the *W functions interpret the size
as a number of wide characters, but it is in fact a number of
bytes. Not to mention that the returned Unicode strings are not
usable by the rest of the Lua code and not suitable for
displaying to the user without further processing.
The compiler complains, but only with warnings, and because
most multi-platform code is not warning-free this can go
unnoticed.
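To make the mismatch concrete, here is a sketch of the kind of
call involved (the buffer size and function name are only
illustrative, not necessarily the exact loadlib.c source):

#include <windows.h>

static void pusherror_sketch (void) {
  char buffer[128];
  /* with UNICODE defined, FormatMessage expands to FormatMessageW:
     it writes wchar_t into this char buffer (the compiler only
     warns about the pointer mismatch) and reads sizeof(buffer)
     as 128 wide characters, i.e. 256 bytes, twice the real size */
  FormatMessage(FORMAT_MESSAGE_IGNORE_INSERTS | FORMAT_MESSAGE_FROM_SYSTEM,
                NULL, GetLastError(), 0, buffer, sizeof(buffer), NULL);
  /* whatever lands in buffer is UTF-16, not a C string that
     lua_pushstring or the user can consume directly */
}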

But if you can't change the code please consider documenting this
behaviour in the INSTALL file.
Kind regards,
Todor