[M3devel] build problems with libunicode

Rodney M. Bates rodney_bates at lcwb.coop
Tue Jun 3 02:52:50 CEST 2014



On 06/02/2014 02:07 AM, Elmar Stellnberger wrote:
>
>> The only target system that does not support such a conversion on its own would be
>> Xorg/Trestle.  You would need to convert to UTF-16 and then byte-swap on little-endian
>> machines, as XChar2b is defined as a struct with the high byte first.
>>
>
> However, X11's XChar2b can only hold 16-bit code units, covering at most 65536
> characters, so a simpler, direct conversion would be appropriate in this case.

If X11's XChar2b is really a code unit, and sometimes two of them can represent a code point,
then UniEncoding.Encoding.UTF16BE in libunicode will take care of this.  If XChar2b is a code
point, and only code points up to 65535 can be handled, then UniEncoding.Encoding.UCS2BE will
do it.  Unless X11 treats Unicode's surrogate codes as real printing characters rather than
as surrogates, in which case libunicode would need a new encoding, like CM3WC, except
big-endian.


> I guess we could even skip UTF-16 support and simply take libunicode as is, fixing
> only its compilation errors, though UTF-16 is widely used by Qt, Gtk, Java, and C#,
> and proper support for it would improve interoperability.
>
>

If you set up the compiler to support Unicode, libunicode already provides full UTF-16
support, as is.

-- 
Rodney Bates
rodney.m.bates at acm.org
