> From: Elmar Stellnberger <estellnb@elstel.org>
> Well, if so I could rewrite some code to define WCHAR as BITS 16
> FOR WIDECHAR. Perhaps that would be the way to go.
> However, as Rodney M. Bates has said, the current WIDECHAR is not
> BITS 16 FOR UCHAR. It uses LE encoding rather than host-order
> encoding, a fact one could be quite happy about when it comes to
> extending Trestle/X11 for widechar support. So even that would fail
> when it came to interfacing with X11 (or otherwise one would have
> to maintain two branches of code all the time: one that does byte
> swapping and one that does not, depending on the host order AND the
> internally used wchar order, which could then
> differ as well.).

I think what I wrote may have been misleading. In talking about
endianness, I meant what happens when strings are written to or read
from streams (Wr.T, Rd.T), which might also be called the "wire
representation". These are byte streams, and when a value of two or
more bytes is written, it matters whether the least-significant or
the most-significant byte goes first. Existing M3 streams, pickles,
netobjs, etc. always write these LE, regardless of the native
endianness of the machine. (There is a small sketch of this at the
end of this message.)

But maybe more relevant to using a GUI library from M3 code: if you
just pass a pointer to an array of 2-byte scalar values to other code
on the same machine, endianness will not matter. Both programs will
address byte zero of each value, and both will apply the same
interpretation of whether that is the LSB or the MSB. I am sure this
holds for any language or compiler on the same machine, and for any
single scalar value of any natural byte size, as well as arrays
thereof. (See the second sketch below.)

I don't know anything about which way the various GUI libraries
expect strings to be passed.
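
To make the stream point concrete, here is a minimal sketch (the
names are mine, not anything in the libraries) of writing a 16-bit
value LE by arithmetic, so the bytes come out the same on a BE or LE
host:

    MODULE PutLE EXPORTS Main;

    IMPORT Wr, Stdio, Thread;

    <* FATAL Wr.Failure, Thread.Alerted *>

    (* Write a 16-bit value to a byte stream in little-endian order,
       independent of host endianness, by computing the two bytes
       arithmetically instead of reinterpreting memory. *)
    PROCEDURE PutLE16 (wr: Wr.T; v: [0 .. 16_FFFF]) =
      BEGIN
        Wr.PutChar (wr, VAL (v MOD 16_100, CHAR)); (* LSB first *)
        Wr.PutChar (wr, VAL (v DIV 16_100, CHAR)); (* MSB second *)
      END PutLE16;

    BEGIN
      PutLE16 (Stdio.stdout, 16_263A); (* bytes 3A 26 on any host *)
    END PutLE.

Reading is the mirror image: get two bytes b0 and b1 and combine them
as b0 + 16_100 * b1.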
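
And for the in-memory case, a sketch of handing an array of 2-byte
values straight to foreign code. CWide and consume_wide are
hypothetical names, standing in for whatever a real GUI binding would
declare; the point is only that nothing here swaps bytes:

    UNSAFE INTERFACE CWide;

    IMPORT Ctypes;

    (* Hypothetical binding to a C routine
         void consume_wide(const unsigned short *s, int len);  *)
    <* EXTERNAL consume_wide *>
    PROCEDURE ConsumeWide (s: ADDRESS; len: Ctypes.int);

    END CWide.

    UNSAFE MODULE WideArg EXPORTS Main;

    IMPORT CWide;

    TYPE W16 = BITS 16 FOR [0 .. 16_FFFF];

    VAR msg := ARRAY [0 .. 2] OF W16 {ORD ('H'), ORD ('i'), 0};

    BEGIN
      (* The C side reads the very bytes Modula-3 stored; both sides
         agree on which byte of each element is byte zero, so host
         order never enters into it. *)
      CWide.ConsumeWide (ADR (msg[0]), NUMBER (msg));
    END WideArg.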