[M3devel] how to represent a 16bit char?

dirk muysers dmuysers at hotmail.com
Sat Nov 30 10:34:47 CET 2013


Yes, I think a UTF-8 CHAR is the way to go. Nearly all recent Un*x libraries
use UTF-8 by now; only Microsoft still sticks to CHAR16. I have a
UTF-8 TEXT implementation in the making, called libunicode, but don't
have the time these days to complete it. It has a constant-time cursor
for sequential access to the encoding, support for character properties,
dynamic text building and formatting, and NLS (National Language Support),
and it will have an XML reader. Finished: the TEXT implementation and
character properties. Partially finished: the dynamic buffer and NLS.
Unfinished so far: formatting and XML.
As for Win32, most software translates dynamically to CHAR16 when
required (e.g. Component Pascal for .NET, GTK, Go), leaving Windows
the craggy island it is.
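To make the "constant-time cursor" idea concrete, here is a minimal sketch (not Dirk's libunicode, whose API is not shown in this thread) of sequential access over UTF-8: advancing the cursor decodes one code point by inspecting at most four bytes, so each step is O(1) even though random indexing by character position is not. Python is used purely for illustration.

```python
# Hypothetical illustration of a constant-time sequential UTF-8 cursor.
# Each call decodes the code point at `pos` and returns the next position.

def next_codepoint(buf: bytes, pos: int) -> tuple[int, int]:
    """Decode the code point starting at buf[pos]; return (codepoint, next_pos)."""
    b0 = buf[pos]
    if b0 < 0x80:   # 1-byte sequence: ASCII
        return b0, pos + 1
    if b0 < 0xE0:   # 2-byte sequence: U+0080..U+07FF
        return ((b0 & 0x1F) << 6) | (buf[pos + 1] & 0x3F), pos + 2
    if b0 < 0xF0:   # 3-byte sequence: U+0800..U+FFFF
        return (((b0 & 0x0F) << 12) | ((buf[pos + 1] & 0x3F) << 6)
                | (buf[pos + 2] & 0x3F)), pos + 3
    # 4-byte sequence: U+10000..U+10FFFF (outside the BMP)
    return (((b0 & 0x07) << 18) | ((buf[pos + 1] & 0x3F) << 12)
            | ((buf[pos + 2] & 0x3F) << 6) | (buf[pos + 3] & 0x3F)), pos + 4

# Walk a string containing 1-, 2-, 3-, and 4-byte sequences.
s = "a\u00e9\u20ac\U0001d11e".encode("utf-8")
pos, points = 0, []
while pos < len(s):
    cp, pos = next_codepoint(s, pos)
    points.append(cp)
# points == [0x61, 0xE9, 0x20AC, 0x1D11E]
```

Note that the 4-byte case shows why a fixed CHAR16 cannot hold every character: U+1D11E does not fit in 16 bits and needs a surrogate pair in UTF-16.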

From: Jay K 
Sent: Saturday, November 30, 2013 10:06 AM
To: m3devel 
Subject: [M3devel] how to represent a 16bit char?

1) Ok for purposes of interfacing with Win32 and Xlib, what should I use where WIDECHAR used to be correct?
2) Are we really certain that redefining WIDECHAR is the way to go?
Not, say, introduce a new type, CHAR32 or UCHAR32?
And maybe add an explicit alias CHAR16 or UCHAR16 to provide a type that nobody will ever consider changing?
 
Or do people now advocate:
get rid of WIDECHAR,
leave 8-bit CHAR,
with a new understanding that it is UTF-8 encoded, and force lots of conversion back and forth?
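The "convert back and forth" option Jay describes is what Dirk's reply recommends for Win32: carry text internally as UTF-8 and translate to 16-bit code units only at the API boundary. A hedged sketch of that round trip (Python stands in for whatever conversion routine the runtime would actually use):

```python
# Illustrative only: program text lives in UTF-8; a Win32 "W" call would
# receive a UTF-16 (little-endian) copy made on demand, then results are
# converted back.

text = "na\u00efve \u2013 \U0001d11e"   # mixed BMP and non-BMP characters
utf8 = text.encode("utf-8")             # internal representation
utf16 = text.encode("utf-16-le")        # what a wide Win32 API expects

# Unit counts differ: UTF-8 counts bytes, UTF-16 counts 16-bit units,
# and the non-BMP character costs a surrogate pair (2 units) in UTF-16.
units16 = len(utf16) // 2

# The translation is lossless in both directions.
roundtrip = utf16.decode("utf-16-le").encode("utf-8")
assert roundtrip == utf8
```

The cost of this design is exactly the repeated conversion at every boundary crossing; the benefit is that the program's own string type never has to change size again.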
 
Thank you,
- Jay



 

