On 29.05.2014 at 17:26, Rodney M. Bates wrote:

> I am responsible for libunicode.
>
> 1. libunicode won't build, and is not designed to, unless the compiler is
>    configured to make WIDECHAR have the full Unicode range, which, by
>    default, it is not.
>
> I put libunicode in a separate package for that reason, and left the
> compiler configured by default for the existing 16-bit range of WIDECHAR,
> so there would be no perturbation to anybody's code unless you take some
> action.
>
> We can change the default if there is consensus to do so. Most code should
> not be affected, but some lower-level things will be.
>
> Rodney Bates
> rodney.m.bates@acm.org

Well, the program I want to port would actually benefit from both types:
* a 16-bit Unicode type for interfacing with GTK and Qt
* a 32-bit Unicode type for the program-level user input function, which
  yields math and Greek letters

Will it be possible to simply declare BITS 16 FOR WIDECHAR when Unicode
support is enabled? What complications will it cause for the Text library
if we have a 16-bit and a 32-bit character type at the same time? What
would you think about leaving WIDECHAR as 16-bit and instead introducing
UCHAR as a 32-bit character type? I believe this would be the best
solution, as it would not break any existing code.

Best Regards,
Elmar Stellnberger
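
P.S.: To make the dual-type interfacing concrete, here is a minimal sketch
of the conversion such a setup would need at the GTK/Qt boundary: splitting
a full-range code point into the 16-bit UTF-16 units those toolkits expect.
The module, type, and procedure names are made up for illustration; nothing
below is an existing cm3 API.

MODULE Utf16Sketch EXPORTS Main;

(* Sketch only: encode one full-range Unicode code point (the proposed
   32-bit UCHAR, modeled here as an integer subrange) into one or two
   16-bit UTF-16 code units (the existing 16-bit WIDECHAR range). *)

TYPE
  CodePoint = [0 .. 16_10FFFF];   (* full Unicode range *)
  Utf16Unit = [0 .. 16_FFFF];     (* 16-bit code unit, as used by GTK/Qt *)

PROCEDURE EncodeUtf16 (cp: CodePoint;
                       VAR units: ARRAY [0 .. 1] OF Utf16Unit): [1 .. 2] =
  BEGIN
    IF cp < 16_10000 THEN
      units[0] := cp;                        (* BMP: one unit suffices *)
      RETURN 1;
    ELSE
      DEC (cp, 16_10000);                    (* split into a surrogate pair *)
      units[0] := 16_D800 + cp DIV 16_400;   (* high surrogate *)
      units[1] := 16_DC00 + cp MOD 16_400;   (* low surrogate *)
      RETURN 2;
    END;
  END EncodeUtf16;

BEGIN
END Utf16Sketch.

For example, the math letter U+1D49C would come out as the pair
16_D835, 16_DC9C. Every seam between the two character types would need
this kind of widening/narrowing, which is where I would expect most of
the complications for the Text library to show up.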