[M3devel] long vs. INTEGER? ranges vs. word/integer?

Peter Eiserloh eiserlohpp at yahoo.com
Sat Jan 3 07:52:26 CET 2009


Jay,

Please don't make size_t 128 bits in size.  On the Unix
LP64 ABIs, "long" is an integral type big enough to hold
a pointer.  Longs may be bigger than a pointer, but on
those ABIs they are at least that size.  (Strictly, this
is an ABI convention rather than a guarantee of the C
language spec itself; C99's intptr_t is the portable
pointer-sized integer.)

NOTE: an int (integer) does not have even that guarantee.

In fact, I once programmed on a platform (the Amiga) that
had two different compilers: one used 16 bits for its
integers and the other 32.  You can imagine the
difficulties.

Windows machines, as you have been using, give an address
space of 32 bits to user space.  This is regardless of how
much RAM the machine actually has.  On a Win64 platform,
pointers are 64 bits, but Windows uses the LLP64 model,
so "long" remains 32 bits there.

On my AMD64_LINUX box, pointers are 64 bits, and therefore
a long is 64 bits; "long long" is also 64 bits on this
LP64 ABI.  The system type size_t is 64 bits.  If you
attempt to map Cstddef.size_t to 128 bits it will not only
break the syscall interfaces to the kernel, but also be a
waste of space.

From that standpoint the type definitions
   size_t = Ctypes.unsigned_long;
   ..., etc.

are correct.

Please don't define    
   size_t = Ctypes.unsigned_long_long;


The "C" include files
   /usr/include/bits/types.h
   /usr/include/bits/typesizes.h

play preprocessor games to ensure that size_t and ssize_t
are defined to use the "natural" word size.

Other types get defined to fixed sizes (e.g., int32_t must
always be 32 bits); otherwise external interfaces (file
formats, inodes, etc.) would be corrupted.  For that
reason, explicitly sized types need to be defined.

Actually, GNU Modula-2 recently defined explicitly sized
types.


Peter Eiserloh.



Date: Fri, 2 Jan 2009 23:05:24 +0000
From: Jay <jay.krell at cornell.edu>
Subject: [M3devel] long vs. INTEGER? ranges vs. word/integer?
To: m3devel <m3devel at elegosoft.com>, Tony <hosking at cs.purdue.edu>
Message-ID: <COL101-W1612F9835414ED341527DBE6E20 at phx.gbl>
Content-Type: text/plain; charset="iso-8859-1"


I'd like to avoid using "long" and "ulong" anywhere.
On Unix, they are always pointer sized.
On Windows, they are always 32 bits.

I think this divergence of meaning renders them useless.

I believe for pointer-sized integers, the right types are any of:
  unsigned: size_t, Word.T
  signed: INTEGER, ssize_t, ptrdiff_t
For 32-bit integers: int32_t and uint32_t, perhaps int.

There is arguably some ambiguity if you consider 16-bit platforms.

Now, I noticed we have:
INTERFACE Cstddef;

  size_t = Ctypes.unsigned_long;
  ssize_t = Ctypes.long;
  ptrdiff_t = Ctypes.long;

I would like to change this, either to:

32bits:
  size_t = Ctypes.unsigned_int;
  ssize_t = Ctypes.int;
  ptrdiff_t = Ctypes.int;

64bits:
  size_t = Ctypes.unsigned_long_long;
  ssize_t = Ctypes.long_long;
  ptrdiff_t = Ctypes.long_long;

or portable:
  size_t = Word.T;
  ssize_t = INTEGER;
  ptrdiff_t = INTEGER;
but, my question then is, why isn't the portable version already in use?
Especially for the signed types.

I mean, you know, we have:

32bits/BasicCtypes:

INTERFACE BasicCtypes;
IMPORT Word, Long;
TYPE  (* the four signed integer types *)
  signed_char = [-16_7f-1 .. 16_7f];
  short_int   = [-16_7fff-1 .. 16_7fff];
  int         = [-16_7fffffff-1 .. 16_7fffffff];
  long_int    = [-16_7fffffff-1 .. 16_7fffffff];
The question is: why aren't int and long_int INTEGER?

64bits/BasicCtypes:

  long_int  = [-16_7fffffffffffffff-1   .. 16_7fffffffffffffff];
  long_long = [-16_7fffffffffffffffL-1L .. 16_7fffffffffffffffL];

why not INTEGER?



+--------------------------------------------------------+
| Peter P. Eiserloh                                      |
+--------------------------------------------------------+


      


