[M3devel] Enumeration or subrange value out of range

Michael Richter ttmrichter at gmail.com
Thu Dec 2 07:36:05 CET 2010


On 1 December 2010 14:49, Jay K <jay.krell at cornell.edu> wrote:


>  >  *not* proper in terms of semantics (operator<< and operator>> being
> obvious cases)
>


> This would not be allowed in my constrained proposal.
>

How would you stop it?  How would you enforce a rule that says "<<" means
that you are somehow shifting something to the left, even if the
representation isn't a binary integer?  How would you enforce a rule that
says "+" must add, not concatenate, subtract, multiply or reset the
computer?


> > but also operator+ for concatenation
>


> This is very reasonable.
>

No, it is not.  The semantics of concatenation are hugely different from the
semantics of addition.  Consider 3+4 vs. 4+3.  Now consider "three"+"four"
vs. "four"+"three".  If you want to do concatenation, have an operator for
concatenation.  (++, say, as a random example.)  Do not overload addition
with an operation that isn't even vaguely analogous.


>  > utterly painful uses for operator,)
>


> I'm surprised that is overloadable, but indeed it appears it is. I don't
> think I have *ever* seen anyone overload it, and I have seen a lot.
>

I would tell you how I used it, but I'm utterly ashamed of that portion of
my life.  ;)  (It was in a type-safe SQL query builder.)


>  > the inability to define *new* operators (which leads to the first
> problem of idiots redefining the semantics of operators)
>


> Stroustrup rejected this as too complex. (See the Design & Evolution book.)
>

Stroustrup is, not to put too fine a point on this, a gibbering lunatic.  He
rejected defining new operators as "too complex" -- yet created a language
that to this day is nigh-impossible to implement according to its
specification (a situation the never-arriving C++1x will only make worse).
 In the meantime, languages far simpler than C++ (syntactically, at least)
have allowed defining new operators at will for ages.  Haskell, off the top
of my head.  Prolog too, if memory serves.

Personally, I'd much rather try to implement a parser for Haskell or Prolog
or languages in that vein than I would a C++ parser.


> I don't see people pine for this often and I suspect he did the right
> thing.
>

People don't pine for it often because people don't pine for capabilities
they don't know exist.  Most people don't pine for hygienic macro systems
either -- until they use them in languages that provide that kind of
capability.  People didn't pine for RTTI in C++ either ... until it became
available in some compilers and other compilers seemed far more constrained.
 People didn't pine for automated memory management ... until Java came
along and made it mainstream.  People didn't pine for multi-threading ...
until everybody and their dog supported it.  Hell, people didn't pine, by
and large, for *operator overloading* until C++ made it mainstream!

What people pine for vs. what people need or at least can truly use are such
radically disjoint sets that I often use what people pine for as a hint for
what to *avoid*.


> It creates a layering problem I believe in the typical structure.
> The lexical analysis would have to get information from higher layers.
>

Or you could go the Haskell route.  Any token consisting purely of symbol
characters is an operator and can be used directly as an operator; to use it
as a regular function you wrap it in parens.  This means I can define, say,
a "UFO operator" like <=+=+=+=> (made deliberately gonzo for fun) and use it
as-is -- Expr1 <=+=+=+=> Expr2 -- or as a two-parameter function call --
(<=+=+=+=>) Expr1 Expr2.  You can go further down the Haskell route and make
any function usable as an operator: myFunc Expr1 Expr2 is equally usable as
Expr1 `myFunc` Expr2.

Can you see how lexical analysis can trivially identify your operator
definitions and usage here?


>  > unexpected costs to operations making the eyeballing of execution
> complexity (time-wise and memory-wise) literally impossible
>


> This is already the case. As I said. So let's say that every single
> function call is shown. It is hard to know which functions have which cost.
> There are also hidden function calls e.g. for "try" and every pointer
> dereference (right?)
>

Operator overloading can multiply these by orders of magnitude and have the
added problem of being, in effect, "COME FROM" statements.  I mean they're
not as bad in this regard as, say, Aspect-Oriented Programming, but they're
still pretty nasty.  The fact that features X and Y have hidden costs is not
really a good argument for adding feature Z that has all those hidden costs
and an order of magnitude more all concealed in such a way that it is almost
impossible to keep track of where things are really going.


> Please consider floating point. Historically floating point was "soft
> float". Sometimes these days it still us. Yet we still have operators for
> floating point.
> Why? Because it is just so convenient and idiomatic. Why stop there?
>

Because at some point the costs outweigh the benefits.  Operator overloading
sits right at the cusp where I'm torn.  It has some major benefits, but it
has major costs, and right now I lean slightly toward thinking the costs
outweigh the benefits.  (A few years ago I leaned slightly the other way,
mind, so it is not a feature I'm going to reject a language over.)


> A primary design point of C++ is to give user defined types all the powers
> of built in types.
>
> No longer does it require a compiler change to introduce a type with the
> "power" of int. And so on.
>

Instead it allows you to introduce a type that looks like an int but has
such wildly varying semantics that it will confuse the ever living daylights
out of people using it.  Like using "+" for concatenation.


> > painful interaction with templates that makes a perfect storm of
> eye-damaging syntax
>


> Huh? Specifically?
>

My C++ days are a decade behind me so I no longer have any source that
illustrates this.  I just recall that any time we had a templated class with
overloaded operators, it turned into an asinine stew of unreadable code.


> The one vague reason I don't fully understand is: C doesn't have it.
> Does C represent a good example of a sort of minimalism? Maybe.
> It isn't clear to me the value of C. It has been *very* widely abandoned in
> favor of C++.
>

What is the FFI lingua franca again?  C or C++?  (Hint: one of those two
languages has syntax to make it compatible with the other, but not vice
versa.)

-- 
"Perhaps people don't believe this, but throughout all of the discussions of
entering China our focus has really been what's best for the Chinese people.
It's not been about our revenue or profit or whatnot."
--Sergey Brin, demonstrating the emptiness of the "don't be evil" mantra.