1. 10
  1.  

  2. 3

    These look modest in size but very definitely not modest in impact.

    1. 3

      “modest proposal” is an ironic phrase alluding to https://en.wikipedia.org/wiki/A_Modest_Proposal.

      1. 2

        A great text!

        I’ve always considered it as the original inspiration for the old propaganda about Communists eating children.

        I wonder how far one could go writing a similar paper, in LaTeX, today.

        It’s definitely a hack worth trying.

    2. 1

      I think the core problem is that this assumes compiler writers can anticipate, predict, and control the behaviour of the code they generate for all “undefined behaviours”.

      Even worse, the very definitions of …

      “ignores” the situation completely

      and

      “behaves in a manner characteristic of the environment”

      …are vague beyond usefulness.

      Those are not definitions that you could implement and test for compliance and rely on in portable programs.

      Those are a handful of hand-waving examples.

      If you could reframe that into rules that you could test for compliance against…. maybe.

      I think by the time you have something that a portable program could rely on…. it will no longer be undefined behaviour.

      Take for example that simplest of functions.

      Calculate the square root of float x.

      There are probably thousands of implementations of this around the world.

      Let’s break it into two parts.

      An outer function that checks for a negative argument and returns something defined: e.g. NaN, an error code, a thrown exception, or the like.

      And an inner function inner_sqrt(float) that assumes that check has already been made and has a non-negative float to work with.
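      A minimal C sketch of that split (the name checked_sqrt is illustrative, and sqrtf stands in for whatever hand-rolled implementation the inner function would actually contain):

      ```c
      #include <math.h>   /* sqrtf, NAN */

      /* Inner function: assumes the caller has already ruled out
       * negative inputs. Passing a negative value here is the
       * caller's bug, not this function's problem. */
      static float inner_sqrt(float x)
      {
          return sqrtf(x);  /* stand-in for a real hand-rolled routine */
      }

      /* Outer function: validates the argument and returns a
       * defined value (NaN) for the invalid case. */
      static float checked_sqrt(float x)
      {
          if (x < 0.0f)
              return NAN;
          return inner_sqrt(x);
      }
      ```

      The point of the split is that the precondition is checked exactly once, at the boundary; everything inside inner_sqrt is free to assume it holds.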

      Now assuming….

      • I am a wondrous smart ass,
      • and assuming I believe my code to be perfect so I don’t need the outer check
      • since I’m perfect and never create negative parameters
      • and since I’m invoking this in an inner loop I elect to avoid the unneeded check

      … I can elect to invoke the inner_sqrt directly.

      Now suppose I’m less smart than I believe myself to be, and do indeed invoke “inner_sqrt(-1.0)”.

      So what behaviour can I rely on for my stupidity?

      Ignore it? Well, some implementations will crash, some will go into an infinite loop, some will produce NaNs, some will produce random garbage.

      What exactly do you mean by “ignore it”? Do the check again? So you have an inner and an outer function that both do the same check?

      Document it? So you expect the implementers of inner_sqrt() to have a regression test suite that checks that the behaviour in response to each class of possible stupid is as documented for that particular class? (The sqrt example is simple; a more complex example may have hundreds of possible classes of stupid!)

      Terminate with an error message? What!? Do the check again? That’s what the outer function was for! You elected, in your infinite wisdom, to optimise by invoking the inner function without the check, because you’re perfect and don’t write bugs!

      The core here is “undefined” means just that. It is not defined, nor can it be.

      If you, in your God-given power as Lord High Absolute Programmer, elect to override the protections given to you by the CPU or compiler or library implementer and do something stupid… there is nothing a compiler or anyone else can do to save you from your stupidity.

      Nor, I would argue, should it.

      In the case of signed int overflow, I would argue rather that if there is a flaw in the standard, it is that the standard elected to give us no protections that we could opt out of.

      Yes, I can see that checking every signed int overflow has a high cost, and I can foresee cases where I personally would indeed opt out of such protections.

      Alas, the state of C/C++ is that I cannot even opt in to such protections. (Although I can pull in bignum libraries if I choose…. but if I choose to abuse those, they will also bite me.)
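      For what it’s worth, GCC and Clang do offer something like an opt-in today via their checked-arithmetic builtins (a sketch; add_checked is a hypothetical wrapper name, __builtin_add_overflow is the real builtin):

      ```c
      #include <stdbool.h>

      /* Opt-in overflow check using the GCC/Clang builtin.
       * __builtin_add_overflow returns true if the mathematically
       * correct sum does not fit in *out; the sum is computed with
       * wrap-around semantics regardless, so no UB occurs. */
      static bool add_checked(int a, int b, int *out)
      {
          return !__builtin_add_overflow(a, b, out);
      }
      ```

      There are also whole-program opt-ins along these lines: -fwrapv gives signed overflow defined wrap-around semantics, -ftrapv traps on it, and -fsanitize=signed-integer-overflow diagnoses it at runtime. None of that is portable C, of course, which is rather the commenter’s point.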

      1. 1

        Unfortunately, some implementations are interpreting undefined behavior to license arbitrary transformations

        I don’t think this is true. The transformations which exploit undefined behaviour for optimisation are not arbitrary but limited: they cannot change the semantics of a conforming program.

        As an example, some implementations will roll-over on arithmetic overflow, but may delete programmer checks for roll-over as an “optimization”. The wording of the Standard does not support this interpretation: “possible undefined behavior ranges from”

        • ignoring the situation completely with unpredictable results

        The post argues that wrapping integer arithmetic (a natural consequence of two’s complement representation) while optimising out checks for that roll-over (since if it occurred, the program had undefined behaviour) is not the same as “ignoring the situation completely with unpredictable results”.

        It seems to me that “ignoring the situation completely with unpredictable results” exactly describes this common compiler behaviour, though. That is, the compiler removes the wrapping check as an optimisation, based on the assumption that the condition it checks can’t legally be true, and then ignores the case where the wrapping does occur — with the unpredictable result that the subsequent check apparently ceases to evaluate correctly.
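        To make that concrete, here is the classic shape of the check that gets deleted, next to a well-defined rewrite (function names are illustrative):

        ```c
        #include <limits.h>
        #include <stdbool.h>

        /* UB version: if a == INT_MAX, then a + 1 overflows, which is
         * undefined behaviour in C. The compiler may therefore assume
         * the condition is always false and delete the branch, even
         * though the hardware would happily wrap. */
        static bool will_overflow_ub(int a)
        {
            return a + 1 < a;   /* relies on wrap-around: undefined */
        }

        /* Well-defined version: tests the precondition before doing
         * the arithmetic, so no signed overflow ever occurs and the
         * compiler cannot legally remove the check. */
        static bool will_overflow_ok(int a)
        {
            return a == INT_MAX;
        }
        ```

        The first form “works” at -O0 on two’s complement hardware and silently stops working under optimisation; the second form works everywhere, which is the safe way to write the programmer’s intent.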