1. 20
  1.  

  2. 5

    Null check removal, and actually the elision of any comparison, can be so surprising that I’d rather the compiler tell me that it is doing this and tell me to remove the comparison (or fix other assumptions) instead of silently removing it.

    1. 3

      It might be worth fighting this lunacy by showing that the performance benefits are negligible. Completely ignoring the safety/unexpectedness aspect, I feel like these optimizations are really worthless anyway. If null checks after memcpy were a bottleneck in my programs, I’d probably notice.

      1. 1

        God save us from overeager PhDs^W compiler contributors.

        1. 1

          I’m sure that 7.1.4 seemed quite reasonable in isolation, but how does it interact with the case where memcpy is called with a zero length? If you read 7.24.2.1 then you might well think that, since the function copies zero bytes, it’s valid to pass NULL as either of the pointer arguments.

          Well you might think that, but you’d be wrong, because 7.1.4 very clearly says the behaviour is undefined.
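
          A minimal illustration, assuming dst and src are otherwise valid pointers:

          memcpy(dst, src, 0);    /* copies nothing; fine                              */
          memcpy(dst, NULL, 0);   /* also copies nothing, but undefined per 7.1.4,     */
                                  /* since NULL is not a valid pointer argument here   */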

          1. 4

            Is that not, like, the very next sentence of the article?

            1. 2

              Hmmm, I think I completely misunderstood his point. Looking closer, it seems like his point is that if you try to do null pointer checks after a call to memcpy etc., they’ll be optimized away, since your call to memcpy was already undefined behaviour and nasal demons of that nature are just fine. Hmmm.
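
              In other words, given something like this hypothetical fragment (frob is a made-up name), a compiler that exploits 7.1.4 is allowed to assume dst is non-null after the call and throw the check away:

              #include <string.h>

              void frob(char *dst, const char *src, size_t n) {
                      memcpy(dst, src, n);   /* undefined if dst or src is NULL, even when n == 0 */
                      if (dst == NULL)       /* ...so the compiler may assume dst != NULL here    */
                              return;        /* ...and delete this whole check and branch         */
                      dst[0] = '\0';
              }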

          2. [Comment removed by author]

            1. 2

              It can be done in C++. It can still dangle, but that is another problem.

              1. 2

                Clang has the __nullable and __nonnull annotations; I don’t know whether the optimizer takes them into account, however.
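
                For what it’s worth, a sketch of how they look on a declaration (copy_bytes is a made-up name; whether the optimizer actually exploits the annotation is the open question):

                #include <stddef.h>

                /* _Nonnull / _Nullable are the documented Clang spellings; __nonnull /
                   __nullable are older synonyms. Clang can at least warn at compile
                   time (e.g. -Wnonnull) when a literal NULL is passed for dst. */
                void copy_bytes(void *_Nonnull dst, const void *_Nullable src, size_t n);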

              2. 1

                My first thought was, this optimization is wrong, because “undefined behaviour” includes “memcpy returns successfully, and the long awaited nasal dæmons never arrive”, thus you can’t assume the pointer is valid at that point. But then I thought, maybe it’s actually correct, and “undefined behaviour” includes “nasal dæmons arrive some time after memcpy returns”. Which interpretation is right?

                I’d suggest that […] the next revision of the C standard change 7.24.1(2) to clarify that when a length is zero, pointers can be NULL.

                I disagree. A not unreasonable implementation of memcpy (e.g., on an architecture where top address lines are not connected, so hardware may not catch all “illegal” addresses) may start with:

                /* Assuming MINADDR > 0, a NULL src or dst fails this range
                   check even when n == 0, so it already "catches" the
                   zero-length NULL case. */
                if (n > MEMSIZE ||
                    src < MINADDR || src + n > MINADDR + MEMSIZE ||
                    dst < MINADDR || dst + n > MINADDR + MEMSIZE) {
                        launch_nasal_daemons();
                }
                

                This change would forbid such implementations.

                1. 2

                  “Nasal demons” are defined by LLVM in terms of a concept called “poison”: in simplified terms, poison represents operations that Provably Cannot Occur (either by proof, language definition, or both). Thus, the compiler can assume any path which leads to poison cannot happen. ex: this loop counter uses signed arithmetic, so we know it won’t wrap, so we can assume any control flow path that leads to the loop counter wrapping Cannot Occur.

                  Thus the impact of poison spreads throughout a program; anything in a program that can provably lead to poison can be elided.
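
                  The loop-counter case looks roughly like this (do_work and handle_wrap are made-up names):

                  for (int i = 0; i <= n; i++) {   /* i is signed, so overflow is undefined    */
                          if (i + 1 < i)           /* "i is about to wrap" therefore Provably  */
                                  handle_wrap();   /* Cannot Occur, and the whole branch may   */
                                                   /* be deleted by the optimizer              */
                          do_work(i);
                  }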

                  Not, of course, saying that leveraging this for such subtle C semantics is a good idea.

                  1. 1

                    Things that Provably Cannot Occur sound like dead code, not undefined behaviour, which is more like Things That Can Lead To Anything.

                    1. 6

                      Undefined behavior Cannot Occur, thus anything that leads to undefined behavior Cannot Occur. That’s how LLVM models it.

                      “Can lead to anything” is the simple explanation: the compiler is free to assume that undefined behavior will not happen (even if it does), and optimize your program accordingly (which may lead to it doing pretty much anything in the case where undefined behavior actually did occur). In theory a hypothetical compiler could actually “do anything”, but the way LLVM models it is that it assumes undefined behavior does not happen.

                      Dead code isn’t the same thing: “dead code” is code which cannot be reached. Poison is code which may be reachable, but because it’s poison, we are allowed to assume that any path which reaches it is a priori impossible.
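
                      A tiny sketch of the difference (ratio is a hypothetical function):

                      #include <stdio.h>

                      int ratio(int a, int b) {
                              if (0)              /* dead code: provably unreachable, full stop */
                                      puts("never");
                              int r = a / b;      /* division by zero is undefined...           */
                              if (b == 0)         /* ...so any execution that reaches this test */
                                      return 0;   /* must have b != 0; the branch is reachable  */
                              return r;           /* in the source, yet may be assumed away     */
                      }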

                      1. 4

                        Thanks. Does Not Happen is definitely a better way to think about undefined behavior than Anything Can Happen.

                        1. 4

                          I think a lot of people are misled by Anything Can Happen; they assume the compiler spots undefined behavior, then vomits on your code. It’s actually much the opposite: the compiler blindly assumes the undefined behavior can’t happen, and if you happened to rely on it, unexpected things may happen.

                  2. 1

                    Such hypothetical implementations can be fixed by adding a check for n == 0.

                    Note that undefined behavior can flow backwards. If your program is undefined at some point, it’s also undefined at all prior points that inevitably reach that point.

                    1. 1

                      Sure, they can be fixed; I just don’t think it’s worth redefining the current implementation as broken.

                      Backwards? Shock horror. Well, I can understand it’s useful for code re-arrangement. But this makes it legal to optimize out the print-out in this code fragment:

                      if (dst == NULL)
                              puts("gonna crash");
                      memcpy(dst, src, n);
                      
                      1. 3

                        This is legal to optimize out because puts() is guaranteed to return (iirc, it’s a function attribute), so the poison is guaranteed to be reached.

                        However, I don’t believe this can legally be optimized:

                        if (dst == NULL)
                                unknown_function("gonna crash");
                        memcpy(dst, src, n);

                        because unknown_function could call exit() or such instead of continuing on.
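
                        Concretely, substituting a call that definitely does not return shows why: the memcpy is no longer inevitable on the NULL path, so there is nothing to propagate backwards:

                        if (dst == NULL)
                                exit(1);          /* never falls through to the memcpy, */
                        memcpy(dst, src, n);      /* so the test above has to stay      */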

                        1. 3

                          The C compiler must create a program which produces the same result as if it were executed on the abstract C machine. BUT ONLY if the source is a valid program. Invalid programs do not produce defined results on the abstract machine, and therefore the compiler can do anything it wants.

                          One can imagine an implementation that prints output to paper tape and catches fire if it derefs null, burning the tape and therefore ultimately producing no output.