1. 11
  1.  

  2. 7

    I don’t get it, is this something that only matters for OOPy languages? Kind of Design Patternish, isn’t it? That book always struck me as trying to patch Java.

    I mean, I understand you could emulate what they’re talking about in languages other than Java and C#, but… huh? Do we really need to avoid if blocks that badly?

    1. 3

      I’ve done the same sort of thing in Scheme (and other non-“OO” languages) in the past. Rather than an if block, use polymorphic functions supplied by the caller. It takes a little getting used to for the reader, but the testability of the code is much higher. I’ve also done the same thing in other languages using various forms of type-level programming.
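
      That trick translates to most languages. Here’s a minimal C sketch of the same move, using a supplied function pointer instead of an if block (all names here are hypothetical, just for illustration):

      ```c
      #include <stdio.h>

      /* Instead of branching on a customer tag inside bill(), the caller
       * supplies the pricing behaviour directly. */
      typedef long (*fee_fn)(long cents);

      static long standard_fee(long cents) { return cents / 20; }  /* 5% */
      static long premium_fee(long cents)  { return cents / 100; } /* 1% */

      /* No `if` here: bill() just calls whatever it was handed, which also
       * makes it trivial to test with a stub fee function. */
      static long bill(long cents, fee_fn fee) {
          return cents + fee(cents);
      }

      int main(void) {
          printf("%ld\n", bill(10000, standard_fee)); /* 10500 */
          printf("%ld\n", bill(10000, premium_fee));  /* 10100 */
          return 0;
      }
      ```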

    2. 3

      Pretty sure the more recent anti-if articles were about branchless processing. That cuts out branch-prediction leakage, branch-based vulnerabilities (Spectre and Meltdown), branch cache misses, and more.

      Pretty sure switch statements, adding bools, and the like are just variants of “if”, which have all the problems associated with branching.

      An article going further in depth: https://hbfs.wordpress.com/2008/08/05/branchless-equivalents-of-simple-functions/
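
      For a flavour of what that article covers, here are two classic branchless idioms sketched in C (my own examples, not taken from the article):

      ```c
      #include <stdint.h>
      #include <stdio.h>

      /* min(x, y) without a conditional jump: -(x < y) is all-ones when
       * x < y, so the mask selects x's bits, otherwise y's. The comparison
       * compiles to a flag, not a branch. */
      static int32_t branchless_min(int32_t x, int32_t y) {
          return y ^ ((x ^ y) & -(int32_t)(x < y));
      }

      /* |v| without a branch. Assumes arithmetic right shift of negative
       * ints (implementation-defined in C, but true on mainstream compilers). */
      static int32_t branchless_abs(int32_t v) {
          int32_t mask = v >> 31;      /* 0 for positive, -1 for negative */
          return (v + mask) ^ mask;
      }

      int main(void) {
          printf("%d %d\n", branchless_min(3, 7), branchless_abs(-42)); /* 3 42 */
          return 0;
      }
      ```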

      1. 2

        I think I was on board until pattern 4, when I realized that if you start with deliberately terrible code, rewriting it in any alternative style will be an improvement. But is that really the best effort an if/else programmer could make? Or is it a straw man?

        1. 1

          Pattern 4 looks like code golfing or writing for the computer instead of human readers. The “before” code looks shitty, but I don’t need to recall operator precedence rules or model a truth table in my head to understand what it does.

        2. 2

          Thinking Forth devotes an entire chapter to the “anti-if” pattern (Chapter 8, for the curious). Even though it’s Forth, the advice it gives is solid, and it’s one of the few books that actually changed how I program (even though I never program in Forth).

          1. 2

            This problem is solved beautifully by multimethods in Clojure. For example, you could do:

            (defmulti get-speed :type)
            
            (defmethod get-speed :european [bird]
              (:speed bird))
            
            (defmethod get-speed :african [bird]
              (- (:speed bird) (:load-factor bird)))
            
            (defmethod get-speed :norwegian-blue [bird]
              (if (:nailed bird) 0 (:speed bird)))
            
            (defmethod get-speed :default [_]
              0)
            

            The really handy part is that the multimethod is open, so anybody can implement their own defmethod for a new case without having to change any existing code. This makes it a really useful tool for providing extensible library APIs. I use this pattern in my reagent-forms UI component library, which provides an init-field multimethod that users can extend to implement their own custom widgets.
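
            For comparison, the same open-dispatch idea can be sketched in C with a runtime registry mapping a type tag to a function, so new cases get registered without touching existing dispatch code (a rough analogue only; all names are hypothetical):

            ```c
            #include <stdio.h>
            #include <string.h>

            typedef struct { const char *type; int speed; int load_factor; } bird;
            typedef int (*speed_fn)(const bird *);

            /* The registry plays the role of defmulti's dispatch table. */
            static struct { const char *type; speed_fn fn; } methods[16];
            static int n_methods;

            static void defmethod(const char *type, speed_fn fn) {
                methods[n_methods].type = type;
                methods[n_methods].fn = fn;
                n_methods++;
            }

            static int get_speed(const bird *b) {
                for (int i = 0; i < n_methods; i++)
                    if (strcmp(methods[i].type, b->type) == 0)
                        return methods[i].fn(b);
                return 0; /* the :default case */
            }

            static int european(const bird *b) { return b->speed; }
            static int african(const bird *b)  { return b->speed - b->load_factor; }

            int main(void) {
                defmethod("european", european);
                defmethod("african", african); /* later code can register more */
                bird a = { "african", 20, 5 };
                printf("%d\n", get_speed(&a)); /* 15 */
                return 0;
            }
            ```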

            1. 1

              So one place anti-if can go is the one-instruction set computer (OISC). If that single instruction is implemented branchlessly, you guarantee fully branchless computing. Otherwise, you can do minimum-branching computing. See: https://esolangs.org/wiki/OISC
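
              For the curious, SUBLEQ (“subtract and branch if less than or equal to zero”) is the usual OISC example, and an interpreter for it fits in a few lines of C. This is a minimal sketch; real OISC encodings vary:

              ```c
              #include <stdio.h>

              /* Each instruction is three cells: a, b, c.
               * Semantics: mem[b] -= mem[a]; jump to c if the result <= 0,
               * else fall through. A negative program counter halts. */
              static void run_subleq(int *mem, int pc) {
                  while (pc >= 0) {
                      int a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
                      mem[b] -= mem[a];
                      pc = (mem[b] <= 0) ? c : pc + 3;
                  }
              }

              int main(void) {
                  /* Instr 0: mem[6] -= mem[7] (10 - 3), then continue to 3.
                   * Instr 3: mem[8] -= mem[8] (always 0), so jump to -1: halt. */
                  int mem[] = { 7, 6, 3,   8, 8, -1,   10, 3, 0 };
                  run_subleq(mem, 0);
                  printf("%d\n", mem[6]); /* 7 */
                  return 0;
              }
              ```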

              This sort of consideration may seem academic. But there is actually another place it comes in: GPU computing (basically SIMD), where every branch has a significant cost. However, if you implement an interpreter for a minimum-instruction-set language, then you can run distinct interpreters on each thread separately at low cost.

              This is more or less the principle behind H. Dietz’s MOG (“MIMD On GPU”), a system that can compile “arbitrary” parallel code to run on a GPU with only a x6-1 slowdown. (Unfortunately, the project is frozen in beta for lack of funding.)

              See: http://aggregate.org/MOG/