1. 8
    1. 9

      But while loops are not reducible to for loops:

      Are they not?

      for (;condition;) { … }

      1. 3

        I know Haskell can fuse maps together as an optimization but I don’t think you can safely fuse arbitrary map/filters?

        Check out A Short Cut to Deforestation! It explains how to fuse any sequence of maps and filters together.

        [reduce is] the hardest for people to understand, and Haskell even has three versions of it. Is it confusing because it’s so powerful?

        foldr is the fundamental one. I think the short-cut paper actually implements the other folds in terms of foldr.

        foldr is fundamental because it works by constructor replacement. A list [1,2,3] is really a tree made of Cons nodes with a Nil sentinel at the end: (Cons 1 (Cons 2 (Cons 3 Nil))). If you replace Cons with add and Nil with zero, you get the sum: (add 1 (add 2 (add 3 0))).
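
        A minimal sketch (standard Haskell; sum' is just my name for it): foldr (+) 0 literally substitutes (+) for Cons and 0 for Nil.

        -- foldr as constructor replacement: replace Cons with (+) and Nil with 0
        -- foldr (+) 0 [1,2,3]  ==  1 + (2 + (3 + 0))  ==  6
        sum' :: [Int] -> Int
        sum' = foldr (+) 0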

        The short-cut paper works by saying: “Hey if you’re producing a list, but your caller is just going to sum it… you should call ‘plus’ any time you would have called ‘Cons’.” This is the same trick Lark uses to avoid building an intermediate tree: the framework calls the user’s parsing actions instead of building tree-nodes.
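
        A minimal sketch of that trick (a toy producer of my own, not the paper’s actual foldr/build rule): the producer abstracts over Cons and Nil, so the caller can pass plus and zero and no list is ever allocated.

        -- enumerate n..m, abstracted over which "Cons" and "Nil" to use
        enumFromTo' :: Int -> Int -> (Int -> b -> b) -> b -> b
        enumFromTo' n m cons nil
          | n > m     = nil
          | otherwise = cons n (enumFromTo' (n + 1) m cons nil)

        asList :: [Int]
        asList = enumFromTo' 1 3 (:) []   -- pass the real constructors: [1,2,3]

        asSum :: Int
        asSum = enumFromTo' 1 3 (+) 0     -- pass plus and zero: 6, no list built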

        1. 3

          I know Haskell can fuse maps together as an optimization but I don’t think you can safely fuse arbitrary map/filters?

          Check out A Short Cut to Deforestation! It explains how to fuse any sequence of maps and filters together.

          there are a couple of potential problems with fusion:

          1. if there are side effects, then their order must be preserved (haskell gets around this by not having side effects)
          2. in general, for a graph of stream-processing pipelines, there are essential tradeoffs w.r.t. space usage (see the sketch just below)
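
          a minimal sketch of the space tradeoff in point 2 (my example, not from the thread): one stream, two consumers.

          -- fuse sum with the producer and elements could be freed as they go,
          -- but length still needs xs, so the whole list is retained in memory;
          -- the alternative is to produce xs twice, trading time for space
          mean :: [Double] -> Double
          mean xs = sum xs / fromIntegral (length xs)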

          furthermore, what you frequently want is not per se ‘fusion’ but ‘tiling’ (‘fusion’ is simply projecting a multi-dimensional structure onto a stream-processing problem and then choosing an appropriate order of traversal of that structure; ‘tiling’ adds additional dimensions before choosing how to traverse, which expands the space of expressible schedules)

          of course, none of this is actually a difference between ‘loops’ and higher-order stream-processing functions. a specification is not a schedule

          1. 2

            One more thought about loops vs folds:

            If foldr is so fundamental, why is it backwards?

            To compute a sum imperatively I’d do it left to right:

            def total(input):
                s = 0
                for i in input:
                    s += i
                return s
            

            Whereas foldr would recurse down the list, then start with 0 at the very bottom, and do the additions right-to-left as it returns, because it follows the structure of the input: it finds the sum of every suffix along the way.

            The imperative version also finds the sum of a bunch of sublists: it finds the sum of every prefix along the way. So maybe Snoc lists are a better way to think about imperative arrays: [1,2,3] is ((([] : 1) : 2) : 3). The natural way to fold these trees (constructor replacement) starts with the empty prefix, then goes left to right.
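
            A minimal sketch (standard Haskell; sumLeft is just my name for it): foldl is exactly that left-to-right fold over the snoc view.

            -- constructor replacement on snoc lists, accumulating prefix sums:
            -- foldl (+) 0 [1,2,3]  ==  ((0 + 1) + 2) + 3
            -- (in practice you'd want the strict foldl' from Data.List)
            sumLeft :: [Int] -> Int
            sumLeft = foldl (+) 0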

          2. 3

            (Question: do list comprehensions lead to a new paradigm?)

            List comprehensions desugar to map and filter.
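
            For example (a minimal sketch; evensDoubled is a made-up name):

            -- the comprehension and the map/filter pipeline mean the same thing
            evensDoubled :: [Int] -> [Int]
            evensDoubled xs = [2 * x | x <- xs, even x]
            -- equivalent to: map (2 *) (filter even xs)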

            Map and filter are composable in a way that foreach loops are not, but they’re also weirdly less “composable” in one very specific way? Consider something of the form

            map f (filter p arr)
            -->
            out = []
            for x in arr:
              if p(x):
                out.append(f(x))
            

            The map/filter iterates over the list twice, while the foreach loop iterates only once. I know Haskell can fuse maps together as an optimization but I don’t think you can safely fuse arbitrary map/filters? I dunno.

            But fusion is about removing the boxing/unboxing of the intermediate list. Haskell actually “iterates” over the list once, because of laziness. The whole point of laziness/call-by-need is that it gives you the same-or-better time complexity as an imperative program, while letting you write code with high-level functions that operate on the entire data structure at once.

            (laziness as a way to allow composability with high-level functions goes back to the original “Why Functional Programming Matters” paper)
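
            You can watch the single interleaved pass with a minimal sketch (the Debug.Trace instrumentation is mine, not from the thread):

            import Debug.Trace (trace)

            pipeline :: [Int] -> [Int]
            pipeline xs =
              map (\x -> trace ("map " ++ show x) (2 * x))
                  (filter (\x -> trace ("filter " ++ show x) (even x)) xs)

            -- print (take 1 (pipeline [1..])) prints "filter 1", "filter 2",
            -- "map 2", then [4]: each element flows through filter and map in
            -- turn, and it terminates even on an infinite list.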

            “Iterators” as a general concept.

            Iterators are a restricted, opt-in, form of Haskell’s laziness.
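
            A minimal sketch of that claim (Iter is a hypothetical type, not a real library): an external iterator is something you pull one element at a time, which is exactly the shape of a lazy list.

            -- an explicit iterator type
            data Iter a = Done | Yield a (Iter a)

            -- every list is already such an iterator; laziness means the
            -- elements are only computed when pulled
            fromList :: [a] -> Iter a
            fromList []       = Done
            fromList (x : xs) = Yield x (fromList xs)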

            1. 1

              Given a foreach loop, I can syntactically translate it into a forstep loop:

              This conversion only works for constant-indexed sequences.

              You can’t convert this to a foreach loop: it’s an “irreducible” feature.

              You can, as long as the source (iterator) supports resetting, e.g. you can probably do that with C++’s random-access iterators.

              So much so that you can distinguish different programming paradigms by how they iterate.

              OOP: a foreach method on iterable objects[^python]

              I don’t know what the footnote is supposed to be, but as far as I know this is just plain wrong: it seems to imply OO languages use internal iteration, when most don’t. Functional languages very much do, and so do languages which are neither (e.g. Go’s rangefuncs).