1. 10

  2. 4

    This is an outright fun paper. It’s serious (obviously Knuth cares about the subject) but not too serious. Instead of being laser-focused, it’s happy to just sit the reader down and say “here, let me show you some reasons why I think this works out well”, and isn’t afraid to make a few digressions along the way. I don’t mind a bit of formalism, but a holistic style is good for making math accessible to people who aren’t too deep into any particular rabbit-hole.

    1. 2

      So Knuth didn’t adopt another of Iverson’s conventions, that ‘0 ÷ 0 = 1’?

      1. 2

        I’m simultaneously educated and greatly confused.

        If α and β are arbitrary entities and ℝ is any relation defined on them, the relational statement αℝβ is a logical variable which is true if and only if α stands in the relation ℝ to β. For example, if x is any real number, then the function (x > 0) - (x < 0) assumes the values 1, 0 or -1 according as x is strictly positive, 0 or strictly negative.

        Both these sentences make sense in isolation, but I don’t understand how they connect. What are α, β and ℝ in the example?

        I understand most of the next 7 pages. But I don’t understand how the beginning connects up with them, and why Knuth keeps referring to this notation as “Iverson’s convention.”

        1. 3

          Kenneth Iverson, of APL and J fame, invented the convention of using 1 to represent true and 0 to represent false, and then allowing all the ordinary integer operations on them. This is still how both J and APL work today. The second sentence from your quote shows how this convention allows you to easily define the “signum” operator on any x as (x > 0) - (x < 0). In what follows, Knuth lists other (unexpected but happy) advantages of this notation.
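
          A minimal sketch of that definition in Python (not APL or J, but Python’s bool is a subclass of int, so comparisons can be used in arithmetic the same way):

          ```python
          def signum(x):
              """Iverson-convention signum: each comparison evaluates to 0 or 1."""
              # (x > 0) contributes 1 for positive x, (x < 0) contributes 1 for negative x.
              return (x > 0) - (x < 0)

          assert signum(3.5) == 1
          assert signum(0) == 0
          assert signum(-2) == -1
          ```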

          Does that answer your question?

          1. 3

            Ohh, I elided the “(equal to 1)” when transcribing, but it’s load-bearing. Thanks!

          2. 3

            The expression (x > 0) - (x < 0) actually contains two examples. In the subexpression x > 0, we have α = x, β = 0, and ℝ is the relation “greater than” on numbers. In the subexpression x < 0, we have α = x, β = 0, and ℝ is the relation “less than” on numbers.
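
            As a toy illustration of the general αℝβ pattern (a hedged Python sketch, not anything from Knuth’s paper; the helper name is made up), any relation ℝ passed as a predicate yields a logical 0/1 value:

            ```python
            import operator

            def relational(alpha, R, beta):
                """Relational statement: 1 if alpha stands in the relation R to beta, else 0."""
                return int(R(alpha, beta))

            # The two subexpressions of the signum example, spelled out:
            x = -7
            assert relational(x, operator.gt, 0) - relational(x, operator.lt, 0) == -1
            ```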

            1. 2

              Thank you!

          3. 1

            I have trouble convincing mathematicians that these little notation improvements matter and will make our lives easier in the long run.

            1. 3

              Have you tried recommending Iverson’s “Notation as a Tool of Thought”?

              1. 1

                No, but if I did, they wouldn’t care. Mathematicians put a lot of thought into how to convey abstract ideas efficiently, but much less into how to make those boring and tedious calculations less error-prone. Also, the idea that a well-chosen notation can lessen the mental burden of performing tricky calculations is basically heresy.

                1. 2

                  “The idea that a well-chosen notation can lessen the mental burden of performing tricky calculations is basically heresy.”

                  Why is that?

                  1. 2

                    Because, supposedly, once you get the “big ideas” (granted, ideas are a pretty big deal in mathematics), you would not be bothered with such “trifles” as

                    • “Your formulas are peppered with +1s and -1s, making them an invitation to make off-by-one errors.” “Who cares, dealing with calculation details is routine.”

                    • “Your definition works in all cases except the trivial one.” (e.g., it works for all sets except the empty set; see the sketch below) “Who cares, the trivial case is boring.”

                    Fortunately, this attitude will change when more mathematicians learn to program. It will inevitably happen.
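
                    To make the trivial-case complaint concrete, here is a small hedged Python sketch (the names are hypothetical, not from the thread or the paper): with an Iverson-style 0/1 bracket, the empty case needs no special handling, because an empty sum is simply 0.

                    ```python
                    def bracket(p):
                        """Iverson bracket [P]: 1 if the condition P holds, 0 otherwise."""
                        return 1 if p else 0

                    def count_multiples(xs, d):
                        # No special case for an empty xs: every term is 0 or absent,
                        # and the empty sum is 0, which is exactly the right answer.
                        return sum(bracket(x % d == 0) for x in xs)

                    assert count_multiples([], 3) == 0          # the "trivial case" just works
                    assert count_multiples([3, 5, 9, 10], 3) == 2
                    ```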

                    1. 2

                      Interesting. I’m sympathetic to prioritizing big ideas and hand-waving over piddly details.

                      But imo that is precisely the reason to care about notation advances like the ones Knuth is promoting. They allow you to spend less time thinking about those unimportant details, both when reading and writing mathematical formulas.