1. 14

  1. 4

    I found these slides while wandering around (in a daze) after watching this video: https://www.youtube.com/watch?v=w-I6XTVZXww

    It seemed completely baffling that a sum like 1 + 2 + 3 + ... could yield -1/12. After reading Everything and More, it is as interesting to hear people’s reactions to notions of infinity as it is to ponder the mathematics itself. I’m also less willing to trust my intuition when it comes to infinity. The Wikipedia page about the sum of the natural numbers starts off easily enough, but then veers off into some really heady reasoning by Ramanujan and finally pulls in the zeta function.

    I just wanted to share this interesting gem with others.

    1. 6

      FWIW, I think it’s a mistake to think of the equation

      1+2+3+… = -1/12

      as relating to notions of or intuitions about infinity. Rather, we begin by studying the complex-valued function

      \sum_n n^{-s}.

      Where is this defined? Whenever the series converges, i.e., when the real part of s is greater than 1. Next, by a process called analytic continuation, we define a function, the Zeta function, which has the property that

      Zeta(s) = \sum_n n^{-s}

      whenever Re(s) > 1. A key fact of complex analysis is that any two smooth (complex-differentiable) complex functions which agree on any disk, no matter how small, must be equal everywhere they are defined. This tells us that there is only one possible extension of our series to the entire complex plane, namely the zeta-function we defined.

      Finally, one can show that Zeta(-1) = -1/12. If we were to plug in s = -1 to our series, we’d get

      1+2+3+…=-1/12,

      giving the “formula” we wanted, but this doesn’t have much to do with notions of infinity. It’s just a way to assign a value to a divergent series.
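
      If you want to see both halves of that numerically, here is a quick sketch in Python using mpmath (which implements the analytically continued zeta function); treat it as an illustration, not part of the argument:

      from mpmath import mp, zeta, nsum, inf

      mp.dps = 30  # working precision

      # For Re(s) > 1 the series and zeta agree:
      s = 2
      print(nsum(lambda n: n**(-s), [1, inf]))  # 1 + 1/4 + 1/9 + ... ~ 1.6449...
      print(zeta(2))                            # same value, pi^2/6

      # At s = -1 the series diverges, but the continuation still gives a finite value:
      print(zeta(-1))                           # -0.08333... = -1/12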

      P.S.: Thanks for posting this — it’s great to see the string-theory perspective on the number theory.

      1. 0

        That does it. Math is broken.

        Edit: Indeed it is broken! From Wikipedia:

        In particular, the step 4c = 0 + 4 + 0 + 8 + · · · is not justified by the additive identity law alone. For an extreme example, appending a single zero to the front of the series can lead to inconsistent results. [1]

        That paragraph is followed by some insanity, then this:

        A summation method that is linear and stable cannot sum the series 1 + 2 + 3 + … to any finite value. (Stable means that adding a term to the beginning of the series increases the sum by the same amount.) […] The methods used above to sum 1 + 2 + 3 + … are either not stable or not linear.
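
        The standard argument behind that sentence is short; here is a sketch, assuming only that the method S is linear and stable and assigns 1 + 2 + 3 + … some finite value x:

        S(1 + 2 + 3 + 4 + …) = x
        S(0 + 1 + 2 + 3 + …) = 0 + x = x                           (stability: prepending a 0 adds 0)
        S(1 + 1 + 1 + 1 + …) = x - x = 0                           (linearity: subtract the two series term by term)
        S(1 + 1 + 1 + 1 + …) = 1 + S(1 + 1 + 1 + …) = 1 + 0 = 1    (stability again)

        So 0 = 1, a contradiction. Anything that does assign 1 + 2 + 3 + … a value has to give up stability or linearity, which is exactly what the quoted sentence says.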

        The idea that 1 + 2 + 3 plus more non-zero numbers sums up to something less than zero is simply utter nonsense.

        Wikipedia has yet another gem:

        In the primary literature, the series 1 + 2 + 3 + 4 + ⋯ is mentioned in Euler’s 1760 publication De seriebus divergentibus alongside the divergent geometric series 1 + 2 + 4 + 8 + ⋯. Euler hints that series of this type have finite, negative sums, and he explains what this means for geometric series, but he does not return to discuss 1 + 2 + 3 + 4 + ⋯. In the same publication, Euler writes that the sum of 1 + 1 + 1 + 1 + ⋯ is infinite.

        So there you go, 1 + 2 + 3 + … is a negative number, but 1 + (1 + 1) + (1 + 1 + 1) + … is infinite.

        Right. Yeah.

        If some system of math tells you otherwise, I think it’s more likely the case that there’s some sort of logical error in reasoning within that system.

      2. 2

        This is a joke? I looked at 1 + x + x^2 + … = 1/(1-x), which I know to be wrong. Then the next slide has its derivative with the minus sign missing, and it gets more bizarre after that.

        1. 3

          No, it’s all correct. 1 + x + x^2 + … is indeed 1/(1-x), and differentiating picks up an extra - sign from the -x, so the minus sign you expected to see cancels out.

          1. 1

            It’s not true for all values of x, for sure; x = 2, for example. Is the restriction -1 <= x <= 1?

            You are right about d/dx (1-x)^-1. I did (1+x) in my head by mistake.

            1. 3

              It’s an equality of formal power series. The left-hand side converges for |x|<1. Moonshine theory is indeed quite bizarre, though.
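
              A quick sanity check of that in plain Python (just partial sums, purely illustrative):

              x = 0.5
              geom = sum(x**n for n in range(200))             # 1 + x + x^2 + ...   -> 2.0
              deriv = sum((n + 1) * x**n for n in range(200))  # 1 + 2x + 3x^2 + ... -> 4.0
              print(geom, 1 / (1 - x))        # both 2.0  (= 1/(1-x))
              print(deriv, 1 / (1 - x) ** 2)  # both 4.0  (= 1/(1-x)^2)
              # For x = 2 the partial sums just keep growing, so the identity only
              # holds inside the radius of convergence |x| < 1.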

              1. 1

                Wow! Thanks for the link. It’s interesting that we can have a restricted equality! I must have been taught this at some point, but it’s amazing for me to see this today.

        2. 2

          I’m stuck on this slide:

          Then Euler considered this function:

          ζ(s) = 1^{-s} + 2^{-s} + 3^{-s} + 4^{-s} + · · ·

          He multiplied by 2^{−s}:

          2^{−s} ζ(s) = 2^{-s} + 4^{-s} + 6^{-s} + 8^{-s} + · · ·

          Shouldn’t the result of that be: 2^{-2s} + 4^{-2s} + 6^{-2s} + 8^{-2s} + · · · ?

          The product of 1^{-s} and 2^{-s} is not 2^{-s}.
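
          Writing the multiplication out term by term, the step on the slide seems to rest on the identity a^{-s} · b^{-s} = (a·b)^{-s}, applied to each term of the series:

          2^{-s} ζ(s) = 2^{-s}·1^{-s} + 2^{-s}·2^{-s} + 2^{-s}·3^{-s} + 2^{-s}·4^{-s} + · · ·
                      = (2·1)^{-s} + (2·2)^{-s} + (2·3)^{-s} + (2·4)^{-s} + · · ·
                      = 2^{-s} + 4^{-s} + 6^{-s} + 8^{-s} + · · ·

          since 1^{-s} = 1, and, e.g., 2^{-s}·2^{-s} = 2^{-2s} = (2^2)^{-s} = 4^{-s}.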