    No, it doesn’t: at least, not in conventional mathematics. The OP is right that posting this “fact” without explanation creates the impression that math is “deep magic” rather than a system of logic where everything requires proof.

    This is more of a mathematical joke or pun that has been taken too literally. The sum 1 + 2 + 3 + … diverges: its partial sums grow without bound, so there is no value for it to approach, and it has no value in classical real analysis, in the same way that sin(∞) has no value. Sometimes you can “patch” mathematics with extended definitions that make sense on one domain but not on others. For example, in complex analysis it’s useful to treat 1/0 as an unsigned infinity (the point at infinity) and redefine continuity accordingly (the Riemann sphere), and this works well. In other areas of mathematics, this handling of 1/0 doesn’t work.
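
    As a quick sanity check of the divergence claim, here is a small Python sketch (the helper name `partial_sum` is mine, just for illustration) showing that the partial sums of 1 + 2 + 3 + … blow up rather than settle toward any value, let alone -1/12:

    ```python
    def partial_sum(n):
        """S_n = 1 + 2 + ... + n, in closed form n(n + 1) / 2."""
        return n * (n + 1) // 2

    # The partial sums grow like n^2 / 2 -- there is no finite limit.
    for n in (10, 100, 10**6):
        print(n, partial_sum(n))  # 55, 5050, 500000500000
    ```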

    It’s correct that Zeta(-1) = -1/12, but the accurate conclusion is that the identity Zeta(s) = 1/(1^s) + 1/(2^s) + … holds only where the sum converges, i.e. for Re(s) > 1. Elsewhere, Zeta is defined by analytic continuation, not by that sum. The zeta function is defined everywhere except at s = 1 (a pole), but the sum is not.
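
    To make the distinction concrete, here is a sketch that evaluates the continuation without any divergent sum, using Hasse’s globally convergent series for Zeta(s) (valid for all s ≠ 1); the function name `zeta` and the term count are my choices for illustration:

    ```python
    from math import comb

    def zeta(s, terms=30):
        """Riemann zeta via Hasse's globally convergent series
        (converges for every s != 1, so it evaluates the analytic
        continuation directly -- no divergent series involved)."""
        total = 0.0
        for n in range(terms):
            inner = sum((-1)**k * comb(n, k) * (k + 1)**(-s)
                        for n_k in [k] for k in range(n + 1))
            total += inner / 2**(n + 1)
        return total / (1 - 2**(1 - s))

    print(zeta(-1))  # -0.0833... = -1/12, from the continuation
    print(zeta(2))   # 1.6449... = pi^2/6, where the sum also converges
    ```

    At s = 2 this agrees with the ordinary sum 1/1² + 1/2² + …, while at s = -1 it returns -1/12 even though the corresponding sum 1 + 2 + 3 + … has no value.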