Peano arithmetic is not just addition and multiplication. It explicitly contains recursion, and thus it’s not at all surprising that it’s Turing complete.
In Peano arithmetic there are statements and there are proofs. The explicit recursion (well, induction) is only on the level of proofs. You can write a statement that uses only addition, multiplication, zero, one and quantifiers and that expresses a Turing-complete notion — without relying on any explicit notion of recursion. The proof that the statement really expresses this notion will use induction, but the statement itself is already surprisingly Turing complete.
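For concreteness (my own sketch, not part of the original comment): by the MRDP theorem, every computably enumerable set is the solution set of some Diophantine equation, so a purely existential formula over (+, *, 0, 1) already encodes halting. The search half of that equivalence is a one-liner; the polynomial `p` below is a stand-in for whatever polynomial MRDP guarantees.

```python
from itertools import count, product

def diophantine_semidecide(p, a, k):
    """Search for natural-number witnesses x1..xk with p(a, x1, ..., xk) == 0.

    Halts exactly when such a witness exists; p stands in for the
    (hypothetical) polynomial that MRDP guarantees for a given c.e. set.
    """
    for bound in count():
        for xs in product(range(bound + 1), repeat=k):
            if p(a, *xs) == 0:
                return xs

# Toy instance: membership of every a, witnessed by x = a^2,
# i.e. p(a, x) = x - a*a.
print(diophantine_semidecide(lambda a, x: x - a * a, 3, 1))  # prints (9,)
```

The polynomial does all the work; the formula ∃x1…∃xk. p(a, x1, …, xk) = 0 mentions no recursion at all.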
What you say is true, but I’m not sure it’s relevant. It doesn’t mean anything to say “Peano arithmetic is Turing complete” without the system of axioms that goes along with it. I might as well say “the natural numbers are Turing complete” because natural numbers biject with Turing machines.
Well, none of the examples of surprising Turing completeness in the list are about proofs avoiding induction; they are about a language with few primitives being able to express a Turing-complete notion. You could say «addition and multiplication in the standard model of natural numbers» (implicitly allowing more powerful proofs) instead of «Peano arithmetic» — would that make the Turing completeness any more or less interesting? The construction would be the same, and the proof would probably be a bit simpler.
The MMU description also has recursion in its definition (to read a page, first read the pages holding its page table) — does that make its Turing completeness completely unsurprising?
Maybe I’m just far too familiar with Peano arithmetic to find it surprising. What is more surprising to me is that Presburger arithmetic (which is addition plus induction) is not Turing complete.
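(An illustration, mine rather than the commenter’s: with addition alone you can still define, say, divisibility by any fixed constant,

```latex
\mathrm{even}(x) \;:\equiv\; \exists y.\; y + y = x
```

but without multiplication of two variables, no definable set grows fast enough to encode computation histories, which is why Presburger arithmetic stays decidable.)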
I find it more interesting that exactly the same signature (+,*,0,1,=) is Turing-complete over N or Z, but not over R or C.
Of course it is completely natural in both cases: Presburger arithmetic doesn’t have anything that grows fast enough, and standard ways to express arbitrary computation require some discreteness.