1. 19
  1.  

  2. 22

    It’s not just this field, and it’s not just Turing. It’s just something that people have a tendency to do, especially when someone has a whiff of tragic genius about them. Turing, Lovelace, Galois, Noether, and Tesla knew everything and did everything, and if they didn’t, well, they could have!

    It’s just another aspect of pop-science-ization, and while it’s annoying, the flip-side is: at least people give a shit at all.

    1. 7

      Honestly, I’ve always felt that Schmidhuber oversells himself and his research rather egregiously.

      1. 3

        It seems like this guy really, really wants computing to have been founded by German-speaking scientists.

      2. 7

        To me, this article actually lays out all the significant contributions he’s made and deserves credit for. The rest is just the typical “shoulders of giants”.

        1. 3

          See also: Stephen Hawking. Made important contributions to theoretical cosmology, but I can’t imagine any physicist putting him in the top 10 theoretical physicists of the last half-century.

          1. 2
            1. 1

              This is a fantastic presentation! I’ve been reading the first part over the weekend. Would you post it as a separate story? If you do, please link from here so I don’t forget to upvote. (And if for some reason you don’t want to, I’ll post it.)

                1. 1

                  Great!

            2. 2

              Sure, but consider: without Turing you wouldn’t have a website on which to publish this. Turing’s model was the first actually implementable description of computing. There was a clear path from the formal description of a “Turing machine” to an actual implementation. Now, as for Zuse not being considered, well, there was the whole thing of “The War”. His work wasn’t very well known outside of Germany, and “Much of his early work was financed by his family and commerce, but after 1939 he was given resources by the Nazi German government”.

              So the fact is that the paper containing the description of what a “Turing Machine” is forms the actual basis for modern computing in the Western world. I think he can be credited with that – which by itself is a stunning achievement.

              1. 3

                That’s really not true. The TM formalism had very little to do with the design of real machines, and it definitely didn’t make them possible. No one was saying “I want to crunch some numbers, but I need to prove that my machine is capable of computing every computable function before I start”; they just said “I have a bunch of arithmetical operations I want to do, how can I automate that a little bit?” The seeds for that were planted almost 100 years before Turing was born; the technology caught up with the ideas in the 1930s–40s, and the approaches that paid off had nothing really to do with symbol-rewriting on infinite one-dimensional tapes.

                Given the existence of Hollerith machines and telephone switches, and the development of vacuum tubes and transistors, we would have ended up with working computers with or without Turing. The landscape of machine design and language design would be a bit different, but computers gonna compute.
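
                For concreteness, here is a minimal sketch of the formalism being argued about above – a Turing machine as symbol-rewriting on a one-dimensional tape – in Python. The machine, the rule table, and all names are illustrative only, not anything from the thread or from Turing’s paper.

                ```python
                # A toy Turing machine: the whole formalism is a rule table that says,
                # for (current state, symbol under the head), what symbol to write,
                # which way to move, and which state to enter next.
                def run_tm(rules, tape, state="start", accept="halt", blank="_", max_steps=10_000):
                    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
                    head = 0
                    for _ in range(max_steps):
                        if state == accept:
                            break
                        symbol = cells.get(head, blank)
                        new_symbol, move, state = rules[(state, symbol)]
                        cells[head] = new_symbol
                        head += move                # move is -1 (left) or +1 (right)
                    lo, hi = min(cells), max(cells)
                    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

                # Illustrative machine: append one '1' to a unary number.
                increment = {
                    ("start", "1"): ("1", +1, "start"),  # scan right over the 1s
                    ("start", "_"): ("1", +1, "halt"),   # write a 1 on the first blank, halt
                }

                print(run_tm(increment, "111"))  # -> 1111
                ```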

                1. 1

                  Well, I guess Minsky and co. are wrong when they say it “directly led” to the results?

                  “The TM formalism had very little to do with the design of real machines”

                  Citation needed.

                  “it definitely didn’t make them possible.”

                  Of course there are many paths to achieving something, as can be seen from the aforementioned machine built by Zuse. But what was merely possible doesn’t matter for credit – there are hundreds of thousands (or billions, if you prefer) of paths to any goal, so that is utterly irrelevant here. The actual invention of the Turing Machine is pretty widely credited, within and outside the field, with sparking off both the development of and the interest in computers within the Allied Forces.

                  Everyone in science stands on the shoulders of giants, which is all the claim in TFA boils down to. Yes, all those people should also be remembered, sure. However, the zeitgeist prefers to remember individual figures, because this is an individualist society and because otherwise it gets rather unwieldy for anyone to actually remember them all.

                  The fact that the seeds were planted hundreds of years before Turing is utterly immaterial; you can say that about any technology. The seeds of fire-starting were planted hundreds of thousands of years before humanity existed, but that doesn’t mean humanity shouldn’t be credited with learning to start fires. No invention is ever the work of a single person; it is always piecemeal.

                  1. 1

                    “Citation needed.”

                    Besides looking at the design of any successful ’40s or ’50s machine and noting the similarities and differences with TMs on the one hand and with tabulating machines and the Analytical Engine on the other? “Turing’s theory of computable functions antedated but has not much influenced the extensive actual construction of digital computers. These two aspects of theory and practice have been developed almost entirely independently of each other.” — Hao Wang, 1954.

                    Turing had an immense impact on computer science, but computer science has a rather looser connection with computer engineering and computer programming. They inform one another, but humanity is pretty damn good at blundering its way through engineering problems even when the theory is missing.

              2. [Comment removed by author]