1. 8
    1. 26

      An absurd premise for an article: it jumps back to the beginning of computing history, rockets to the present, and lands on an unsubstantiated claim. First there’s a false dichotomy, then a reductionist view of TDD and of types in general. The article’s most interesting part is the history lesson, which the author himself admits is (highly) abbreviated and apocryphal.

      1. 27

        Yeah, I’d prefer if Robert Martin’s content didn’t appear on Lobsters. I’ve never learned anything useful from it; it’s frequently shallow or wrong, and almost always inflammatory, with a heap of TDD zealotry thrown in.

        1. 8

          My biggest issue with Uncle Bob is that he seems to be stuck in a business environment divorced from what most front-line developers (especially in startups) are dealing with today, and his standards are quite conservative.

          I think his practices do result in reliable software, but I think he vastly overvalues the importance of software quality.

          1. 25

            I don’t think he overvalues the importance of software quality so much as he just has a completely one-dimensional view of how software can be developed and how quality is achieved (TDD! TDD! TDD!). He doesn’t seem interested in any genuine exploration of software quality: the different ways it could be managed, the implications it has for software design and construction, and so on. His thing is proselytising.

            1. 8

              I don’t think he overvalues the importance of software quality so much as he just has a completely one-dimensional view of how software can be developed and how quality is achieved (TDD! TDD! TDD!).

              To a man with a solution looking for a problem, every problem seems to fit. (With apologies to Maslow.)

              His thing is proselytising.

              Indeed! (I learnt a new word.)

          2. 10

            I think his practices do result in reliable software, but I think he vastly overvalues the importance of software quality.

            It’s possible to write quality software in Python or Ruby. In fact, Clojure as a community is very quality-focused although the language is dynamically typed. The problem is that most software managers won’t allow the time that it takes to write good software.

            A truly statically typed language (Java is a weird hybrid) makes it harder to cut corners. Give Haskell to your average open-plan Scrum drone (whose productivity is often negative in the long term) and he won’t be able to get anything to compile and he’ll get angry and complain about it to his boss.

            This is why purists love static typing. Yes, most of us are good engineers who can write good code in any language. We just like languages that won’t let our managers cut corners. Of course, this is also why most employers don’t like strong typing.

            1. 4

              We just like languages that won’t let our managers cut corners

              There is no programming language that can overcome team limitations. Some things can be minimized, others emphasized. But for the foreseeable future it will remain much easier for a project to go bad than to go well.

              Teams that select languages for their effectiveness are usually effective in other aspects of teamwork too. The team deserves as much or more credit than the language it selected.

        2. 5

          I can’t believe I’m defending this article, but it’s way more technical than the usual fawning over flavor-of-the-month JS stuff that crops up from time to time.

      2. [Comment removed by author]

        1. 5

          I’m not making the compiler god happy, I’m wiring up the compiler to do domain specific checking on my behalf.

          I don’t think this can be emphasized enough. The compiler and language are just tools. You can use a hammer to drive nails into wood, or you can use it to hit your thumb over and over again and cry about how terrible hammers are.

          Given a reasonably expressive statically typed language, when code fails to compile because it doesn’t typecheck, a lot of people are far too quick to treat the compiler as an adversary. Setting aside the (largely irrelevant) canard that the correct programs expressible in a statically typed language are a strict subset of the correct programs expressible in a unityped language, 99.9% of the time the onus is on the programmer to fix their incorrect modeling or erroneous logic, not on the language to put away the whip and ball gag.
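
          For instance, a minimal sketch in Swift (the article’s language; the wrapper type and the validation rule here are hypothetical, not from the article) of what “wiring up the compiler to do domain specific checking” can look like: once a value can only be constructed through a validating initializer, the compiler enforces the domain rule at every call site.

          // Hypothetical wrapper type: the only way to obtain a ValidatedEmail is
          // through the failable initializer, so the validity check can never be
          // skipped or forgotten downstream.
          struct ValidatedEmail {
              let value: String
              init?(_ raw: String) {
                  guard raw.contains("@") else { return nil }   // toy validation rule
                  value = raw
              }
          }

          func send(to address: ValidatedEmail, body: String) {
              print("sending to \(address.value)")              // only validated addresses get here
          }

          // send(to: "not an email", body: "hi")               // rejected at compile time
          if let address = ValidatedEmail("bob@example.com") {
              send(to: address, body: "hi")                     // the only way in
          }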

    2. 19

      And so, Java, and it’s bastard cousin C# became the languages of the internet; and held sway for two decades. But there was a lot going on behind the scenes.

      That’s not…it doesn’t even…goddamnit Uncle Bob.

      EDIT: For folks playing the home game, the answer here would’ve been Perl/PHP/Javascript.

    3. 18

      You see, when a Java programmer gets used to TDD, they start asking themselves a very important question: “Why am I wasting time satisfying the type constraints of Java when my unit tests are already checking everything?”

      That was not my experience. Rather, I often thought: “why am I wasting my time writing tests for this, which could be avoided if the language had better type constraints?”

      And that’s exactly what happened in second half of the first decade of the current millennium. Tons of Java programmers jumped ship and became dedicated dynamic typers, and TDDers.

      This seems unsubstantiated, IMO. Bob implies that all these Java developers “left” Java for dynamic languages in order to become TDD developers, which seems just a teensy bit absurd.

      How will this all end?

      My own prediction is that TDD is the deciding factor.

      IMNSHO that is Uncle Bob’s prediction because Uncle Bob’s income relies on harping on about TDD like a broken record. Often nonsensically—like in this post.

      You don’t need static type checking if you have 100% unit test coverage.

      This is just plain false. Making sure your function has 100% unit test coverage does nothing to help you if someone calls your function with unexpectedly typed data. Say your function expects two integers but is called with a dictionary and a list. Or another function. Sure, you can add type checks to your function, and more unit tests to verify that those type checks catch the unexpected types, but that’s a lot of boilerplate to add everywhere.
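
      For contrast, a minimal Swift sketch (made-up function, not from the post): with a typed signature the dictionary-and-list call cannot even be written, so neither the hand-rolled type checks nor the tests for them need to exist.

      // Hypothetical example: the signature is the check.
      func add(_ a: Int, _ b: Int) -> Int {
          return a + b
      }

      _ = add(2, 3)                    // fine
      // add(["a": 1], [1, 2, 3])      // rejected by the compiler, before any test runs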

      “Oh, but my function is not useful enough to be included in a library used by others, and I do have 100% coverage of my app, so I assert that it is not called with unexpected types.” Right. Does your app get data from anywhere? Input from a form? Environment variables that people can set before running your app? Are you sure that the input to the function does not come from inadequately sanitised data? If you were using Perl you could activate taint mode, but it looks like the equivalent in Python was rejected in 2009. I don’t know if Ruby has anything equivalent. And even if you are using Perl, there are a lot of ways your function can fail that taint mode won’t help you catch.

      1. [Comment removed by author]

        1. 11

          It’s meaningless even in C.

          char buf[8];
          gets(buf);   /* writes past the end of buf for any line longer than 7 chars */
          

          I can “100%” test that with a single case of “hello”, but good god, I’m not going to ship it.

          1. 5

            Indeed, great example. I think this demonstrates why I’m really enjoying learning my ML/Haskell/Idris/Agda stuff. Understanding the domain and codomain (all possible inputs and outputs) of a function, and being able to constrain both so that the type checker deals with the complexity, is exactly the thing I can’t see test-driven development handling easily.

            I associate test-driven development with establishing “for some x, there exists a response y.”

            Versus other approaches, like types, which establish “for all x, there exists …” proofs.
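
            A tiny sketch of that distinction in Swift terms (the article’s language; names made up): a unit test pins down one point of the input space, while a generic signature constrains every call that will ever typecheck.

            // One test: "there exists an input for which this behaves correctly".
            assert(abs(-3) == 3)

            // One signature: "for all element types T and all arrays of T, a value
            // of type T comes back", and the compiler checks it at every call site.
            func headOrDefault<T>(_ xs: [T], _ fallback: T) -> T {
                return xs.first ?? fallback
            }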

        2. 4

          The first question is: coverage of what? Line coverage is next to meaningless; branch coverage is a bit better, but requires more complex tools and more test code, and still cannot prove your software correct. State-space coverage would come closer to making that sentence true, but then you are in the world of formal proofs, which I infer wouldn’t be to Bob’s liking.
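
          A minimal Swift sketch of that gap (made-up function): two test cases give 100% line and branch coverage yet barely scratch the state space.

          // Hypothetical "fully covered" function.
          func safeDivide(_ a: Int, _ b: Int) -> Int {
              if b == 0 { return 0 }
              return a / b
          }

          assert(safeDivide(6, 3) == 2)    // exercises the division branch
          assert(safeDivide(6, 0) == 0)    // exercises the guard: 100% line and branch coverage
          // safeDivide(Int.min, -1)       // still traps at runtime (overflow); coverage said nothing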

          There are of course many more objections one could list to this assertion, but I don’t think it’s even worth it.

          It is funny that Bob calls Swift’s type algebra “opinions” and then drops these unsubstantiated lines as facts. I think he got some concepts reversed.

        3. 1

          My favorite version of this is:

          Tests : Types :: ∃ : ∀

      2. 1

        Ruby does have taint mode; I’ve never seen anyone ever use it.

    4. 14

      Learning swift has been an interesting experience. The language is very opinionated about type safety. Indeed, I don’t remember using a language that was quite so assiduous about types.

      It is weird for the author to state up front that he doesn’t really have any experience in the subject at hand, and then make a bold prediction like “dynamic typing will win”.

    5. 13

      You don’t need static type checking if you have 100% unit test coverage.

      So many years behind him, yet he seems to have learned very little about type systems. For example, a good type system lets a programmer design structures that make illegal states unrepresentable; taking advantage of this capability is better than tests, because the compiler ensures the invariants hold everywhere in the program and that you haven’t missed one in some weird control path of an obscure module.
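
      A small Swift sketch of that idea (hypothetical enum, not from the post): a request cannot be both “loading” and “holding a result”, because no case for that combination exists, and every switch must handle every case that does.

      // Hypothetical state type: the invalid combinations simply cannot be built.
      enum RequestState {
          case notStarted
          case loading
          case succeeded(body: String)
          case failed(message: String)
      }

      func describe(_ state: RequestState) -> String {
          switch state {                                 // the compiler rejects a missing case
          case .notStarted:            return "idle"
          case .loading:               return "spinner"
          case .succeeded(let body):   return "got \(body.count) characters"
          case .failed(let message):   return "error: \(message)"
          }
      }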

    6. 8

      I laughed so hard when I read this line.

      “The language is very opinionated about type safety. Indeed, I don’t remember using a language that was quite so assiduous about types.” This was about optional types.

      Huh! Funny, considering he thought himself enough of an expert to write THIS article two years ago: http://blog.cleancoder.com/uncle-bob/2014/11/24/FPvsOO.html

      How the hell does he think he’s smart enough to know about the benefits of FP with static types when he hasn’t used any variant of ML, the statically typed functional language family that has dominated this space since the ’70s (practically 50 years!!!)? This is such astounding proof to me that Uncle Bob is a hypocrite and a charlatan, and that you should almost never trust anyone who says they have the right answers.

    7. 6

      This post is pure applesauce.

      The language is very opinionated about type safety. Indeed, I don’t remember using a language that was quite so assiduous about types.

      I dislike the use of “opinionated” here. How is having a particular model for types more opinionated than having no model for types at all? Opinions are just views or judgments; a type system is a mathematical model. It’s like saying addition is opinionated because it is only defined for numbers.

      The extreme nature of the type system in swift got me thinking about the “Type Wars” that our industry has been fighting for the last six decades. And that got me to thinking that I should blog about the history of those wars, and then predict the future outcome.

      If Swift is the most typed language Robert Martin has worked in, then I am not convinced he knows enough to have a history of type systems.

      WARNING: This history has many omissions and contains much that is apocryphal, or at least wildly inaccurate.

      That’s an understatement. Why even write this post then? Usually Martin doesn’t have anything of value to say, and this time he actually admits it before the content.

      Apparently many programmers agreed with me because C won that language war; and won it decisively.

      C did win, although there are several more plausible reasons for its victory than everyone sharing Martin’s view of Pascal. My guess is that it won because of UNIX.

      The late ‘80s and early '90s were a kind of “cold-war” between the static type-checking of C++ and the dynamic type checking of Smalltalk. Other languages rose and fell during this time; but those two broad typing categories pretty well defined them.

      Oh? That’s a pretty narrow view. SML and Haskell both had their defining reports published in 1990, and a huge amount of research went into statically typed languages during the ’90s. Not to mention StrongTalk, the statically typed Smalltalk whose VM technology became the basis of Sun’s HotSpot JVM. And let’s not forget Perl, which had a hugely successful ’90s (at least the second half).

      And so, Java, and it’s bastard cousin C# became the languages of the internet; and held sway for two decades. But there was a lot going on behind the scenes.

      What? Java as a front-end language (applets) lasted only a few years as a serious option before being replaced by a dynamically typed language called JavaScript. Java does hold a lot of the backend of the internet, though. But C#? Certainly some places use C# on the backend, but “the language of the internet” is nonsense.

      You see, when a Java programmer gets used to TDD, they start asking themselves a very important question: “Why am I wasting time satisfying the type constraints of Java when my unit tests are already checking everything?” Those programmers begin to realize that they could become much more productive by switching to a dynamically typed language like Ruby or Python.

      In my experience, they ask themselves “Why do I need TDD if I can have the compiler do it for me?” IME, many companies are moving towards Go or Java and away from Ruby or Python. This paragraph is really just Martin trying to sell snake oil, or else evidence of how out of touch with reality he is.

      Tons of Java programmers jumped ship and became dedicated dynamic typers, and TDDers. That ship-jumping continues to this day, spurred on by the fact that salaries tend to be higher for programmers of the dynamic languages.

      This needs to be substantiated; based on what I’ve seen at every company I’ve worked at, it is fantasy.

      Oh, and I should tell you about one special unit of Smalltalk programmers who stayed at IBM planned their revenge against Sun. They executed that revenge by creating … Eclipse.

      I’m not sure whether this is meant as satire. Eclipse did what, exactly, to Sun? Made a popular, free IDE for Java, helping adoption? Or was Eclipse revenge on me, the hapless programmer, for using Java?

      And so here we are. The pendulum is quickly swinging towards dynamic typing. Programmers are leaving the statically typed languages like C++, Java, and C# in favor of the dynamically typed languages like Ruby and Python.

      Perhaps if this blog post had been written 10 years ago this would have been true. But even then I’d be surprised if people using C# were leaving it for either of those options.

      My own prediction is that TDD is the deciding factor. You don’t need static type checking if you have 100% unit test coverage. And, as we have repeatedly seen, unit test coverage close to 100% can, and is, being achieved. What’s more, the benefits of that achievement are enormous.

      And I don’t need 100% test coverage if I have static type checking. Besides, 100% test coverage doesn’t tell you as much as you’d like to think. What about external inputs? As @tedu pointed out, gets will work for some inputs and not for others, yet the test suite will still report 100% coverage.

      Therefore, I predict, that as TDD becomes ever more accepted as a necessary professional discipline, dynamic languages will become the preferred languages. The Smalltalkers will, eventually, win.

      A Pyrrhic victory at best, if the prediction does hold up.

      1. 5

        You see, when a Java programmer gets used to TDD, they start asking themselves a very important question: “Why am I wasting time satisfying the type constraints of Java when my unit tests are already checking everything?” Those programmers begin to realize that they could become much more productive by switching to a dynamically typed language like Ruby or Python.

        In my experience, they ask themselves “Why do I need TDD if I can have the compiler do it for me?” IME, many companies are moving towards Go or Java and away from Ruby or Python. This paragraph is really just Martin trying to sell snake oil, or else evidence of how out of touch with reality he is.

        Exactly; “why do I need to test for null again, and again, and again, even if it makes no sense, when I could instead just add a question mark to my type and throw away hundreds of unnecessary tests?”
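
        A tiny Swift sketch of that question mark (hypothetical lookup, not from the thread): absence lives in the type, so the compiler forces every caller to handle it and the null-checking tests evaporate.

        // Hypothetical lookup: the optional return type replaces a pile of null tests.
        func lookupUser(id: Int, in table: [Int: String]) -> String? {
            return table[id]
        }

        if let name = lookupUser(id: 42, in: [1: "ada"]) {
            print("found \(name)")
        } else {
            print("no such user")        // the compiler won't let this case be forgotten
        }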

    8. 1

      Apologies in advance for the meta-question: given how playful and refreshingly simple the language is, I would expect Smalltalk advocates to be cheerful, happy people. But my experience online has been quite the opposite: lots of grandiose promises and barely concealed hostility. I wonder why that is?

      1. 5

        For the record, Robert Martin has never been a Smalltalk programmer. He may have tried it once or twice.

        1. 1

          Good to know, thanks. I’ve seen his articles pop up every so often but I don’t really know anything about him professionally, I admit.