1. 22
  1.  

  2. 18

    Correctness is not why I prefer static types; I find code with static types much easier to read.

    1. 11

      I’m going through an episode of this right now. I’ve been hacking on a popular package manager written in Ruby. I love the app, its ecosystem, and its governance and consider it to be one of the most successful software projects written in Ruby and one of the most successful open source projects for its target platform. You’ve probably guessed what it is by now.

      Ruby wasn’t my first language (PHP) or my second (Java) or my third (Smalltalk). However, Ruby was what I used outside of work for everything for about five years between when I stopped doing PHP for work and started doing… XSL, HTML, and JavaScript. While I don’t know Rails very well because my tasks with it were rarely web-oriented, I do know Ruby pretty well.

      Or at least did.

      You see, I got into Scala at the end of 2013 when I transferred departments temporarily and then eventually permanently. To my Ruby mind, Scala was the marriage of Java and Ruby with all of that Scheme goodness I’d enjoyed in college after learning Smalltalk.

      So, for the last 4+ years, my brain has been Scala. Sure, I’ve used Ruby here and there, but the tracing skills atrophied as my needs for debugging Ruby dried up. I’ve gotten really used to debugging with IntelliJ, using code completions, not needing to keep a documentation browser tab open all of the time, etc.

      These days, just tracing down what’s coming from where in Ruby is… a slog.

      I feel like a lot of the Ruby code I encounter works on assumptions rather than assurances, and that makes me uncomfortable now. These days, I’m doing Scala and Rust at work. I fear that encoding those assurances in Ruby would tremendously slow down the interpreted code and greatly reduce its simplicity by introducing far more repeated incantations, the kind of complexity that compiled languages handle at compile time.

      1. 7

        So weird that you’re saying this, because in another post ( http://www.drmaciver.com/2016/07/it-might-be-worth-learning-an-ml-family-language/ ), MacIver says:

        I’ve been noticing something recently when teaching Python programmers to use Hypothesis that has made me reconsider somewhat.

        That skill is this: Keeping track of what the type of the value in a variable is.

        In particular this seems to come up when the types are related but distinct.

        … this is by far the second most common basic usage error people make with it….

        … it’s definitely a common pattern. It also appears to be almost entirely absent from people who have used Haskell (and presumably other ML-family languages – any statically typed language with type-inference and a bias towards functional programming really)

        … in an ML family language, where the types are static but inferred, you are constantly playing a learning game with the compiler as your teacher (literally a typing tutor): Whenever you get this wrong, the compiler tells you immediately that you’ve done so and localises it to the place where you’ve made the mistake.

        This is probably a game worth playing.
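
        A tiny sketch of that “typing tutor” effect, with Scala standing in for the ML-family case (the names here are purely illustrative): two related-but-distinct wrapper types, and the compiler flags the mix-up at the exact call site.

        ```scala
        object TypingTutorSketch {
          // Hypothetical related-but-distinct types: both wrap a Double,
          // but the compiler treats them as incompatible.
          final case class Meters(value: Double)
          final case class Feet(value: Double)

          def altitudeWarning(altitude: Meters): Boolean =
            altitude.value < 100.0

          val reported = Feet(250.0)

          // altitudeWarning(reported)
          // ^ does not compile: type mismatch; found: Feet, required: Meters.
          //   The error is localised to this call, which is the “compiler as
          //   typing tutor” experience described in the quote.
          val ok = altitudeWarning(Meters(reported.value * 0.3048)) // explicit conversion compiles
        }
        ```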

        1. 3

          To me, this is the game played when not using an IDE with completions! To each their own – some folks prefer to memorize methods or keep documentation open – but completions allow me to work faster and not have to keep as much in my head from session to session.

      2. 9

        Yes, types actually communicate a great deal about the intentions and the expected inputs and outputs of functions. If a function returns a Weight, that is a lot different from one returning a Velocity, and it might cause you to question the name of the function “GetVelocity”.
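
        A minimal Scala sketch of that point (Weight, Velocity, and getVelocity are just the illustrative names from this comment, not any real API): the return type alone is enough to make you question the name.

        ```scala
        object ReturnTypeSketch {
          final case class Weight(kilograms: Double)
          final case class Velocity(metersPerSecond: Double)

          // The signature says this returns a Weight, so the name "getVelocity"
          // is immediately suspicious to both the reader and the compiler.
          def getVelocity(massKg: Double): Weight =
            Weight(massKg)

          // A caller expecting a Velocity finds out at compile time, not in production:
          // val v: Velocity = getVelocity(70.0)   // does not compile: found Weight, required Velocity
        }
        ```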

        1. 7

          Which is also an important part of correctness in the maintenance phase: the most expensive phase for both defects and changes.

        2. 8

          Not even the static typing advocates really believe they can write correct code with less than 100% coverage, do they?

          I certainly do. Depending on the nature of the unit tests you would otherwise write, it’s going to require a more aggressive type system than you’re used to, but this is totally within reach.
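
          A sketch of what “more aggressive” can mean, in Scala with a hypothetical NonEmpty wrapper (not a library type): the invariant “this list is never empty” lives in the type, so the unit test for the empty case never needs writing.

          ```scala
          object NonEmptySketch {
            // Hypothetical type: the only way to build one is with at least one
            // element, so "head of empty list" is unrepresentable rather than tested for.
            final case class NonEmpty[A](head: A, tail: List[A]) {
              def toList: List[A] = head :: tail
            }

            def average(xs: NonEmpty[Double]): Double = {
              val all = xs.toList
              all.sum / all.size // division by zero is impossible here
            }

            // There is no NonEmpty value representing an empty input, so the test
            // "average of an empty list raises a sensible error" has nothing to check.
            val ok = average(NonEmpty(1.0, List(2.0, 3.0)))
          }
          ```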

          it’s mostly the static typing people who are picking the fights

          If we consider what might motivate people to do this, the only obvious explanation is that static typing is, in fact, a lot better for a lot of things, and people who use static typing are acutely aware of this while people who use dynamic typing are only thinking “what’s the big deal?”.

          (This is also a matter of degree; if you primarily use, say, Java, you’re probably not going to appreciate static types all that much.)

          The author is using a cheap & dirty rhetorical technique, which is to make a bunch of dumb arguments that might trick some people, and then write off any criticism as “haha, just trolling!” when it doesn’t work. You can’t productively argue with someone if they’re just going to squirm their way out of any scrutiny.

          1. 2

            If we consider what might motivate people to do this, the only obvious explanation is that static typing is, in fact, a lot better for a lot of things, and people who use static typing are acutely aware of this while people who use dynamic typing are only thinking “what’s the big deal?”.

            The other obvious explanation is that static typing doesn’t help very much and people who use static typing pick fights because they’re insecure. ;)

            That’s why we can’t rely on “obvious explanations” when exploring causes: the “obvious” explanation is almost always whatever confirms your own biases. That’s why we need rigorous, empirical evidence to back up our claims. That’s one of the reasons static vs. dynamic fights always go around in circles: we don’t actually have any good empirical evidence one way or the other. It’s just people shouting “obvious” explanations at each other.

            The author is using a cheap & dirty rhetorical technique, which is to make a bunch of dumb arguments that might trick some people, and then write off any criticism as “haha, just trolling!” when it doesn’t work. You can’t productively argue with someone if they’re just going to squirm their way out of any scrutiny.

            That’s sorta the point in the article. These are cheap and dirty rhetorical techniques, but they’re the exact same techniques we use to argue whatever we do believe in. The only obvious (hah) way out is for us to do Empirical Software Engineering, which means actually objectively studying whatever it is we want to know.

            Fun story: when MacIver posted this on Twitter somebody told him “You clearly don’t understand how powerful types are, you need to check out this thing called ‘QuickCheck’…”

            1. 3

              they’re the exact same techniques we use to argue whatever we do believe in

              Where are static typing proponents saying “just trolling, bro”?

            2. 2

              if you primarily use, say, Java, you’re probably not going to appreciate static types all that much

              I think the biggest issue I see when discussing type systems is exactly this. So many people think of Java when they think of static types – but Java (outside primitives) isn’t even really statically typed; it just has some static helpers for its dynamic tag system. On top of that, Java, C, C++, etc. have such weak type systems that they give the whole idea a bad name due to the popularity of their poor implementations.
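
              For contrast, a small Scala sketch (illustrative names only) of the kind of expressiveness being gestured at here: a sealed hierarchy lets the compiler check that a match is exhaustive, instead of leaving the gap to an instanceof/cast chain or a default branch.

              ```scala
              object SealedSketch {
                // A closed set of cases the compiler knows about.
                sealed trait PaymentMethod
                final case class Card(number: String) extends PaymentMethod
                final case class BankTransfer(iban: String) extends PaymentMethod
                case object Cash extends PaymentMethod

                def fee(p: PaymentMethod): BigDecimal = p match {
                  case Card(_)         => BigDecimal("0.30")
                  case BankTransfer(_) => BigDecimal("0.00")
                  case Cash            => BigDecimal("0.00")
                  // Removing any case above produces a "match may not be exhaustive"
                  // warning; an instanceof-and-cast style would only fail at runtime.
                }
              }
              ```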

            3. 7

              That fast feedback that the statically typed language gave you about your code, the dynamically typed language will give you about your process.

              I’m not so sure about this one. Sure, the article admits it’s trolling a bit (and maybe this should have a rant tag), but this stands out to me as a “rly?” kind of statement.

              Whether I’m using stronger type systems (C++ now) or weaker ones (Python, C previously), the process is basically the same: write code and tests, submit for review, get feedback, adjust, then deploy (which can mean various things) when there is consensus. For the most part, I’d say the software development process I’ve experienced has been independent of the programming language used.

              1. 1

                I felt the article was rhetorical, which is why I didn’t add the rant tag (although I nearly did…).

                Using rhetorical questions can, I believe, be valuable in self-reflecting on the work you are doing, which is what I took away from the article.

                1. 2

                  Understood. It’s one of those cases where I wouldn’t take the tag away if it was there, nor would I add it if it wasn’t.

              2. 4

                Sure, they believe the arguments they put forth, but I wouldn’t say they present sufficient evidence for me to do the same. After all, I still write tests in statically typed software, but I don’t write tests to check that my types are sound, which I have to do all the time in JavaScript. Those comprise a good amount of my tests in JavaScript, because I do need to be sure that if my function gets an undefined it doesn’t silently propagate that undefined throughout my program. Testing is a good practice, but in practice it’s genuinely hard to follow consistently, let alone to reach the kind of comprehensive coverage we get from static analysis.
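
                For the typed side of that comparison, here is a Scala sketch (findEmail is a hypothetical lookup, not a real API): absence is an Option, so the “does undefined silently propagate?” tests have no analogue; the compiler refuses to let you forget the missing case.

                ```scala
                object OptionSketch {
                  // Hypothetical lookup that may find nothing; absence is part of the type.
                  def findEmail(userId: Int): Option[String] =
                    if (userId == 1) Some("a@example.com") else None

                  // The compiler will not let us treat Option[String] as a String,
                  // so there is no test needed for "what if this is undefined?".
                  def emailDomain(userId: Int): Option[String] =
                    findEmail(userId).map(_.split('@').last)

                  // findEmail(2).split('@')   // does not compile: split is not a member of Option[String]
                  val missing = emailDomain(2) // None; whoever consumes it must handle the absence explicitly
                }
                ```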

                Test-first fundamentalism is like abstinence-only sex ed: An unrealistic, ineffective morality campaign for self-loathing and shaming.

                — David Heinemeier Hansson (DHH), “TDD is dead. Long live testing”