1. 3
  1.  

  2. 6

    I really wish we had a ‘-1, terrible, terrible troll’ voting button.

    I found the rhetoric in this article incredibly annoying. I understand the point the author was trying to make, but seriously, I could barely get through it. The twisting of words to come up with some sort of inflammatory statement really grinds my gears.

    1. 3

      I came away thinking the same. There’s really no reason to inject such vitriol into an article about programming. It’s a good example of what makes parts of programmer culture unpalatable to many, to the detriment of all.

      1. 3

        This argument is about as old (and about as useful) as vim vs emacs, and the people who engage most aggressively typically understand the actual pros and cons of the other side very poorly.

        1. 1

          I guess these days I just read past the troll comments in articles, and was hoping more to spark some interesting conversation. My bad.

          1. 2

            It happens! I myself had a submission here voted to -1. We’re all figuring out what the hell this place is, still. :)

          2. 1

            Bluster aside, I think the point about typed vs classified is worth some thought. It’s obvious to anyone who’s written a Ruby or Python C extension that every object is represented as a single type, the PyObject or Ruby VALUE or whatever it’s called (there’s a small sketch of this at the end of this comment).

            Of course, the author is an ML junkie who thinks type inference is the solution to all the “I don’t want to type so much” objections to static typing. I found it frustrating because, until you learn to structure your program the way the type inferencer expects, it will just keep beating you with the “can’t do that” stick.

            For the record, I feel static typing is much safer and leads to better programs, but still prefer Lua just because it’s easy. Like vegetables, I know it’s good for me, but I’m not going to eat it unless I have to.
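
            To make that earlier point concrete, here is a CPython-specific sketch (assuming the default, non-debug build, where `id()` returns the object’s address; the `PyObjectHeader` overlay is just an illustrative name): every object, whatever its class, begins with the same C header (a reference count and a pointer to its type), and the class is just whatever that pointer refers to.

            ```python
            import ctypes

            # The shared header that every CPython object starts with (the C struct PyObject):
            # a reference count plus a pointer to the object's type.
            class PyObjectHeader(ctypes.Structure):
                _fields_ = [
                    ("ob_refcnt", ctypes.c_ssize_t),
                    ("ob_type", ctypes.c_void_p),
                ]

            for obj in (42, "hello", [1, 2, 3], object()):
                # In CPython, id(obj) is the object's memory address, so we can overlay the header.
                header = PyObjectHeader.from_address(id(obj))
                # The header's type pointer is the address of the class object itself.
                print(type(obj).__name__, header.ob_type == id(type(obj)))  # True every time
            ```

            A C extension sees exactly this: every argument arrives as the same PyObject pointer, and the class only matters once the code asks for it.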

          3. 4

            Tipsy commenting is probably not a good idea in general, but I feel that although this is in many ways a dismissal of dynamic languages, it reaffirms what we already know about how dynamic languages differ from static ones, and codifies what makes some people uncomfortable with languages that don’t have some kind of type system.

            What we know is that dynamic languages are very permissive in some respects, but also restrictive in others. In static languages, the type system is usually a tool for good. I know I have used it as a crutch when wrestling with a particularly thorny abstraction, as a substitute for ye olde physics “units trick”, where you can generally figure out which equations to use based on what units your answer has to come out in. Dynamic languages give you no such guarantees, so you have to keep track yourself of what class you expect something to be at each point, and make sure nothing can go wrong in a way that ends up invoking a method that doesn’t exist at run time.

            Of course, if you really cared, you could write pre- and post-conditions for all of your methods (there’s a sketch of this at the end of this comment), and with 100% test coverage this effectively lets you precompute a type system. But that’s a huge amount of overhead for something statically typed languages give you for free, and there’s still no way to signal to your interpreter or compiler that you’ve ensured your program is type-safe, so you still pay the run-time overhead the author mentions.

            However, in short write-once, run-once scripts, or in very small programs where you can hold the entire thing in your head, it can be nice to have just one flexible type that can do anything, effortlessly.
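
            To ground the pre-/post-condition idea above, here is a minimal Python sketch (the `expects` decorator is hypothetical, not something from the article): it re-checks at run time the kind of facts a static type checker would have verified for free before the program ever ran.

            ```python
            from functools import wraps

            def expects(*arg_types, returns=None):
                """Hypothetical pre-/post-condition decorator: runtime class checks."""
                def decorate(fn):
                    @wraps(fn)
                    def wrapper(*args):
                        # Precondition: each positional argument is an instance of the declared class.
                        for a, t in zip(args, arg_types):
                            assert isinstance(a, t), (
                                f"{fn.__name__}: expected {t.__name__}, got {type(a).__name__}"
                            )
                        result = fn(*args)
                        # Postcondition: the return value is an instance of the declared class.
                        if returns is not None:
                            assert isinstance(result, returns), f"{fn.__name__}: bad return class"
                        return result
                    return wrapper
                return decorate

            @expects(str, int, returns=str)
            def repeat(s, n):
                return s * n

            print(repeat("ab", 3))  # ababab
            repeat(3, "ab")         # AssertionError at run time; a static language would
                                    # have rejected this call before running anything
            ```

            Even with 100% test coverage exercising every one of these checks, the interpreter still performs them on every call, which is the run-time overhead mentioned above.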