1. 20

  2. 3

    It’s easy to rant about programming languages, but this time the speaker digs deeper to explain the design decisions (which are typically simple and make sense from another perspective) that cause the silly edge cases.

    1. 1

      I agree. In one way or another, each language evolves some weird trait. After all, even humans have a vestigial tail in their backs, and genetic code that would allow gills to grow.

      That said, a language’s weirdness being documented in its specification does not change the fact that the language behaves surprisingly. I cannot name many languages that are free of weird situations; some, however, are definitely more prone to them than others.

      1. 3

        Maybe more effort should be spent to build languages that don’t have weird situations going on.

        1. 2

          It’s not like language designers want their languages to be weird. It’s just that once a language is larger than Brainfuck, you get conflicting requirements and have to choose trade-offs.

          e.g. if you don’t have implicit conversions, users complain they can’t “just” do “simple” things. If you add limited conversions, users complain it’s inconsistent and stops working in arbitrary situations. If you make it fully generic, you get “wat” edge cases.
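
          Java’s `+` operator is a small illustration of this trade-off: the limited implicit conversion between numbers and strings is convenient, but it makes the result depend on evaluation order (a minimal sketch; the class name is mine):

          ```java
          // Implicit int-to-String conversion in Java's + operator:
          // convenient, but the result depends on left-to-right evaluation.
          public class ConversionWat {
              public static void main(String[] args) {
                  System.out.println(1 + 2 + "3");  // int addition first, then concat: "33"
                  System.out.println("1" + 2 + 3);  // concat all the way: "123"
              }
          }
          ```

          Neither behavior is wrong in isolation; the “wat” only appears when the two expressions are read side by side.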

          1. 1

            Like Forth or Modula-3, perhaps.

            I like(d) Modula-3. One of its explicit goals was a fifty-page specification. They didn’t quite manage that; I think the final text was almost 52 pages. Still, 52 pages is impressively simple for a safe object-oriented language a bit like Java (whose spec is ten times as long), and I didn’t notice any wats at all. But the language is almost unused… can a connection be drawn between those two dots?

            I think yes. The designers of a language have to focus on something, and the only way to get adoption is to focus consciously on things they believe will bring adoption.

            For example: the type of null in Java is a bit of a wat, but you can work with Java for years and never think about it, so the wat won’t hurt adoption. Contrast that with Java’s VM, which allowed people to rant and rave about portability, and drove fanboyism and adoption. The designers could have spent time thinking about a cleaner type system; instead they spent that time thinking about something that drove adoption.
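
            The null-type wat mentioned above can be made concrete: the JLS gives the null literal its own special type, which is a subtype of every reference type. That shows up in overload resolution, even though a null reference is an instance of nothing (a minimal sketch; the class and method names are mine):

            ```java
            // The null literal's special type is assignable to every reference
            // type, so overload resolution picks the most specific overload.
            public class NullTypeWat {
                static String f(Object o) { return "Object"; }
                static String f(String s) { return "String"; }

                public static void main(String[] args) {
                    System.out.println(f(null));             // "String" (most specific wins)
                    String s = null;
                    System.out.println(s instanceof String); // false: null is an instance of nothing
                }
            }
            ```

            As the comment says: you can pass null around for years without ever noticing it has a type of its own.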