1. 25
  1.  

  2. 24

    10 selling points, no downside, that’s fishy. No mention of the lack of static typing, for example. No mention of Kotlin or Scala. That sounds like someone who’s a bit too enthusiastic to acknowledge the shortcomings that any tool has.

    1. 11

      The lack of static typing as a downside is, and will always be, subjective. As Rich Hickey famously says, “What’s true of every bug? 1) it passed your unit tests 2) it passed the type checker.”

      Spec, as a structural testing mechanism, is, I believe, generally understood to be a preferable alternative to static typing, not an afterthought bolted on to appease devs who demand static typing.

      1. 10

        I don’t understand what the quote is trying to say. “Every bug” is still fewer bugs if type bugs are ruled out by a type checker.

        1. 3

          My reading is that static typing isn’t a silver bullet. That’s an oversimplification, of course; see the video below for a bit more context surrounding that particular quote, and maybe some rationale behind Clojure’s approach to typing and the role that Spec plays.

          https://www.infoq.com/presentations/Simple-Made-Easy/

        2. 5

          It’s amusing to me that people accept this as a critique of static typing in general rather than a critique of certain hilariously-bad type systems like that of C or Java.

          I am trying to think of the last time I encountered a bug in a piece of software written in a language with a not-terrible static type system, but … I can’t think of one.

          1. 4

            Awesome. I’ll take your word for it. What is a language with a not-terrible type system? Haskell?

            1. 4

              Right.

              I mean obviously part of the reason I haven’t run into many bugs is that there just aren’t as many programs written using good type systems. Programs written in Haskell or OCaml or Elm still have bugs, but they only have certain classes of bugs. Just because you can’t get rid of every bug doesn’t mean it’s pointless to get rid of the ones you can, and Rich’s punchy quote seems to imply that this line of reasoning is invalid.

              1. 2

                I see what you’re saying. And I agree that, on the surface, that quote seems like it’s dumping on static typing entirely, but I don’t think that’s the case. In the talk in which Hickey drops that quote he expands on it a bit, to a degree that my simply slapping it into a response doesn’t do justice. You’re the Leiningen guy, you know Clojure, you’re presumably familiar with the talk.

                My takeaway was that static typing, like unit tests, catches bugs. Certain classes of bugs (as you mention above). However, in some complex and complicated systems, those classes of bugs aren’t the primary concern. I’ve never worked in that kind of system, but I like what I’ve seen of clojure and I haven’t found this particular line of reasoning disagreeable.

                1. 4

                  You’re the Leiningen guy, you know Clojure, you’re presumably familiar with the talk.

                  Yep, I’ve seen the talk. It feels to me like he already decided he doesn’t like static typing because of bad experiences with Java and uses that experience to make straw-man arguments against all static type systems in general. I can imagine the existence of a system for which “bugs that no type system can catch” are the primary concern, but I’ve never encountered such a system myself. (I have worked with plenty of systems where the cost of the type system is higher than the benefit of squashing those bugs, but that’s a very different argument than Rich’s.)

                  1. 3

                    Yep, I’ve seen the talk. It feels to me like he already decided he doesn’t like static typing because of bad experiences with Java and uses that experience to make straw-man arguments against all static type systems in general

                    There seems to be at least some evidence that he knows Haskell pretty well, he just doesn’t really publicize it.

                    I think it’d be really funny if he keynoted ICFP with a talk on algebraic effects and never mentioned the talk ever again. It’ll never happen, but a guy can dream.

                    1. 3

                      I’m pretty skeptical that anyone who “knows Haskell pretty well” would produce the “Maybe Not” talk. More generally, my experience is that anyone willing to invest the time and energy into learning Haskell tends to be more tuned-in to the actual tradeoffs one makes when using the language, and it’s clear that his emotionally-framed talking points overlap very little with what actual Haskell programmers care or think about. Of course, it could be the case that he does know the language pretty well and simply talks about it like a frustrated novice to frame emotional talking points to appeal to his Clojure true-believers, but this seems far-fetched.

                      1. 3

                        IMO the “Maybe Not” talk gets more flak than it deserves. Function subtyping is a valid data representation, and Maybe Int -> Int can be represented as a subtype of Int -> Maybe Int. Haskell chooses not to allow that representation, and it is a way in which the type system is arguably incomplete (in the “excludes valid programs” sense).

                        1. 4

                          You’ll have to work hard to convince me that Rich Hickey was arguing on the level of critiquing the “function sub-typing” capabilities of Haskell’s type system vs. the more prosaic “static typing bad!” bullshit he falls back on again and again. Stated straightforwardly, his argument is basically “Maybe is bad because it means you will break the expectations of the caller when you need to introduce it to account for optionality.” And, I suppose it is in fact terrible when you don’t have a type-checker and depend on convention and discipline instead. So, Rich Hickey 1, static typing 0, I guess?

                          As to “Maybe Not” getting more flak than it deserves…yeah, we’ll have to agree to disagree. (And I’ll note here I’m really surprised that you in particular are taking this position considering how often I see you on here deeply engaging with such a broad variety of academic computer science topics, without needing to use strawmen or appeal to emotion to argue a position.)

                          For example, “Maybe/Either are not type system’s or/union type.” Okay. How do you even argue with that? I don’t even really understand what he’s trying to assert. Does he not believe the Curry-Howard correspondence is valid? For that matter, which type system? Will I get lambasted by the apologists for not understanding his super subtle point, yet again? Meh.

                          Someone who was honestly and deeply engaged with the ideas he spends so much time critiquing wouldn’t be babbling nonsense like “you would use logic to do that, you wouldn’t need some icky category language to talk about return types” or “type system gook getting into spec”…god forbid!

                          I’ll give him this though: Rich Hickey is doing a great job of convincing the people who already agree with him that static typing is bad.

                          (Edited to fix some links and a quote)

                          1. 3

                            As to “Maybe Not” getting more flak than it deserves…yeah, we’ll have to agree to disagree. (And I’ll note here I’m really surprised that you in particular are taking this position considering how often I see you on here deeply engaging with such a broad variety of academic computer science topics, without needing to use strawmen or appeal to emotion to argue a position.)

                            Thank you! I should note that just because I deeply engage with a lot of different CS topics doesn’t mean I won’t flail around like an idiot occasionally, or even often.

                            For example, “Maybe/Either are not type system’s or/union type.” Okay. How do you even argue with that? I don’t even really understand what he’s trying to assert. Does he not believe the Curry-Howard correspondence is valid? For that matter, which type system? Will I get lambasted by the apologists for not understanding his super subtle point, yet again? Meh.

                            Let me see if I can explain this in a less condescending way than the talk. Let’s take the type Maybe Int, which I’ll write here as Maybe(ℤ). From a set theory perspective, this type is the set {Just x | x ∈ ℤ} ∪ {Nothing}. There is an isomorphism from the Maybe type to the union type ℤ ∪ {Nothing}. Let’s call this latter type Opt(ℤ). Opt(ℤ) is a union type in a way that Maybe(ℤ) is not, because we have ℤ ⊆ Opt(ℤ) but not ℤ ⊆ Maybe(ℤ): 3 ∈ ℤ, but 3 ∉ Maybe(ℤ). Again, we have Just 3 ∈ Maybe(ℤ), and an isomorphism that maps Just 3 ↦ 3, so in theory this isn’t a problem.

                            The problem is that Haskell’s type system makes design choices that make that isomorphism not a valid substitution. In fact, I don’t think Haskell even has a way to represent Opt(ℤ), only types isomorphic to it. Which means that we can’t automatically translate between “functions that use Opt(ℤ)” and “functions that use Maybe(ℤ)”. Take the functions

                            foo :: Maybe Int -> Int
                            foo Nothing = 0
                            foo (Just x) = x*x
                            
                            -- I don't think this is possible in Haskell, just bear with me
                            bar :: Opt Int -> Int
                            bar Nothing = 0
                            bar x = x * x
                            

                            Is map foo [1..10] type-safe? Not in Haskell, because map foo has type [Maybe Int] -> [Int] and [1..10] has type [Int]. Is map bar [1..10] type-safe? In a type system that supported “proper” union types, arguably yes! ℤ ⊆ Opt(ℤ), so map bar is defined for all [Int]. So maybe types emulate useful aspects of union types but, in the Haskell type system, don’t have all the functionality you could encode in union types.

                            Now there are two common objections to this:

                            1. Haskell has very good reasons for doing things this way. This is 100% true. But it’s true because Haskell has a lot of goals for this type system, and being able to encode this particular union type requires proper function subtyping, which would absolutely be a nightmare to combine with everything else in Haskell. It nonetheless shows that Haskell is only exploring one part of the possible space of type systems, and there are valid things that it chooses not to represent. “Proper” union types are one of these things.
                            2. You can easily write a shim function to make map foo type safe. This is usually people’s objection to this talk. And most of the time you can do this (see the sketch below). But this is just a form of emulation, not a reproduction of the core idea. It’s similar to how in OOP you can “emulate” higher-order functions with the strategy pattern. But it’s not a perfect replacement. For any given emulation I can probably construct an example where your emulation breaks down and you have to try something slightly different. Maybe it doesn’t work if I’m trying to compose a bunch of fmaps.
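
                            For concreteness, here’s a minimal sketch of the kind of shim that objection has in mind, repeating foo so the snippet stands alone; fooShim is a hypothetical name, and the explicit Just is the injection a union-typed system wouldn’t need:

                            foo :: Maybe Int -> Int
                            foo Nothing  = 0
                            foo (Just x) = x * x

                            -- The shim: lift a plain Int into foo's Maybe-shaped domain by hand.
                            fooShim :: Int -> Int
                            fooShim = foo . Just

                            -- map fooShim [1..10] typechecks; map foo [1..10] does not.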

                            This is why I think the talk is underrated. There are a lot of genuinely interesting ideas here, and I get the impression Rich Hickey has thought a lot of this stuff through, but I think it’s hampered by him presenting these ideas to a general Clojure audience and not a bunch of type-theory nerds.

                            1. 1

                              I don’t follow: map foo [1..10] wouldn’t even typecheck; it’s not even wrong to say it’s not typesafe (edit: and apologies if that’s what you meant, I don’t mean to beat you over the head with correct terminology, I just honestly didn’t get it). And while it’s well and fine that there’s an isomorphism between {Just x | x ∈ ℤ} and ℤ, it’s not clear to me what that buys you. You still have to check your values to ensure that you don’t have Nothing (or in the case of Clojure, nil), but in Haskell, because I have algebraic data types, I can build abstractions on top to eliminate boilerplate. Your Opt example doesn’t present as better or highlight the power of this isomorphism. Why do I even care that I can’t represent this isomorphism in Haskell? I’m afraid your post hasn’t clarified anything for me.

                              As far as the talk, I think that if he had some really interesting ideas to share, he’d be able to explain them to type theory nerds in the same talk he gives to his “core constituency.”

                              At this point, I have trouble taking seriously the output of someone who, no matter how thoughtful they may be, has made it clear that they are hostile to certain ideas without justifying that hostility. There is plenty of criticism to be made of Haskell and type theory without taking the stance he has taken, which is fundamentally “type theory and type systems and academic PLT are not worth your time to even consider, outside of this narrow range of topics.” If he were a random crank that’d be fine, but I think that because of his position, his presentations do real harm to Clojure programmers and anyone else who hears what he’s saying without having familiarity with the topics he dismisses, because it shuts them off from a world of ideas that has real utility even if it’s very much distinct from the approach he’s taken. It also poisons the well for those of us in the community who have spent serious time in these other worlds and who value them, which is a tremendous shame, because Rich Hickey does have a lot of great ideas and ways of presenting them intuitively. So regardless of how eloquently you may be able to translate from Rich Hickey-ese, what I object to fundamentally is his intellectual attitude, more so than his ideas, many of which I agree with.

                              Thank you! I should note that just because I deeply engage with a lot of different CS topics doesn’t mean I won’t flail around like an idiot occasionally, or even often.

                              Well understood from personal experience. ;-)

                              1. 2

                                I don’t follow: map foo [1..10] wouldn’t even typecheck; it’s not even wrong to say it’s not typesafe (edit: and apologies if that’s what you meant, I don’t mean to beat you over the head with correct terminology, I just honestly didn’t get it).

                                That’s what I meant, it wouldn’t typecheck. Brain fart on my part.

                                And while it’s well and fine that there’s an isomorphism between {Just x | x ∈ ℤ} and ℤ, it’s not clear to me what that buys you. You still have to check your values to ensure that you don’t have Nothing (or in the case of Clojure, nil) … Your Opt example doesn’t present as better or highlight the power of this isomorphism.

                                Let’s try a different tack. So far we have

                                foo :: Maybe Int -> Int
                                bar :: Opt Int -> Int
                                

                                Now I give you three black-box functions:

                                aleph :: Int -> Maybe Int
                                beis :: Int -> Opt Int
                                gimmel :: Int -> Int
                                

                                foo . aleph typechecks, as does bar . beis. foo . gimmel doesn’t typecheck. I think we can agree on all three of those. Here’s the question: what about bar . gimmel? In Haskell that wouldn’t typecheck. However, we know that Int ⊆ Opt Int: gimmel’s codomain is a subset of bar’s domain. So bar must be defined for every possible output of gimmel, meaning that bar . gimmel cannot cause a type error.

                                This means that because Haskell cannot represent this isomorphism, there exist functions that mathematically compose with each other but cannot be composed in Haskell.

                                Why do I even care that I can’t represent this isomorphism in Haskell?

                                Mostly this is a special case of function subtyping, which I don’t think Haskell supports at all? So if function subtyping makes your problem domain more elegant, it’d require workarounds here.
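
                                For what it’s worth, here is roughly what that composition has to look like in actual Haskell, with Maybe standing in for the unrepresentable Opt; the definitions below are stand-ins so the snippet compiles, and the explicit Just is exactly the kind of workaround meant above:

                                -- Stand-in definitions (gimmel's body is arbitrary; in the
                                -- discussion it's a black box of type Int -> Int).
                                foo :: Maybe Int -> Int
                                foo Nothing  = 0
                                foo (Just x) = x * x

                                gimmel :: Int -> Int
                                gimmel = (+ 1)

                                -- foo . gimmel does not typecheck; the injection is written by hand:
                                composed :: Int -> Int
                                composed = foo . Just . gimmel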

                                1. 1

                                  To be clear I understood your point initially about function subtyping being a thing that maybe Haskell can’t represent, and I apologize for making you come up with multiple creative examples to try to illustrate the idea (but I appreciate it)!

                                  So if function subtyping makes your problem domain more elegant, it’d require workarounds here.

                                  What remains unclear for me–if we’re treating this as a proxy for Rich Hickey’s argument–is how this demonstrates the practical insufficiency of Maybe, which is his main pitch. I am happy to acknowledge that there are all kinds of limitations to Haskell’s type system, this is no surprise. What I don’t yet understand is why this is a problem wrt Maybe!

                                  In any case, thank you for the thoughtful responses.

          2. 3

            Static typing, in my uninformed opinion, is less about catching bugs than it is about enforcing invariants that, as your application grows, become the bones that keep some sort of “tensegrity” in your program.

            Without static typing your application is way more likely to collapse into a big ball of mud as it grows larger and none of the original engineers are working on it anymore. From this perspective I suppose the contract approach is largely similar in its anti-implosion effect.

            However, type theory offers a whole new approach to not only programming but also mathematics and I think there is a lot of benefit we still haven’t seen from developing this perspective further (something like datafun could be an interesting protobuf-esque “overlay language” for example).

            On the other hand, dynamically typed programming (I think) peaked with Lisp, and Clojure is a great example of that. A lightweight syntax that is good for small exploratory things has a lot of value and will always be useful for on-the-fly configuration. That doesn’t change that the underlying platform should probably be statically typed.

          3. 6

            Static typing is mentioned in section 8 in regards to the spec library, and Scala is named in the Epilogue as a mature alternative.

            1. 3

              “In this article we have listed a number of features that positively separates Clojure from the rest.” Well, it seems like the author thinks they address Scala as well as Java in the article, even though Scala is only named in the previous sentence.

              Spec is no alternative to static typing, as far as I know. Isn’t it just runtime checks, and possibly a test helper? Scala and Kotlin both have decent (or even advanced) type systems. I think some of the points are also advantages over Kotlin and Scala (the REPL and simplicity/stability, respectively), but the choice is not as black and white as depicted in the OP.

              1. 2

                Spec isn’t static, but it can provide much of the same functionality as a type system like Scala’s; it just does so at runtime rather than at compile time. In addition, it can be used to implement design-by-contract and dependent types, and it can also generate samples of the specified data for use in testing. It’s not the same, but it is an alternative.

            2. 6

              Yeah #8 and #9 (and arguably #7) really are just downsides that are framed as if they were upsides. “There’s no static typing, but here’s what you can do instead!” and “The startup time is really slow; here are some ways to work around that problem!”

              1. 2

                I read #9 as ‘by default, clojure is compiled, like java. But here’s a way to get instant(-ish) startup time, which is impossible with java.’

                1. 2

                  Being compiled has nothing to do with how fast something starts; planck is compiled as well.

                  1. 1

                    clojure is compiled, like java

                    That is, clojure’s startup time characteristics are similar to java’s.

                    1. 2

                      uh … no?

                      Clojure code can’t run without loading the Clojure runtime, which is implemented as an enormous pile of Java bytecode. A “hello world” in Java only has to load a single bytecode file, whereas a comparable Clojure program will have to load almost all of clojure.jar before it can even begin to do anything.

              2. 4

                That sounds like someone who’s a bit too enthusiastic to acknowledge the shortcomings that any tool has.

                Immediately after reading the article, I agreed with this comment. After re-reading it more critically, I think the issue isn’t that the author is too enthusiastic so much as that comparison isn’t the point of the piece. Outside of the brief jab in the second-to-last sentence (“that positively separates Clojure from the rest”), to me this doesn’t read as a comparison between all the potential successors, just something aimed at getting people interested in trying Clojure.

                As someone who hasn’t written Clojure before, but really enjoys Lisp-based languages, I found it to be a helpful overview of the language. The fact that there are no negatives listed doesn’t deter me from believing the positives any more than if I was looking for a new computer and the specifications didn’t list out other companies that make a better product. It just makes me want to carry out his last sentence and see for myself:

                … the best way to learn Clojure is to actually use it.

                1. 6

                  As a fan of static typing, I would not advertise Java’s implementation of it as a feature. More of an anti-feature. It doesn’t track whether or not a field or variable pointing at a reference type can be null or not. It doesn’t support algebraic data types.

                  For contrast, I would advertise the static type systems in Kotlin and Scala as features.

                  1. 2

                    There is little difference between Java’s type system and Kotlin’s type system.

                    1. 4

                      Kotlin has literally the exact two features I just mentioned. Java does not.

                      1. 1

                        Yes, and those two features make little difference in practice, so it’s weird to say “Java bad, Kotlin good”.

                        1. 2

                          Explicit nullability tracking is not a small deal. Never having to accidentally cause a NullPointerException again is liberating.

                          ADTs mean you can implement everything in the “make wrong states unrepresentable” paper.
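
                          To make that concrete, here’s a tiny, hypothetical sketch of the idea, written in Haskell only because that’s the language already used for code elsewhere in this thread (Kotlin’s sealed classes express the same thing):

                          -- Instead of a pair like (isLoggedIn :: Bool, userId :: Maybe Int),
                          -- where (True, Nothing) is a nonsense state, encode only the valid shapes:
                          data Session
                            = Anonymous
                            | LoggedIn Int  -- the user id

                          -- Any code handling a Session must deal with both cases, and the
                          -- "logged in but no user id" state simply cannot be constructed.
                          describe :: Session -> String
                          describe Anonymous    = "guest"
                          describe (LoggedIn n) = "user #" ++ show n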

                          1. 1

                            That’s a bit like saying “look how crazy fast this 486 is .. compared to a 386!”.

                      2. 2

                        Doesn’t Kotlin have algebraic data types and pattern matching?

                        1. 1

                          Yes, barely, and no.

                  2. 6

                    As an amateur Common Lisper and professional Clojurian, it bothers me when people learn Clojure, get excited about “a Lisp”, and then start touting a bunch of features that have everything to do with Clojure being functional and nothing to do with it being a Lisp*. Pure functions and immutable data structures are neat, but they’re just as neat in Haskell or Elixir or OCaml or whatever—and arguably FP in Clojure would be even more powerful with better typing.

                    It seems like CL’s focus on macros (and DSLs), multi-paradigm programming, an incredibly sophisticated object system (CLOS / the MOP), sophisticated error handling (conditions and restarts) and so forth has dropped out of current discussion. I realize half of that is CL-specific, but at the very least I wish there were more recognition of 1) why Lisps use S-expressions and 2) the fact that Clojure is relatively unique among Lisps for using immutable data structures and defaulting to laziness.

                    * The author did mention reader conditionals, but the specific use case is supported by other languages too

                    1. 1

                      I am new to Clojure, and I can echo your sentiments. Lots of stuff about the functional part of Clojure, but not too much of the Lisp side. Do you have any links that go into the Lisp side of things?

                      1. 3

                        Standard literature is:

                        Practical Common Lisp for a general overview of CL

                        On Lisp for advanced macro hackery

                        Object-Oriented Programming in Common Lisp: A Programmer’s Guide to CLOS. You may also like diving down the C2 Wiki rabbithole.

                        The Art of the Metaobject Protocol (C2 Wiki)

                        1. 1

                          Thank you! I’ve also done some fennel-lang programming, and they seem pretty much the same besides the ecosystem.