1. 31
  1. 17

    (Edit: removed about 90% of my original post b/c it was just whiny. Trying to leave the more focused comments.)

    I don’t personally feel like the problem with Haskell is the lack of documentation, even as high-quality book-length material. I tend to learn programming languages – and I’ve become at least somewhat proficient with at least 8-10 over my career, including other FP and FP-friendly ones – by reading code.

    Reading Haskell code makes me feel dumb. The use of custom operators, language extensions, and ever-changing algebras and abstractions – remember zippers? monad transformers? oh sorry, it’s free monads and optics all the way now – means I can’t jump in and have any clue what’s actually happening in most real-world Haskell code.

    It’s probably just that I’m a little too slow and a lot too impatient, but: after ~20 years of coming back to Haskell every year or two to see if I can finally get up to speed, I’m more or less resigned to just never reaching that particular summit.

    Perhaps one of the books on the OP’s list would fix that. I dunno.

    1. 14

      Professional Haskeller here:

      With respect to the comment about ever-changing abstractions, there are always going to be shiny new things in blog posts. Most real applications don’t use these things and stick to the basics. “Simple Haskell” has mostly taken root in the haskell-for-business world. The basics have absolutely stuck around and stood the test of time, with a few of those shiny new things making it through slowly.

      1. 11

        It would be awesome to see that subset of Haskell documented. It would be subjective so there would be different takes, and that’s ok. “Here’s a subset of haskell we’ve chosen for business app development and why we like it.”

        1. 5

          It is documented in books like ‘Haskell in Depth’ and ‘Production Haskell’. These books have chapters diving deep into solving actual problems.

        2. 3

          Interesting, I also think basic Haskell is what we should stick to, and it’s also not such a complicated language. Basically, it’s just lambda calculus with lazy evaluation and algebraic data types.

          It’s a bit like the case of Lisp: the core language is simple, but there are tons of abstractions because the core constructs make them easy to build, and that has given Lisp a reputation for being hard to learn and maintain.
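
          For instance, here’s a tiny sketch of the kind of “basic” code I have in mind (my own toy example): plain algebraic data types, pattern matching, and laziness, with no language extensions at all.

          ```haskell
          -- A toy example of the "basic" subset: one data type,
          -- pattern matching, and lazy evaluation. No extensions.
          module Main where

          data Shape
            = Circle Double
            | Rect Double Double

          area :: Shape -> Double
          area (Circle r) = pi * r * r
          area (Rect w h) = w * h

          -- Laziness: 'naturals' is infinite, but 'take' only forces
          -- as much of it as is needed.
          naturals :: [Integer]
          naturals = [0 ..]

          main :: IO ()
          main = do
            print (map area [Circle 1, Rect 2 3])
            print (take 5 naturals)
          ```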

          May I ask what Haskell subset and what language extensions you use at work?

          1. 5

            It’s a lot to summarize. We have a whole style guide.

            I’ll maybe summarize with some libraries to give you a feel:

            • io-streams for streaming, unagi-chan/unagi-streams for queues
            • postgresql-simple and sqlite-simple for db
            • use good ol’ transformers, no mtl
            • maybe the one fancy thing is servant for web stuff, but servant gives you so much bang for your buck without headache that the complexity is worth it.
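
            To give a concrete feel for the servant point, here’s a minimal toy sketch (not our actual code; the routes and handler names are made up), assuming the servant-server and warp packages:

            ```haskell
            {-# LANGUAGE DataKinds #-}
            {-# LANGUAGE TypeOperators #-}

            -- A hypothetical two-route API: the type describes the routes and
            -- the handlers are ordinary functions checked against that type.
            module Main where

            import Data.Proxy (Proxy (..))
            import Network.Wai.Handler.Warp (run)
            import Servant

            type UserAPI =
                   "users" :> Get '[JSON] [String]
              :<|> "users" :> Capture "id" Int :> Get '[JSON] String

            server :: Server UserAPI
            server = allUsers :<|> oneUser
              where
                allUsers  = pure ["alice", "bob"]        -- GET /users
                oneUser i = pure ("user #" ++ show i)    -- GET /users/:id

            main :: IO ()
            main = run 8080 (serve (Proxy :: Proxy UserAPI) server)
            ```

            The payoff is that the routes, parameter types, and content types all live in one type, so a handler that doesn’t match the API is a compile error.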

            I’d be happy to answer any other specific questions.

            1. 1

              Thanks, this is really informative.

      2. 7

        I especially love the second half. ‘Haskell books that don’t exist but should’ is a nifty format to discuss an ecosystem; the little blurbs the author wrote do a good job of making you want those books; and you can even discover some existing gems, because every imagined book has some links to real books that cover the topic, just not in Haskell.

        1. 11

          Writing these books is fundamentally incompatible with Haskell’s approach of constantly making breaking changes to everything. Books take time to write, on the time scale of 1-2 years. The reality of the Haskell world is that the contents of the resulting book would be so woefully out of date as to be useless.

          I say this sitting on most of a book on modern ML in Haskell.

          It takes a lot of time to explain something coherently, make examples, describe the logic behind how a system is designed, etc. How can you possibly do that if everything constantly changes? I can either write materials to explain things in Haskell, where everything will be out of date within a year, or I can explain things in Python, where years later I don’t need to revise the materials.

          Take ‘Haskell in Depth’, published in 2021. People literally cannot run the code months later: https://github.com/bravit/hid-examples/issues/10 That’s absurd. Writing niche books isn’t particularly lucrative anyway; having to constantly rewrite the same book is borderline madness.

          The best example of how impossible it is to write such materials is Haskell itself. There is no document explaining the Haskell type system and how to perform inference in it. At best, there are some outdated papers going back many years that each sort of describe pieces of the system.

          1. 7

            I’ve been writing Haskell for 15+ years and have not yet encountered a serious breaking change.

            1. 5

              “Serious breaking changes” aren’t the biggest issue at all. Small, constant breaking changes to the compiler, to the language, to core libraries, and to the entire ecosystem in general are the issue. Seriously, that book is barely out and the code doesn’t work anymore.

              Every change that breaks the API of a package, a command-line flag, etc. has to be tracked down across hundreds of pages of material. In hundreds of examples. Constantly. It’s hard enough to keep up with changes to code that compiles. If I did this for a book on ML it would be my full-time job.

              1. 6

                Seriously, that book is barely out and the code doesn’t work anymore.

                Do you mean doesn’t work on the latest GHC with bleeding edge packages from Cabal? Maybe I’m insulated from this a bit by sticking to the version of GHC in Debian stable and avoiding new versions of packages when I don’t need them.

                1. 2

                  It’s the opposite problem. The published code doesn’t build using the versions it states.

                  The repo’s stack.yaml uses resolver: lts-17.4, and the linked issue’s solution is to upgrade to lts-18.17 and bump dependency versions.

                  1. 1

                    I’m not very familiar with how stack works. Does this mean that stack allowed breaking changes inside an existing release, or what exactly is the actual issue?

              2. 0

                this is false

              3. 6

                People literally cannot run the code months later

                This seems unlikely, as both stack and cabal are fully reproducible[1]. One wouldn’t expect a correct build description to stop working because new versions of packages and the compiler are released. Perhaps there’s just a bug in the book?

                [1] Ok, not Nix level reproducible!

                1. 4

                  There is no document explaining the Haskell type system and how to perform inference in it.

                  The Haskell Report and the GHC User’s Guide should fully describe the behaviour of the type system. If you mean there’s no single document explaining the implementation of the type system then you may be right, but is there such a document for similar compilers such as OCaml, Scala, F#? Perhaps Rust has one because it is much newer, but I’m not sure.

                  1. 3

                    The Haskell Report describes a 20 year old version of the language. The GHC user guide is a vague overview; it’s completely impossible to implement anything from the details it provides.

                    It’s not a matter of describing the current implementation. There is no document that describes the type system as it exists today at all. As in, there is no mathematical description of what Haskell is and how you are supposed to perform type inference. The best that exists is a 10+ year old paper (the OutsideIn paper), but that’s incredibly outdated: you couldn’t type check common Haskell code used today with it, and it’s not even clear it ever corresponded to any version of Haskell precisely.

                    There are good reasons why this is. It takes a lot of time to write such documents. But if the language developers can’t keep up with the language themselves, it’s hard to imagine that others will be able to do so.

                    For what it’s worth, OCaml is very clearly described mathematically in plenty of places, even in textbooks. My understanding about the situation in Scala is not just that the type system is described accurately, it’s actually been machine checked. I’m least familiar with the situation in F#, but it’s in the OCaml family. There probably aren’t any surprises there.

                    1. 3

                      The Haskell Report describes a 20 year old version of the language.

                      A 12-year-old version, let’s not over-age ourselves :)

                      There has been no new version of Haskell since 2010. Some consider that an issue, and it may well be, but until there is a new one the fact that it is old does not make it wrong.

                      1. 1

                        It’s more like 24 years actually.

                        Haskell 2010 is not Haskell as it was in 2010. It’s a minor update to Haskell 98, because even then no one could keep up with the changes.

                        It’s all in the first few paragraphs of the report where they describe this problem. Then they say this is an incremental and conservative update that only adds a few minor fixes and features to 98.

                        1. 1

                          Haskell 2010 defines what Haskell is; it’s not a description of some mystical “Haskell” that exists somewhere else and that it incompletely captures. It’s a definition.

                          Standards being conservative is good. Can you imagine if every crazy language extension in GHC became part of Haskell? Some of them are even in competition or contradictory!

                          1. 2

                            You haven’t read the report. You should. It literally starts by saying it doesn’t define what Haskell is in 2010.

                            1. 1

                              I linked the Haskell 2010 Report in a sibling thread. I don’t see where it says it doesn’t define what Haskell is in 2010. Could you please point it out?

                              1. 5

                                I linked the Haskell 2010 Report in a sibling thread. I don’t see where it says it doesn’t define what Haskell is in 2010. Could you please point it out?

                                Page xiv says the language has grown so much, and the effort to document it is now so high, that this is going to be a small incremental update with more to come. More did not come; the documentation burden was so high that everyone gave up. And the update was indeed very small, covering only three major changes: FFI, pattern guards, and hierarchical module names. I pasted the contents below.

                                For reference, GHC 7 came out in 2010.

                                Even at the time, in 2009, people were wondering what was up, because bread-and-butter parts of the language weren’t going to be included in Haskell 2010. For example, no GADTs, no associated types, no rank-n types, etc. Here is someone in 2009 asking about this and getting the response that, no, this doesn’t reflect the language, but it’s the best anyone can do because keeping up with the language is so hard: https://mail.haskell.org/pipermail/haskell-prime/2009-July/002817.html The main barrier to entry was whether anyone could describe an extension faithfully, and no one could.
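
                                To illustrate, here is a small toy example of mine (not from the report) of features that are routine in GHC code today but absent from Haskell 2010: a GADT, an associated type family, and a rank-2 type.

                                ```haskell
                                {-# LANGUAGE GADTs #-}
                                {-# LANGUAGE RankNTypes #-}
                                {-# LANGUAGE TypeFamilies #-}

                                -- Toy example of features outside Haskell 2010.
                                module Beyond2010 where

                                -- GADT: each constructor refines the result type index.
                                data Expr a where
                                  IntE  :: Int  -> Expr Int
                                  BoolE :: Bool -> Expr Bool
                                  IfE   :: Expr Bool -> Expr a -> Expr a -> Expr a

                                eval :: Expr a -> a
                                eval (IntE n)    = n
                                eval (BoolE b)   = b
                                eval (IfE c t e) = if eval c then eval t else eval e

                                -- Associated type family inside a class.
                                class Container f where
                                  type Elem f
                                  empty  :: f
                                  insert :: Elem f -> f -> f

                                instance Container [a] where
                                  type Elem [a] = a
                                  empty  = []
                                  insert = (:)

                                -- Rank-2 type: the argument itself must be polymorphic.
                                applyBoth :: (forall x. x -> x) -> (Int, Bool) -> (Int, Bool)
                                applyBoth f (n, b) = (f n, f b)
                                ```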

                                Sadly, most of the archives of the haskell-prime effort seem to have been lost.

                                In any case. This is not a criticism of the Haskell2010 authors. They did their best. But, it’s important that the community realizes that the sorry state of the documentation, the lack of high quality materials like books, and the lack of in-depth ecosystems for areas like ML, are all a consequence of this decision to keep making breaking changes to the language, core libraries, and ecosystem as a whole.

                                At the 2005 Haskell Workshop, the consensus was that so many extensions to the official language were widely used (and supported by multiple implementations), that it was worthwhile to define another iteration of the language standard, essentially to codify (and legitimise) the status quo.

                                The Haskell Prime effort was thus conceived as a relatively conservative extension of Haskell 98, taking on board new features only where they were well understood and widely agreed upon. It too was intended to be a “stable” language, yet reflecting the considerable progress in research on language design in recent years.

                                After several years exploring the design space, it was decided that a single monolithic revision of the language was too large a task, and the best way to make progress was to evolve the language in small incremental steps, each revision integrating only a small number of well-understood extensions and changes. Haskell 2010 is the first revision to be created in this way, and new revisions are expected once per year.

                                1. 3

                                  I see. So, reflecting on what you wrote, the lack of a published Haskell standard doesn’t seem to be the problem you are experiencing, just a symptom. After all, Python doesn’t have a standard, yet you state that it would be a fine target for writing reference materials.

                                  I can think of only one change to the language that has caused me frustration: simplified subsumption. There have been a few frustrating library-level changes too.
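
                                  For anyone following along, here’s a rough toy sketch of the kind of code simplified subsumption affects; the details are from memory, so treat them as approximate.

                                  ```haskell
                                  {-# LANGUAGE RankNTypes #-}

                                  -- Toy sketch of a program affected by simplified
                                  -- subsumption (GHC 9.0); details approximate.
                                  module Subsumption where

                                  f :: forall a b. a -> b -> b
                                  f _ y = y

                                  g :: (forall p. p -> forall q. q -> q) -> Int
                                  g _ = 0

                                  -- Before GHC 9.0 (deep subsumption), 'g f' typechecked
                                  -- directly. With simplified subsumption it is rejected,
                                  -- and the call has to be eta-expanded by hand:
                                  h :: Int
                                  h = g (\x -> f x)
                                  ```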

                                  Could you elaborate on which changes in the Haskell ecosystem have led to concrete problems for you? Firstly, I may be able to offer advice or a fix. Secondly, I may actually be able to tackle the root cause and make future experience better for you and others.

                      2. 2

                        I am highly sceptical that there is a mathematical description of OCaml or Scala that matches how those languages are used in production today. I could easily be wrong, because it’s not my area of expertise, but I know for sure that those languages move, albeit more slowly than Haskell, and I doubt that any published material can keep up with how those compilers develop in practice. Someone was telling me recently that he is adding algebraic effects to the OCaml compiler! It seems unlikely that is in the textbooks you mentioned.

                        I would be grateful for any material you can link me to that elaborates on the difference between Haskell and the other languages in this regard.

                        That said, this is getting somewhat removed from your original comment, because if you stick to the well-trodden parts of the type system (Haskell 2010 certainly, and what is becoming GHC2021 most likely), then none of the breakage to the book you are writing will be to do with the type system per se.

                        1. 1

                          For what it’s worth, OCaml is very clearly described mathematically in plenty of places, even in textbooks. My understanding about the situation in Scala is not just that the type system is described accurately, it’s actually been machine checked. I’m least familiar with the situation in F#, but it’s in the OCaml family. There probably aren’t any surprises there.

                          Are you confusing OCaml with Standard ML?

                          Scala 3’s core language (the DOT calculus) has been specified and verified in Coq, but not the surface language as far as I’m aware. I’m also not sure if the implementation is based on elaboration into the core language (this is generally the approach that dependently typed languages use to improve confidence in the implementation).

                    2. 5

                      Wow, this is great. I got through reading the post and thought it was done, only to realize it was just getting started. Impressive effort to elucidate these “voids”.