So what happened next?
Next, GHC2021 happened: https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/exts/control.html#extension-GHC2021

GHC2021 is a basket of language extensions which are useful and less controversial. New compilers could possibly target GHC2021 instead of falling prey to the author’s concern:

As long as Haskell is defined implicitly by its implementation in GHC, no other implementation has a chance — they will always be forever playing catch-up.
Again, whether it be Haskell98 or Haskell2010 or GHC2021, new compilers don’t have to implement every research extension explored by the GHC folks. I think the concern is overplayed.
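For illustration, here is a minimal sketch of what targeting an edition looks like in practice. (Assumption: GHC2021 is only recognised as a language pragma from GHC 9.2 onwards; earlier compilers would need the individual extensions spelled out.)

```haskell
{-# LANGUAGE GHC2021 #-}
-- With the GHC2021 edition in effect, its bundled extensions
-- (ScopedTypeVariables, TypeApplications, and friends) need no
-- separate pragmas, and the module's meaning is pinned to the
-- edition rather than to whatever the newest GHC defaults to.
module Main where

main :: IO ()
main = print (show @Int 42) -- TypeApplications, courtesy of GHC2021
```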
Lacking a written standard, whatever GHC does has become the de facto definition of Haskell.
What that means depends on who you ask. Some people are happy with this situation. It allows them to make changes to the language that move it forward without a standard to get in the way.
Others are very unhappy. Haskell has been making constant breaking changes lately.
In my opinion the situation is disastrous, and after using Haskell for over a decade, as of last year I no longer build any production system in Haskell. Keeping a system going is just a constant war against churn that wastes an incredible amount of time. Any time you might have saved by using Haskell, you end up giving back through the constant wastage.
What’s worse is that the changes being made are minor improvements that don’t address any of the issues people actually have with the language.
It’s not just the core libraries that have this total disregard for the pain they inflict on users with large code bases. This message that breaking everything is a good idea has proliferated throughout the community. Libraries break APIs with a kind of wild abandon that I don’t see in any other language. The API of the compiler changes constantly to the point where tooling is 2+ years out of date. The Haskell Language Server still doesn’t have full 9.0 support 2 years later!
Haskell is a great language, but it’s being driven into the ground by a core of contributors who just don’t care about the experience of a lot of users.
HLS has supported 9.0 since July 2021, it recently gained support for 9.2.1 as well.
Keeping a system going is just a constant war against churn that wastes an incredible amount of time.

Are we really talking about the same language? I’m working full time on a >60k line Haskell codebase with dependencies on countless packages from Hackage, and none of the compiler version upgrades took me longer than half a day so far.
Now, don’t get me wrong, the churn is real. Library authors particularly get hit the worst and I hope the situation gets better for them ASAP, but I really think the impression you’re giving is exaggerated for an application developer hopping from stackage lts to stackage lts.
HLS had partial support for 9.0. And even that took more than half a year. The tactics plugin wasn’t supported until a few months ago. And stylish-haskell support still doesn’t exist for 9.0.
Support for 9.2 is flaky at best (https://github.com/haskell/haskell-language-server/issues/2179). Even the eval plugin didn’t work until 2 weeks ago!
I’m working full time on a >60k line Haskell codebase with dependencies on countless packages from Hackage, and none of the compiler version upgrades took me longer than half a day so far.

60k is small. That’s one package. I have an entire ecosystem for ML/AI/robotics/neuroscience that’s 10x bigger. Upgrades and breakage are extremely expensive; they take days to resolve.
In Python, JS, or C++ I have no trouble maintaining large code bases. In Haskell, it’s an unmitigated disaster.
Saying you have “no trouble maintaining larger codebases” with Python or JS seems a bit suspicious to me…

I’m personally also a bit in the “things shouldn’t break every time” camp (half a day’s work for every compiler release seems like a lot!), but there are a lot of difficulties with Python and JS in particular, because API changes can go completely unnoticed without proper testing. Though this is perhaps much less of an issue if you aren’t a heavy dependency user.
py-redis released 2.0, the Pipfile had a “*” version, pipenv install auto-upgraded py-redis, and bam: incompatible API. The larger the codebase, the more frequently it happens.
Meanwhile, some C++/SDL code I committed to sf 20 years ago still compiles and runs fine.
I’ve worked on quite large Haskell codebases too, and cannot say that I’ve had any of the experiences you have. I’m sure you have, but it’s not something the community is shouting from the rooftops about as a massive issue like you’re claiming, and it might have much more to do with the libraries you rely on than with GHC itself. This just comes across as FUD to me, and if someone told me Jon Harrop wrote it, I would believe it.
Well, any amount of googling will show you many complaints about this. But I’ll pick the most extreme example. The person who runs stack (one of the two package managers) put his Haskell work into maintenance mode and is moving on to Rust because of the constant churn. https://twitter.com/snoyberg/status/1459118086909476879

“I’ve essentially switched my Haskell work into maintenance mode, since that’s all I can be bothered with now. Each new thing I develop has an immeasurable future cost of maintaining compatibility with the arbitrary changes coming down the pipeline from upstream.”

“Today, working on Haskell is like running on a treadmill. You can run as fast as you’d like, but the ground is always moving out from under you. Haskell’s an amazing language that could be doing greater things, and is being held back by this.”
I can’t imagine a worse sign for the language when these changes drive away the person who has likely done more than anyone to promote Haskell adoption in industry.
Don’t forget the creation of a working group specifically on this topic. It still remains to be seen if they have the right temperament to make the necessary changes.
The person who runs stack has been a constant source of really great libraries but also really painful conflict in the Haskell community. His choice to vocally leave the community (or something) is another example of his habit of creating division. Whether it’s reflective of anything in the community or not is kind of pointless to ask: it feels like running on a treadmill to him because he maintains way too many libraries. That load would burn out any maintainer, regardless of the tools. I feel for him, and I’m grateful to him, but I’m also really tired of hearing him blame people in the Haskell community for not doing everything he says.
Snoyman somewhat intentionally made a lot of people angry over many things, and chose not to work with the rest of the ecosystem. Stack had its place, but cabal has reached a state where it’s just as useful, barring the addition of LTSes, which have limited value if you are able to lock library versions in a project. While he may have done a lot to promote Haskell in industry, I know a lot of people using Haskell in industry, and very few of them actually use much of the Snoymanverse in production environments (conduit is probably the main exception, because http-conduit is the best package for working with streaming http data, and as such many other packages like Amazonka rely on it). I don’t know any people using Yesod, and many people have been burned by persistent’s magic leading to difficulties in maintenance down the road. I say all this as someone who recommended stack over cabal quite strongly, because the workflow of developing apps (and not libraries) was much more pleasant with stack; but this is no longer true.
As someone who’s been using Haskell for well over a decade, the last few years have been fantastic in the pace of improvements in GHC. Yes, some things are breaking, but this is the cost of paying off long-held technical debt in the compiler. When GHC was stagnating, things were also not good, and I would prefer to see people attempting to fix important things while breaking some others than to see no progress at all. The Haskell community is small; we don’t have large organisations providing financial backing to work on things like strong backwards compatibility, and this is the cost we have to pay because of that. It’s not ideal, but without others contributing resources, I’ll take positive progress in GHC over backwards compatibility any day (and even on that front, things have improved a lot; we used to never get point releases of previous compilers once a new major version had been released).
I think the breaking changes in haskell aren’t significant. Usually they don’t actually break anything unless you’re using some deep hacks.
Maybe for you. For me, and many other people who are trying to raise the alarm about this, changes like this cause an endless list of problems that are simply crushing.
It’s easy to say “Oh, removing this method from Eq doesn’t matter”. Well, when you depend on a huge number of libraries, it matters. Even fixing small things like this across a large code base takes time. But now I can’t downgrade compilers anymore unless I sprinkle ifdefs everywhere, so I need to CI against multiple compilers, which makes everything far slower (it’s not unusual for large Haskell projects to have to CI against 4+ versions of GHC, which is absurd). And do you know how annoying it is to have a commit go through 3 revisions before you finally get the ifdefs right for all of the GHC variants you CI against?
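To make the ifdef problem concrete, here is a minimal sketch of the kind of guard being described, using the real base-4.11 change that made Semigroup a superclass of Monoid. (Assumption: a Cabal build, which is what defines the MIN_VERSION_base macro.)

```haskell
{-# LANGUAGE CPP #-}
module Compat (Log (..)) where

#if MIN_VERSION_base(4,9,0)
import Data.Semigroup (Semigroup (..))
#endif

newtype Log = Log [String]

#if MIN_VERSION_base(4,9,0)
-- Semigroup only exists in base >= 4.9.
instance Semigroup Log where
  Log a <> Log b = Log (a ++ b)
#endif

instance Monoid Log where
  mempty = Log []
#if !MIN_VERSION_base(4,11,0)
  -- Before base 4.11, Monoid had no Semigroup superclass and
  -- mappend had no default, so it must be written out here.
  mappend (Log a) (Log b) = Log (a ++ b)
#endif
```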
Even basic things like git bisect are broken by these changes. I can’t easily look in the history of my project to figure out what’s going wrong. To top it all off, I now need my own branches of other libraries I depend on that haven’t upgraded yet. This amounts to dozens of libraries. It also means that I need to upgrade in lockstep, because I can’t mix GHC versions. That makes deployments far harder. It’s also unpleasant to spend 10 hours upgrading, just to discover that something fundamental prevents you from switching, like a bug in GHC (I have never seen a buggier compiler of a mainstream language) or, say, a tool like IHaskell not being updated or suffering from bugs on the latest GHC. I could go on.
Oh, and don’t forget how because of this disaster you need to perfectly match the version of your tools to your compiler. If you have an HLS or IHaskell binary that wasn’t compiled with your particular compiler version, you get an error. That’s an incredibly unfriendly UX.
Simply put, these breaking changes have ramifications well beyond just getting your code to compile each time.
Let’s be real though, that’s not the list of changes at all. The Haskell community has decided to define what counts as a breaking change so narrowly that it borders on absurdity. Breaking changes to cabal? Don’t count. Breaking changes to the GHC API? Don’t count, even though they break a lot of tooling. Even changes to parts of the GHC API that you are supposed to use as a non-developer of the compiler, like the plugin API, don’t count. Breaking changes to TH? Don’t count. Etc.
Is having notebook support for Haskell a deep hack? Because GHC has broken IHaskell and caused bugs in it countless times. Notebook support is table stakes for any non-toy language.

If even the most basic tools a programming language needs count as “deep hacks” that apparently deserve to be broken, well, that’s a perfect reflection of how incredibly dysfunctional Haskell has become.
In a sciencey kind of context only. Systems, embedded, backend, GUI, game, etc. worlds generally do not care about notebooks.
I have never, not even once, thought about installing Jupyter again after finishing a statistics class.
Interesting. May I ask why?
Because I have no need for it. That concept doesn’t fit into any of my workflows. I generally just don’t do exploratory stuff that requires rerunning pieces of code in arbitrary order and seeing the results inline. Pretty much all the things I do require running some kind of “system” or “server” as a whole.
Thank you for your answer. I always thought that work in a statistical setting (say, pharma, or epidemics) requires a bit of an explorative process in order to understand the underlying case better. Tools like SAS kind of mirror the workflow in Jupyter.
What kind of statistical processes do you work with, and what tools do you use?
I don’t! I don’t do statistics! I hate numbers! :D
I’m sorry if this wasn’t clear, but “finishing a statistics class” wasn’t meant to imply “going on to work with statistics”. It just was a mandatory class in university.
The first thing I said,

In a sciencey kind of context only. Systems, embedded, backend, GUI, game, etc. worlds generally do not care about notebooks.

was very much a “not everybody does statistics and there’s much more of the other kinds of development” thing.
Thanks!
Is having notebook support for Haskell a deep hack? Because GHC has broken IHaskell and caused bugs in it countless times.

I’ve never used a notebook in my career, so…

In any case, I think you’ve got a set of expectations for what Haskell is, and that set of expectations may or may not match what the community at large needs, and you’re getting frustrated that Haskell isn’t meeting your expectations. I think the best place to work that out is on the mailing lists.
They are pretty nice. Kind of like a non-linear repl with great multi-line input support. It can get messy (see also: non-linear), but it’s great for hacking all kinds of stuff together quickly.
Is having notebook support for Haskell a deep hack? Because GHC has broken IHaskell and caused bugs in it countless times.

The way that IHaskell is implemented, I would actually consider it a deep hack, since we poke at the internals of the GHC API in a way that amounts to a poor rewrite of ghci (source: am the current maintainer). I don’t know that it’s fair to point to this as some flaw in GHC. If we didn’t actually have to execute the code, we might be able to get away with using ghc-lib or ghc-lib-parser, which offer a smoother upgrade path with fewer C pre-processor travesties on our end.

Sure! I’m very familiar, as I’ve contributed to IHaskell; we’ve spoken through GitHub issues.
I wasn’t making a technical point about IHaskell. That was a response to the idea that some projects need to suffer because they’re considered “deep hacks”. Whatever that is. As if those projects aren’t worthy in some way.
I really appreciate the maintenance of IHaskell. But if you take a step back and look at the logs, it’s shocking how much time is spent on churn. The vast majority of commits aren’t about adding features, more stability, etc. Making IHaskell as awesome as it can be. They’re about keeping up with arbitrary changes in GHC and the ecosystem. Frankly, upstream Haskell folks are just wasting the majority of the time of everyone below them.
I can definitely relate to the exhaustion brought on by the upgrade treadmill, but nobody is forcing folks to use the latest and greatest versions of packages in the Haskell ecosystem and I also don’t think the GHC developers owe it to me to maintain backwards compatibility in the GHC API (although that would certainly make my life a little easier). A lot of the API changes are related to improvements in the codebase and new features, and I personally think the project is moving in a positive direction so I don’t agree that the Haskell folks are wasting my time.
At my current job we were quite happily using GHC 8.4 for several years until last month, when I finally merged my PR switching us over to GHC 8.10. If I hadn’t decided this was something I wanted to do we probably would have continued on 8.4 for quite a while longer. I barely had any issues with the upgrade, and most of my time was spent figuring out the correct version bounds and wrestling with cabal.

Could you not use something like the hint library, which appears to abstract some of that GHC API into something a little more stable and less hacky?

Great question! We wouldn’t be able to provide all the functionality that IHaskell does if we stuck to the hint API. To answer your question with another question: why doesn’t ghci use hint? As far as I can tell, it is because hint only provides minimal functionality around loading and executing code, whereas we want the ability to do things such as:

- have common GHCi functionality like :type, :kind, :sprint, :load, etc., which are implemented in ghci but not exposed through the GHC API
- transparently put code into a module, compile it, and load the compiled module for performance improvements
- query Hoogle within a cell
- lint the code using HLint
- provide tab-completion

Arguably there is room for another Haskell kernel that does use hint and omits these features (or implements them more simply), but that would be a different point in the design space.

So far in practice updating IHaskell to support a newer version of GHC takes me about a weekend each time, which is fine by me. I even wrote a blog post about the process.
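For anyone who hasn’t used it, a minimal sketch of roughly what hint does offer, assuming its Language.Haskell.Interpreter module: evaluating a string at a fixed, monomorphic type. Anything like :kind, :sprint, or tab-completion falls outside this surface.

```haskell
import Language.Haskell.Interpreter
  (as, interpret, runInterpreter, setImports)

-- Load-and-run is essentially the whole API: evaluate one
-- expression with Prelude in scope, at a concrete type.
main :: IO ()
main = do
  result <- runInterpreter $ do
    setImports ["Prelude"]
    interpret "map (+ 1) [1, 2, 3]" (as :: [Int])
  either (putStrLn . ("interpreter error: " ++) . show) print result
```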
Thanks for the thoughtful and detailed response! As someone who has used IHaskell in the past, I really want it to be as stable and easy to use as any other kernel is with Jupyter.
Me too! From my perspective most of the issues I see are around installing IHaskell (since Python packaging can be challenging to navigate, and Haskell packaging can also be challenging to navigate, so doing both together is especially frustrating). After that is successfully accomplished, not many people have had problems with stability (that I am aware of from keeping an eye on the issue tracker).
Python packaging is its own mess, so no matter what happens on the Haskell side there is likely always going to be a little pain and frustration. I was struck, reading your blog post, by how many of the changes you made were due to churn. Things like functions being renamed. Why couldn’t the GHC people put a deprecation pragma on the old name, change its definition to be equal to the new name, and go from there? It would be nice if all you needed to get 9.0 support was to update the cabal file.
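For what it’s worth, the pattern being asked for would look something like this hypothetical sketch (oldName and newName are made-up stand-ins; the DEPRECATED pragma itself is the standard GHC mechanism):

```haskell
module Example (newName, oldName) where

-- The function under its new name.
newName :: Int -> Int
newName = (+ 1)

-- The old name survives for a release as an alias; call sites get a
-- compile-time warning instead of a hard break.
oldName :: Int -> Int
oldName = newName
{-# DEPRECATED oldName "Use newName instead; oldName will be removed" #-}
```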
With the way churn happens now I wouldn’t be surprised if in a few months there is a proposal to just rename fmap to map. After all, this change would save many of us a character and be simple for all maintainers to make.

You’re right that it would be nice to just update the .cabal file, but when I think about the costs and benefits of having a bunch of compatibility shims (that probably aren’t tested and add bulk to the codebase without providing additional functionality) to save me a couple of hours of work every six months, I don’t really think it makes sense. It’s rare that only the names change without any other functionality changing (and in that case it’s trivial for me to write the shims myself), so deprecating things really only has the effect of kicking the can down the road, since at some point the code will probably have to be deleted anyway.

I think the larger issue here is that it’s not clear who the consumers of the GHC API are, and what their desires and requirements are as GHC continues to evolve. The GHC developers implicitly assume that only GHC consumes the API, and although that’s not true it’s close enough for me. I harbour no illusions that IHaskell is a particularly important project, and if it disappeared tomorrow I doubt it would have much impact on the vast majority of Haskell users. As someone who has made minor contributions to GHC, I’m impatient enough with the current pace of development that I would rather see even greater churn if it meant a more robust and powerful compiler with a cleaner codebase than for them to slow down to accommodate my needs as an IHaskell maintainer. It seems like they’re slowly beginning to have that conversation anyway as more valuable projects (e.g. haskell-language-server) begin to run up against the limitations of the current system.

I agree that the GHC API is generally treated as an internal artifact. I also think that between template-haskell, IHaskell, and the inadequacies of hint, there is a need for a more stable public API. And I think having one is a great idea. Libraries that let you more easily manipulate the language, the compiler, and the runtime are likely to be highly valued by programming language enthusiasts. Maybe more so than by the average programmer.
I don’t really want to engage with your rant here. I’m sorry you’re having so many issues with haskell, but it doesn’t reflect my experience of the ecosystem and tooling.
[Edit: Corrected a word which folks objected to.]
I don’t know what to tell you. A person took the time to explain how the tooling and ecosystem make it very hard to keep packages functioning from version to version of the main compiler. And you just offer a curt dismissal. It’s an almost herculean effort to write libraries for Haskell that work for all 8.x/9.x releases. This combined with a culture of aggressive upper-bounds on package dependencies makes it very challenging to use any libraries that are not actively maintained.
And this churn does lead to not just burnout of people in the ecosystem, but the sense that less Haskell code works every day that passes. Hell, you can’t even reliably get a binary install of the latest version of GHC which has been out for several months. The Ubuntu PPA page hasn’t been updated in a year.
Many essential projects in the Haskell ecosystem have a bus-factor of one, and it’s hard to find people to maintain these projects. The churn is rough.
I’m sorry for being dismissive. Abarbu’s response to my very short comment was overwhelming, and so I didn’t want to engage.
For upper bounds on packages I typically use nix flakes to pin the world and doJailbreak to ignore version bounds. I believe you can do the same in stack.yaml with allow-newer: true. The ecosystem tools make dealing with many issues relatively painless.

Getting a binary install of the latest version of GHC requires maintainers and people that care. But if, as abarbu says, “I have never seen a buggier compiler of a mainstream language,” then I would recommend not upgrading to the latest GHC until your testing shows that it works. If there aren’t packages released, or the new version causes bugs, then why not stay on the current version?
Breaking changes in the language haven’t ever burned me. If it’s causing people problems, writing about specific issues in the Haskell mailing lists is probably the best way to get help. It has the nice side effect of teaching the GHC developers how their practices might cause problems for the community.
A lot of us are saying this stuff as people who have used the language for many years. That you need to use nix + assorted hacks for it to be usable reflects the sad state of the ecosystem. I’d go as far as to say it’s inadvisable to compile any non-trivial Haskell program outside its own dedicated nix environment. This further complicates using programs written in Haskell, never mind packaging them for external users. I have had ghci refuse to run because somehow I ended up with a piece of code that depended on multiple versions of some core library.

It’s a great language, but the culture has led to an ecosystem that is rough to work with. An ecosystem that requires lots of external tooling to use productively. I could complain about the bugginess of GHC and how the compiler has been slower with every release for as long as I can remember, but that misses the real pain point. The major pain point is that the GHC team doesn’t value backwards compatibility, proper deprecation capabilities, or even tooling to make upgrading less painful. Their indifference negatively affects everyone downstream who has to waste time on pointless maintenance tasks instead of making new features.
For context, I started learning haskell about 11 years ago and have been using it extensively for about 7 years. I started when cabal hell was a constant threat, and if you lost your development environment you’d never compile that code again.
From my perspective, everything is much better now. Nix pinning + overrides and Stack resolvers + extra-deps are great tools to construct and remember build plans, and I’m sure Cabal has grown some feature along with “new-build” commands to save build plans.
That you need to use nix + assorted hacks for it to be usable reflects the sad state of the ecosystem.

I think having three great tools to choose from is pretty great. The underlying problem is allowing version upper-bounds in the cabal-project file format.

This further complicates using programs written in Haskell, never mind packaging them for external users.

After the binary is compiled, none of the compilation and dependency-resolution problems exist. Package the binary with its dynamic libs, or produce a statically linked binary.

It’s a great language, but the culture has led to an ecosystem that is rough to work with. An ecosystem that requires lots of external tooling to use productively.

I worked with Go for four years. When you work with Go you have go-tool, go-vet, counterfeiter, go-gen, go-mod, and at least two other dependency management tools (glide? glade? I can’t remember). Nobody is complaining about there being “too many external tools” in the Go ecosystem. Don’t get me started on Java tooling. Since when has the existence of multiple tools to deal with dependencies and compilation been a bad signal?

The major pain point is that the GHC team doesn’t value backwards compatibility, proper deprecation capabilities, or even tooling to make upgrading less painful. Their indifference negatively affects everyone downstream who has to waste time on pointless maintenance tasks instead of making new features.

This is biting the hand that feeds, or looking the gift horse in the mouth, or something. The voices in the community blaming their problems on the GHC team are not helping things, imo. Sorry. There’s a lot of work to be done, and the GHC team are doing a good job. That there is also active research going on in the compiler is unusual, but that’s not “the GHC team doesn’t value backwards compatibility” or “their indifference”; that’s them being overloaded and saying “sure, you can add that feature, just put it behind an extension because it’s not standard” and going back to fixing bugs or optimizing things.
This is an issue that has provoked the creation of a working group by the Haskell Foundation as well as this issue thread
https://github.com/haskell/core-libraries-committee/issues/12
Many of the people weighing in are not what I would call outsiders. I’ve contributed plenty to Haskell, and I complain out of a desire to see the language fix what is, in my opinion, one of its most glaring deficiencies. If I didn’t care, I’d just quietly leave the community like many already have. The thread linked above even offers some constructive solutions to the problem. Solutions like migration tools so packages can be upgraded more seamlessly. Perhaps some shim libraries full of CPP macros that let old code keep working for more than two releases. Maybe a deprecation window for things like base that’s closer to 2 years instead of one.
Like how wonderful would it be if there was a language extension like GHC2015 or GHC2020 and I could rest assured the same code would still work in 10 years.
Calling that a Gish gallop is pretty dismissive. It’s not like abarbu went off on random stuff. It’s all just about how breaking changes (or worse, non-breaking semantic changes) make for unpleasant churn that damages a language.
I understand the churn can be unpleasant. Abarbu’s response to me was overwhelming and I didn’t want to engage. I am sorry for being dismissive.
…do you know what a gish gallop is?
No, guess I do not, and I need to be corrected by lots of people on the internet. Thanks.
Ha, I didn’t know either, so I googled it, and it maybe sounded harsher than you meant. It was a gallop, for sure, if not a gish gallop.
Ngl, that kind of sounds like Rust
Why exactly? The Rust editions stay stable, and many crates have settled on 1.0 versions.
Perhaps this[1] was a lighter-weight solution to some of the problems the author mentions.
[1] - https://github.com/ghc-proposals/ghc-proposals/blob/master/proposals/0380-ghc2021.rst
I’m thinking that the first result when DuckDuckGo’ing “haskell2020” being this article is not a good sign.
That grammatical form in the title “all hope is not lost” drives me semi crazy, because it’s a valid statement and almost certainly not what they meant, which was probably “not all hope is lost”.
cf. moving universal and existential quantifiers around in boolean expressions.
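Spelled out (a quick formalisation of the distinction):

```latex
% "All hope is not lost": every hope fails to be lost.
\forall x.\, \lnot \mathrm{Lost}(x)

% "Not all hope is lost": at least one hope survives.
\lnot \forall x.\, \mathrm{Lost}(x) \;\equiv\; \exists x.\, \lnot \mathrm{Lost}(x)
```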
They’re the same unless you’re a mathematician.
Not to worry, no hope was lost whatsoever!
:)
I just stick with Haskell2010. Are there things I’d like to see included? Sure. Do I need them? No.