This is just… Super false?
It’s not the most popular language of the week, but it never was
The last section of the post is really really really super duper ultra mega important to read before commenting.
Also the header says that it’s fiction which supposedly takes place in 2030.
Having now read it, I guess the whole post is in bad faith
Not really. You just commented before you finished reading. Next time, read the entire article first. It can cost you the first comment, but sometimes it’s more important to have a complete and well-thought-through comment than to be the first person to reply.
It’s shitty to question motivations, assuming they didn’t finish because they were rushing to reply first. If I drink a glass of milk and find that it is spoiled, I don’t have to finish it to say it’s no good. If satire is insufficiently engaging for folks to get to the point buried at the end, then it ain’t good.
Wouldn’t saying the article is written in bad faith also fall under the same umbrella?
I see your point, but it’s not congruent with the evidence. Their top-level reply, which really was the first one according to the timestamps, does not engage with the bulk of the transcript. Instead, it picks one claim from near the beginning and counters that claim alone. I can only think of two reasons for this reply: either it was meant to be the first reply, or it was an emotional reaction to somebody talking negatively about Haskell. Either way, we can learn a lesson.
Nothing about the article changes my comment. The epilogue does let one know the whole thing is a bad-faith troll, but in case someone else is tricked into thinking it’s a real argument, my comment stands.
Satire is neither bad-faith nor a troll. It must be unlabeled in order to make its point. See The Onion’s recent amicus in favor of unlabeled satire, in Novak v. City of Parma. Additionally, one could argue that the post is like a koan or snowclone; the juxtaposition of unexpected words within a preselected template is meant to interrupt the reader’s typical worldview.
The post is also from over two years ago …
I think the last line on the Gist should’ve been the first:

As one might have guessed, this is not an essay. It’s a transcript of the following talk by R. Martin with some substitutions made (SmallTalk -> Haskell, Ruby -> Rust, and others). You are free to make any conclusions from this.
The section where they said it was used in a big payroll project is the bit that gave it away for me. I enjoyed coming to that realization on my own without having it spoon-fed to me, but I understand that it would certainly drive away a certain class of reader.
Read it right to the end. Or at least the epilogue.
How dare you ask me to violate the poster’s creed.
In case anyone’s wondering about context, it seems like, if you click on the title of a post at the top of the page, it opens a whole new page. I’m not sure when this was added but I think it’s a recent feature?
Considering how the argument devolved into increasingly less specific platitudes, I assumed it would be revealed that ChatGPT wrote the lot of it. The actual epilogue surprised me.
Starting off such a diatribe with “Some of you know what that means, and the rest of you will wonder for a very long time”, to shit on people not “smart enough” to know what higher-kinded types are, is not super compelling; it just makes me read this as FP elitism all over again. Also, HKTs are not special if even C++ has them.
A lot of things contributed to making Haskell “fail” in the general language-popularity sense, not least of which was the FP-elitism signalling that was endemic to the community, and the insistence on using non-standard terminology for routine operations done elsewhere. (Seriously, I had one of my master’s examiners trying to get me to replace free variable analysis, hoisting, etc. with the various FP terms that have zero meaning outside of FP research, when the whole point of my thesis was getting Haskell running on non-FP-centric runtimes.)
There are numerous things that contributed to causing Haskell to “fail”, but the standard ones are about performance:
CPU performance: you can write fast code, but that is not the default behaviour, and frequently many of the first required steps involve manually inhibiting the things that ostensibly make Haskell “great”.
Memory performance: harder to tackle, and again requires significant work, as the default behaviour is significantly more memory-hungry. Even with that effort, the GC overhead rules it out for most production kernels, and completely rules it out for low-resource devices, where the dead memory needed to maintain GC performance is not feasible.
But I think the real one is that more or less all of the features highlighted as making Haskell better exist in imperative languages as well now; they aren’t inherently tied to the lambda-calculus model of computation: higher-order types, lambda expressions, and type inference are all pretty much standard, even in C++. Haskellers love to say that “Haskell is the best imperative programming language”, when in reality Haskell’s support for imperative logic or interfaces is vastly clunkier than any modern imperative language’s support for functional development.
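For what it’s worth, the claim that these features are now standard in imperative languages is easy to illustrate. Here is a minimal sketch of my own in Rust (modern C++ would look much the same): a higher-order function, closures, and type inference for locals, none of which requires a functional computational model.

```rust
// Higher-order function: takes any closure from i32 to i32.
fn apply_twice<F: Fn(i32) -> i32>(f: F, x: i32) -> i32 {
    f(f(x))
}

fn main() {
    // Type inference: the closure's parameter type and the iterator's
    // element type are both inferred from usage.
    let inc = |x| x + 1;
    let squares: Vec<i32> = (1..=5).map(|x| x * x).collect();

    println!("{}", apply_twice(inc, 3)); // prints 5
    println!("{:?}", squares);           // prints [1, 4, 9, 16, 25]
}
```

None of the names here come from the thread; it is just the generic pattern the comment is gesturing at.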
Rust does not have those issues, because it put the practical concerns first: “we need this to work as a systems language”, “it needs to not have an alien computational model”, and “the defaults must be biased for practical performance”. That meant it is useful across a wide array of different environments, and as such is also of interest to actual corporations who can fund it (Mozilla, Google, ..)
I don’t think any of the mentioned things apply.
I don’t get the point about “discipline” related to documentation. Rust managed to grow despite not having cohesive documentation for the language itself (i.e. a proper spec), but it does have good books, error messages that almost teach the language, and docs.rs.
While Rust is assertive about the things it’s good at, the community is not insular. I do worry it’ll have its “eternal september” and it won’t be possible to manage what “Rust community” is any more, but that’s the opposite problem: it’s not from rejection of others, but from blending with the general programming community.
Rust already has good adoption in enterprise, to the point people worry that the Rust Foundation is too corporation-oriented.
Rust has got way more traction than Haskell ever had (1, 2). Something will eventually dethrone it, but I don’t think it’s facing the same problems as Haskell.
I agree about both documentation and existing wide use. If I accept the author’s offer to draw my own conclusions, the perception of snooty Rust evangelists may have been the part that fit best to the original text, and the rest was left as-is to make the joke/trick work.
I thought the thing that killed Haskell was the same thing that killed APL (coding samples with letters that don’t exist on IBM compatible PC keyboards)
Reading the epilogue, I think we need to seriously consider that basically anything in this vein is a recycling of prejudices. There’s a lack of detailed study to back these things up, and the sample size of popular programming languages that become unpopular is actually quite small; often they have not actually shrunk in terms of absolute numbers of users, or the absolute (inflation-adjusted) amount of business being done by systems developed in the language.
The Smalltalk -> Haskell equivalence kind of fails because stuff like

There was a time when Haskell was the language to watch. During the late 2000s through the 2010s, Haskell was the language everybody wished they could program in,

reads like a sentence from an alternate reality
That’s the point. It is the second big clue, after a date of 2030, that this is a sort of alternate-universe talk… and since you know that this is not the universe we are actually in, the intent is to make you curious about where it’s from: what the alternate world is where Haskell was so dominant, what the author is getting at, and what this thought experiment has to teach us.
I’d initially drafted a longer post, and it started like this:

I’m still debating whether I should post this because, well, it’s not nice, and also not quite code, and I’m still angry at myself for that one time when I started a political flamethread here and I’m afraid of doing that again.
Now, “debating whether I should post something” is usually a good indication that I shouldn’t. But I will say this: this article is two years old by now, and it keeps getting posted and reposted, and it gathers quite some reaction, often by, and from, people who aren’t part of the Rust community.
Now that could be just because people like flamewars. There’s a bit of comp.lang.lisp in every one of us.
I would note, however, that the way the community of a budding language is perceived, and the way it treats people who have marginal experience with that language, but a lot of experience in fields where the language might be adopted, play a huge role in the adoption that the language eventually experiences.
The number of factually accurate points that this tongue-in-cheek essay can make is obviously limited by the “artistic device” that it employs. But the mere fact that it was written, and that people outside the Rust community get it, is indicative of more than just how conservative and passive-aggressive programmers with backgrounds in other languages are.
Haha, I love the saltiness by Rust-evangelists as a reaction to this essay! Take it easy, people, it’s just a programming language and the essay is only half-serious.
In a way, it proves some points of the essay about the Rust-community, though, which is a shame given Rust could be so much more without the arrogance and toxicity of many of its proponents.
But what is the point of it, if the whole thing is just a joke from someone replacing two words and you shouldn’t take it seriously? Either this is meant as a real warning, so people take it seriously and write up why it can or can’t be true, especially since there are some contributors on this site. Or it’s just flamebait from people wanting to see “the saltiness by Rust-evangelists”, and it actually has no real content, in which case we could either flag this submission as off-topic or tag it as satire.
the arrogance and toxicity of many of its proponents

I think it is unkind to call people toxic for writing factual reasons why this is wrong (even the non-Rust part). Also, where are people showing arrogance here?
I could think of multiple actual reasons why Rust could fail (and some people can do that even better), so I’m actually interested in whether some shitpost like this may have some truth to it. Throw me a list of criticism that goes high on Hacker News about my personal passion project, and, lo and behold, I’ll try to reason about whether it is actually true. We’ve seen this from “C people” a lot of the time when “their” language got criticized (which happened a lot, honestly).
it proves some points of the essay about the Rust-community, though

Oh, my, is that ever true.
I am a tech journalist. I mostly try to avoid writing about Rust now, because whenever I do, the Rust fans are inevitably up in arms and call me biased, fling accusations, write to my editor to complain about me, bitch endlessly in various fora, and so on.
The only other group who give me so much grief are the old-school C fans, who also cannot abide any criticism of their beloved language. But C is a fading star now, and the world is getting more and more over it, and so it can be quite fun to mock C enthusiasts. Of course, there are also things like people telling me to kill myself on Twitter. :-/
Most other programming languages seem to have communities where if you mention their language, they are pleased about it. They might ask for changes, or add more info, but they’re happy someone gets it.
If I write anything about Rust that is not unstinting praise, untainted by the slightest mention of criticism, then I will get some angry, unhappy fans on my case minutes later. It makes me want to avoid the whole area, TBH.
Thanks for your response and reflections!
Well, even though I’m an old-school C fan myself (I’m a member of suckless.org, after all :D), I know its weaknesses and embrace static-analysis tools to balance things out a bit, but I wouldn’t ever start a flame war to “win” something. The most decisive aspect is probably usage rate, and Rust gained a lot in this regard with Google adopting it for Android, to give just one example. But Google tends to change its mind often, and might end up with its new language Carbon in the end. Most bugs may be memory-related, but those are already mitigated in languages that offer much weaker memory guarantees than Rust.
A big aspect of software for me is how solid a foundation it is to build something atop, and Rust is really lacking in this regard: the language does not have a formal specification, the packaging system triggers my npm PTSD (“left-pad”), and even trivial programs end up depending on a multitude of scattered GitHub projects, due to the lack of decisiveness regarding a useful standard library; there don’t seem to be any provisions to stabilize this in any way. Writing something in Rust now amounts to a big up-front investment in the belief that all these tiny microlibraries will be well-maintained for years to come.
Given Rust is still relatively young, it hasn’t yet gone through this process, but it will at some point. And I’m not too optimistic about economic and other factors; maybe we end up with a more federated internet again at some point, and centralized hubs like GitHub (owned by Microsoft, after all, which probably gets a lot of tracking data from all the Cargo installations pulling in dependencies) will not be as solid a ground as we believe them to be. Packaging systems like Debian’s are very good examples of decentralization, and it’s painful for them to package Rust applications, given their authors’ insistence on rapid change.
Companies will also do the math: Rewriting is always a tough choice linked with high costs, and that’s where Google really kicks in with Carbon and its easy “translatability” from C++.
From my monitoring tool
Not a Rust nor a Haskell developer, but I think that, two years later, things seem to look good on the survival side.
Functional programming languages are often put into the “for math” corner, for many good and bad reasons.
I think Rust making clear it wants to be a C++ replacement, and sticking to that premise, helped. I also think the trope of Rust being super hard helped with making the community more welcoming. I know Haskell had that for a short time as well, but being more focused on purity, which feels like a core value there, this was only reflected in tutorials, and less in trading purity for convenience. As someone who now and then reads news on Rust when it shows up on Lobsters, I think they actually at times feel more like doing the Perl thing (TIMTOWTDI), adding semi-hacky convenience. I personally tend not to be a huge fan of this, because it makes it hard to understand the whole language, and code from X years ago starts to feel hard to impossible to read, especially for newcomers who don’t know the old ways.
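The “more than one way to do it” feel can be illustrated with a small sketch of my own (the function names are hypothetical, not from the thread): reading an integer out of an optional string, with a default, has several equally idiomatic spellings in Rust, each fashionable at a different point in the language’s history.

```rust
// Four equivalent ways to turn an Option<&str> into an i32,
// defaulting to 0 on absence or parse failure.

// Explicit pattern matching.
fn v1(s: Option<&str>) -> i32 {
    match s {
        Some(t) => t.parse().unwrap_or(0),
        None => 0,
    }
}

// `if let` sugar over the same match.
fn v2(s: Option<&str>) -> i32 {
    if let Some(t) = s {
        t.parse().unwrap_or(0)
    } else {
        0
    }
}

// Combinator chaining.
fn v3(s: Option<&str>) -> i32 {
    s.and_then(|t| t.parse().ok()).unwrap_or(0)
}

// `map_or` collapses the chain into one call.
fn v4(s: Option<&str>) -> i32 {
    s.map_or(0, |t| t.parse().unwrap_or(0))
}

fn main() {
    let fns: [fn(Option<&str>) -> i32; 4] = [v1, v2, v3, v4];
    for f in fns {
        assert_eq!(f(Some("42")), 42);
        assert_eq!(f(None), 0);
    }
    println!("all four agree");
}
```

Whether this richness is convenience or accretion is exactly the judgment call the comment above is making.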
To be honest I think this is a big reason for why new languages always feel so great. They have none of these hacks and old and new ways of doing stuff.
Like I said, I’m not a Rust programmer, but given that this hits a lot of languages, and given that Rust is used in enough areas that it likely won’t go the Haskell way in the mid to long term, I think in the long term it might accumulate a lot of “old baggage”. That’s usually especially true for the ecosystem parts. The hype usually goes down over time, but code and libraries stick around, and maybe someone wants to do a Rust 2 to get rid of that baggage, but that might be when people switch away for good. You need to be really big, like Python, to survive this, and even there it took a long time. And compared to other languages, Python’s changes were actually not that big for many projects.
Epilogue notwithstanding, I think the decrease in excitement around Haskell (to whatever degree we can agree it is real) has a lot less to do with the language, programming style, etc., and a lot more to do with the cloud. At the end of the 00s, everyone realized that the improvement in CPU speed was going to tail off, and we needed to focus on parallel execution on one hand and performance-per-watt on the other. Then came the cloud, which made it easier to parallelize across machines rather than across CPU cores. The server rack-space issue and the energy issue became the cloud provider’s problem (margin?). Eventually that trend will slow, and we will once again be focusing on compute density at smaller scales. I’m not sure what that will look like, or whether enthusiasm for Haskell will pick back up, though.