Speaking from experience, it’s quite difficult to keep up with pull requests for a popular programming language. It seems like the author of this blog post is implying that they will somehow be able to stay more on top of it than Walter. It’s not clear to me, and it doesn’t seem to be discussed in the blog post, what their plan is for making that happen. However, it’s a noble goal, and I sincerely wish them luck.
I already kept up with pull requests from upstream. I looked at every one of them, read every forum post and every bug report. It isn’t actually that busy (perhaps D isn’t a popular programming language? idk). A typical week would have ~30 PRs and < 10 bugs. Most of the PRs are utterly trivial and most come from the same people; if you want to look one over, it takes a couple minutes, and if you don’t, you can realistically trust the authors to have done their job and just automatically hit merge based on who wrote it.
Of course, if it really took off and had hundreds a day, that might be different, but as it is right now, keeping up isn’t difficult at all. We can cross that bridge when/if we get to it.
it’s quite difficult to keep up with pull requests for a popular programming language.
Quite difficult for a BDFL on their own? But perhaps with trusted maintainers for subsystems, the load can be lighter? Maybe challenging for popular languages to graduate to that point (requires popularity - chicken/egg) and/or challenging for BDFLs to delegate that responsibility?
On the OCaml side, we suffer from “maintenance bottlenecks”: there are more people willing to submit PRs than people willing to review them. “Trusted maintainers” are not a magical solution for this problem. There may not be enough people, or they may not be available enough. See this discussion thread on the OCaml Discuss.
In large part the problem comes down to the fact that in the culture of many projects and contributors, doing things yourself is more rewarding than gatekeeping other people’s work. If you are a hobby contributor, maybe you want to scratch your own itch rather than sift through CI logs to understand why someone else’s PR fails to pass. If you are paid by a company to contribute, unsurprisingly “I want feature X” is in higher demand from companies than “please release existing maintainers from a share of their review burden”, better considered by managers, etc.
Of course this is also related to quality expectations for changes. The review process I’m talking about is designed to be conservative, ensuring strict review of every change and erring on the side of caution. This naturally generates backlog. My intuition is that for programming languages, especially those that want to provide backward compatibility, being conservative is the right default – otherwise it is easy to end up with a Frankensteinian monster of broken design after a decade of so-so changes. But there are informed voices pointing out that maybe we are sometimes going too far; see, for example, in the Rust context: Stability without stressing the !@#! out, which is precisely about our quality expectations and their cost.
On the OCaml side, we suffer from “maintenance bottlenecks”: there are more people willing to submit PRs than people willing to review them.
The way out of that is for the maintainers who do review other people’s contributions to prioritise reviewing contributions from people who leave meaningful feedback on other people’s submissions. That way, when you come to the other PRs, they’re likely to have already had a review from someone else, so you can spend less time on them, or ignore them completely until they’ve been approved by someone you trust to do a first-pass review.
It’s a tradeoff. The more a single person can take on the ownership and maintenance burden, the more their domain enjoys a unified vision and avoids the pitfalls of “design by committee.” However, there’s obviously a limit to how much scope one person can take on. Sometimes, APIs and ABIs are as much a boundary between humans as they are between computers. Related talk: The Only Unbreakable Law - Casey Muratori
challenging for BDFLs to delegate that responsibility?
I once had to maintain a Clojure fork for 2 years, for a single triaged bugfix with a patch, waiting for it to get looked at and approved.
The Clojure core team is notoriously small, and unglamorous work bottlenecks on them: they don’t seem interested in it, but nobody else can do anything about it.
Interesting to see this from the perspective of Clojure, where development is incredibly slow, almost entirely in house, and optimized around Rich Hickey’s time and desires. This has led to multiple community clashes over the years as folks have tried to contribute more actively and have found that their patches can sometimes take 10 years to be reviewed, let alone merged. (example of the 10 year gap, which looks like it might be 11 years before it’s merged) The community tension led to Rich’s infamous “Open Source Is Not About You” rant, where he laid out that Clojure is his language and not the community’s, and that if you don’t like it, you can fuck off. (The swearing was implied, of course.)
There’s been no successful fork, partially because the compiler is complex and the code base is messy, and partially because the backwards compatibility guarantees made by the Clojure team make it hard to argue that breaking those guarantees is worth the other benefits.
For what it’s worth, I think Clojure is a wonderful and deeply flawed language, and don’t expect it to ever change. At some point, someone will design a better language for me and I’ll switch. Until then, there’s not much to be gained by jumping ship to a potential fork that keeps me from the rest of the community.
I wish the best of luck to the OpenD team, hopefully they can be successful in their work.
Heh, yeah, I had to maintain a Clojure fork for two years waiting for a bugfix to get merged. Mind you, this was after it had already been triaged and had a patch.
Oh dang, which bug fix?
CLJ-2065, which was a bug about reduce-kv failing on subvecs.
The issue was opened in 2016, first response from core team 10 months later, unvetted and unresolved until 2021. (It might actually be even older than 2016, since there was a JIRA migration at one point iirc.)
BTW, your link to CLJ-1162 is broken, try this one.
Disclaimer: I am a complete outsider. I know nearly nothing about the programming language D and its history.
What’s with the tone? Why do people get worked up and talk in terms of forced forks and other hostile realities, rather than just creating a fork with good sportsmanship?
To each their own. They prefer this, you prefer that… That’s great! Both can do their own thing, and who knows, perhaps in the future a large chunk of work can be incorporated into each other’s codebases?
Create a fork and scratch your own itch. That’s kind of the point of open source. Be thankful for being able to start from whichever point you want rather than from zero.
It seems like working with/around a frustrating BDFL for a decade and a half is, well, frustrating. I’m an outsider to D too, but given how many examples of “XXX submits a patch, the community wants it, the PR just sits there gathering mold” are mentioned here, it seems like the tone would resonate with a lot of the intended audience of D contributors who are also fed up enough to jump onto a fork.
My reading is that the author is attempting to save the D language itself, by expressing a clear vote of no-confidence in the project leader and attempting to lure every D contributor into his fork. I don’t think there is a “good sportsmanship” approach here, and I don’t think it’s up to outsiders to police the tone.
Same thing happened with Elm lang. We need some kind of new branding for open source projects which don’t want outside contribution, so it’s clear from the start whether you should submit PRs or not. I think on GitHub it is possible to disable PRs. That would be the right thing to do if you refuse to accept them.
On GitHub, sadly, it is not possible, which is pretty stupid. Recently I’ve seen a project use an auto-close bot, which is actually an infuriating experience, unlike simply not having a PRs tab in the first place, which is fine.
In Forgejo/Gitea you can disable PRs; I have them disabled in some of my Codeberg repos.
You can create a template for PRs. This is meant to tell people how to raise PRs, but you can also set it to a big notice saying ‘This project does not accept PRs, please do not raise any!’
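For instance, a notice along these lines in a .github/PULL_REQUEST_TEMPLATE.md file (GitHub picks that file up automatically; the wording here is just a hypothetical example) would appear in every newly opened PR form:

```markdown
<!-- .github/PULL_REQUEST_TEMPLATE.md -->
## This project does not accept pull requests

Please do not open PRs; they will be closed without review.
If you have found a bug, open an issue instead.
```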
Upstream D does accept a fair number of PRs and idea contributions - like I said in the post, much of what makes D great has come from the community. Small bug fixes are likely to be merged in a decent time; about half of all bugs opened are resolved within a couple months.
Where the paralysis comes in is if there’s the slightest bit of question about it. The hired hands will then say it can’t happen without word from their bosses… and their bosses are frequently absent, uninterested, or just plain unknowledgeable, and thus work grinds to a stop. If the lieutenants make a decision, even if it is unanimous among them, it is then subject to arbitrary reversion by the upper leadership later.
What I wish they’d do is just explicitly delegate in those cases. Say “I can’t or won’t comment on this, so I’ll trust your judgement”, or “I think you’re all wrong, but since you’re virtually unanimous, my veto is overridden”. But what more often happens is things just stall, and endless circles of discussion posts happen, ending either nowhere (most common) or with Walter doing his own half-baked implementation instead that fails to incorporate the work done before.
If you’re content with things the way they are or just want some little bugs fixed, upstream is fine. If you want more though, good luck, it is an uphill climb. And after spending countless hours over the years trying, including several more pushes at the end of 2023 that also proved fruitless, I’m just fed up with it. I tried to be diplomatic in the blog, but I’m sure a bit of the underlying frustration came through anyway.
I haven’t worked with D since the old Phobos/Tango days.
Having two “standard libraries” was weird and something I never understood. I guess this explains it a little.
I always wondered why D never took off back then. It offered a lot of convenience over C/C++ and, in a lot of cases, a lot of performance over Java. It seemed like a very neat language that could cover the majority of what C/C++ was being used for back then, and do some of the Java stuff better than those languages did.
I first tried D in 2006, and got really into it by 2007, right about the time it officially got its 1.0 and 2.0 labels. (D 1.0 was tagged in January 2007 and D 2.0 was tagged in June 2007.) I heard about it because I was a user of Digital Mars C++ previously and wanted to check for updates out of nostalgia more than anything else (I used DMC++ to make DOS games, and 16-bit support was not that important as a practical matter by 2006 lol), and saw the D link and was like “oh yeah i should try that some day”… and then was like “well, today is a some day”.
I very quickly found it to be fantastic - most of what I liked about C and C++, but also most of what I liked about PHP and Javascript… all in one language. I still think this is D’s biggest strength: being able to blend these higher and lower level concepts in one place pretty seamlessly (though the compile-time reflection thing is a competitor for biggest strength, it is only useful because you can use it with that blend of code; if I could reflect over structs but strings were still hard to use, it’d fall short of PHP’s strengths).
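To make that concrete, this is the kind of blend I mean: compile-time reflection over an ordinary struct, printed with easy strings (a quick sketch from memory, not polished library code):

```d
import std.stdio : writeln;

struct Point { int x; int y; }

void dump(T)(T value)
{
    // static foreach iterates the member names at compile time,
    // so only the writeln calls exist at runtime
    static foreach (name; __traits(allMembers, T))
        writeln(name, " = ", __traits(getMember, value, name));
}

void main()
{
    dump(Point(3, 4)); // prints: x = 3, then y = 4
}
```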
Anyway, I can’t really speak to its popularity back then on an absolute scale. Certainly seemed active enough to me, but without anything to compare it against, that’s not saying much. What I can compare is relative activity in the years since.
Activity in the D channels went up, hitting a peak somewhere around 2013, 2014ish, and then went back down. I think part of the trouble is D keeps chasing after the next trend, and… never quite delivering. There was a big push toward becoming thread friendly, about the time Go was making waves: immutable variables added, shared type added, pure functions added, std.concurrency and std.parallelism added. But even today, most of these don’t really hit the mark. shared is still not well defined for what it is actually supposed to do. immutable works well for strings, but feels half finished for anything else. Pure is kinda elegant on paper but hasn’t actually changed anything in practice. Etc.
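Each of those pieces does work in isolation; here is a minimal sketch of how they were meant to compose (assuming a recent dmd/Phobos; untested):

```d
import std.parallelism : taskPool;
import std.stdio : writeln;

// pure: the result depends only on the argument, so concurrent calls are safe
pure double halve(double x) { return x / 2; }

void main()
{
    // immutable data may be freely read from many threads without locking
    immutable(double)[] xs = [2.0, 4.0, 6.0, 8.0];

    // std.parallelism: amap eagerly maps the function across worker threads
    auto halves = taskPool.amap!halve(xs);
    writeln(halves); // [1, 2, 3, 4]
}
```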
Then Rust came out and D had to chase that too. The memory-safe subset was already a thing before, but then it got the focus of attention. All the half-finished thread stuff just dropped; now it is all about nogc, betterC, and ownership/borrowing. None of this delivers a compelling experience on its own, and especially not when you could just use Rust itself.
Of course, the last few years, the big focus has shifted to writing yet another C compiler embedded inside D. And again, it has some potential but just seems to sit forever half-finished, with identified flaws just going unaddressed.
Mind you, this half-finished collection of things isn’t bad… it just isn’t great. So it makes it hard to justify picking good-but-not-great parts of D when you can have those more mainstream alternatives available. Even if they’re not great either, they’re going to feel safer by virtue of just being more popular. Meanwhile, the parts of D that most of us users find most compelling are just put on indefinite hold.
But remember, this is all counterfactual speculation, I’m sure if you asked a bunch of other people you’d get a bunch of other answers, and we can’t exactly peer into an alternate timeline to see what would have actually happened.
Is it fair to say that OpenD is going to be chasing more after Go, Python, .NET, etc. in terms of niche? Use cases that balance productivity and performance, rather than fully focusing on low-level performance as betterC would. Embracing the GC sounds like it’ll keep away those who’d currently pick C++ or Rust.
This is not a criticism, I’m just curious. Natively compiled, fairly fast, GC’d languages are a space which has not been saturated yet imho. Go is certainly the biggest contender in this space right now (I use OCaml, which is quite small but does live in that space too). OpenD could be a more powerful/featureful Go. That means that you also need to have a good concurrency story that can use multiple cores efficiently, including having a GC that plays well with that.
Yeah, I think that’s fair. If you look at the next article after the linked one (so here: https://dpldocs.info/this-week-in-d/Blog.Posted_2024_01_08.html#gc ) I went into a bit more detail about the GC statement.
Back in the day before betterC was a thing (if you search the archives of my blog you can find some of my early statements on that too, but meh, not that important), you could still do all the same things we call betterC today. It actually wasn’t hard either; my DConf 2014 talk was about a bare metal D implementation I toyed with (though I actually did implement exceptions). I also have a stripped, customized runtime for webassembly more recently. (btw my work on these formed the starting points of the PowerNex kernel project and Hipreme game library, respectively)
I have no intention of breaking these things. But I also don’t want them to distract from other work; “embrace the GC” means if a feature requires GC, that’s fine. If it needs a better GC, let’s see if we can fix the implementation so it can do it (easier said than done, I know. And worth noting there is someone working on this upstream and his work looks very promising). Similarly, “dropping focus on betterC” doesn’t mean what you’re doing there will stop working. It just means it is more back to how it used to be: I hope you find it useful, but it is provided without warranty; you’re off the beaten path and can no longer expect official special treatment from the language itself.
What about concurrency? In particular, threading and lightweight fibers/coroutines, or whatever enables cooperative concurrency? I think it’s becoming increasingly hard to compete with Go if you don’t have a solid story in that department.
I personally looked at D a few years ago (mostly out of curiosity, not need) and I was a bit uneasy at the apparent lack of a stability policy. Go, Rust, etc. have a very overt policy that they never break user code[^1] and I didn’t see anything similar for D. The other thing that set D back in my eyes (as an OCaml user, note) is the lackluster support for sum types. Library-only support, like C++’s std::variant, can’t be as good as something supported by the language.
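For comparison, here is a rough sketch of D’s library-level approach (std.sumtype in Phobos, if I understand it correctly); workable, but it is templates rather than syntax:

```d
import std.stdio : writeln;
import std.sumtype : SumType, match;

// a tagged union assembled from templates rather than dedicated syntax
alias Shape = SumType!(double, int); // double: circle radius, int: polygon sides

void describe(Shape s)
{
    // match dispatches on whichever alternative the value currently holds
    s.match!(
        (double r) => writeln("circle of radius ", r),
        (int n) => writeln("polygon with ", n, " sides")
    );
}

void main()
{
    describe(Shape(2.5));
    describe(Shape(6));
}
```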
I’d be curious to hear your thoughts on that :)
[^1]: with specific caveats for security or actual bugs but it’s really strong policies overall.
D’s core runtime lib has… building blocks for threads and fibers, though the default fibers are a bit heavyweight since they are general-purpose use and thus save/swap more context than is likely needed. Still, they’re perfectly serviceable. What I think is missing is a standardized higher level library interface. The std.concurrency module is one of those “good-but-not-great” things and it doesn’t actually interop with much … at all. If you use it and just it, it is ok, but try to mix with platform i/o or window events or something and you get trouble. I have ideas to improve on this, mostly through library solutions, but it’ll take time to get to a usable state.
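To be clear, the raw message-passing building block itself is fine on its own; the typical shape is something like this (an untested sketch):

```d
import std.concurrency : receiveOnly, send, spawn, thisTid, Tid;
import std.stdio : writeln;

void worker(Tid parent)
{
    // blocks until an int message arrives, then replies to the parent
    auto n = receiveOnly!int();
    parent.send(n * 2);
}

void main()
{
    auto tid = spawn(&worker, thisTid); // start a worker thread
    tid.send(21);
    writeln(receiveOnly!int()); // 42
}
```

The trouble, as above, starts when you need this to cooperate with an event loop or platform i/o rather than live in its own little world.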
a bit uneasy at the apparent lack of a stability policy
Indeed, this is something that really came up a lot in 2023. They have been saying for years that a key goal is to stabilize the language, yet there’s a random breakage every few months, and the definitions in the spec keep changing (in particular, the meaning of the in keyword has changed, I think, 3 times in 3 years). Most of these things relate to the push for memory safety, but the implementations have not consistently sparked joy. (They sometimes have, though!)
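For readers who haven’t followed it: an in parameter has drifted between meaning plain const and, under the -preview=in switch, const scope (with the compiler allowed to pass large types by reference). A trivial example that compiles under both readings, though what you may do with s differs:

```d
import std.stdio : writeln;

// 'in' gives a read-only view of the argument; under -preview=in it is
// also 'scope', so the reference must not escape the function
size_t firstSpace(in string s)
{
    foreach (i, c; s)
        if (c == ' ')
            return i;
    return s.length;
}

void main()
{
    writeln(firstSpace("hello world")); // 5
}
```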
There was one particular feature last year that got deprecated just because leadership wasn’t interested in defining its corner cases; they kinda regretted ever adding it and thus would sit on proposals to fix it (many of which would be very non-invasive, such as just changing the spec to match the implementation) and chose to deprecate its use in such cases entirely instead. Well, this broke one moderately popular library and annoyed its maintainer to the point where they made a farcical Twitter video showing like 30 seconds of the compiler spewing deprecation messages. (Mind you, it did print tens of thousands of messages, but they actually sourced back to just a couple hundred lines of code… but this deprecation had no migration path at all, so even if it was just one line of code, it’d leave you wondering: what do you do? Redesign the library and break all its downstream users? Cease support and tell them the max supported compiler is version X? There is no obvious answer, and upstream didn’t even try to provide one.)
Anyway, the farcical video and social media laughs got the attention of leadership who then decreed: that deprecation would be reverted and no more changes to the language. Ever. All code currently on the internet must continue to compile, unmodified, for perpetuity. Well, ok, we will want to add and change things, so the deputy leader got tasked with writing a proposal to add “editions” to the compiler, so you can opt into changes. But until this is done, no more changes! No additions either, until editions.
…then the next release broke things again. Minor breakage, reverted in a later release, but still, it happened and it shook confidence in the stability promise again. And the editions proposal has been pending for months. So it feels like yet another thing that gets talked about, but action is stalled indefinitely.
That said, however, I really think these worries of breakage are overblown. I have code I wrote in 2007 that still works fine. Some 6% of my code has never seen a change in git at all; git blame still points back to “initial commit” back in 2011. There is a core to the language that is pretty reliable, and you can be very productive with it. When you update, you’re more likely to see library breakage than language breakage, unless you try to use some of the newer features; then your mileage varies. But avoid the bleeding edge and tbh you’re probably ok. I - and most people who answered the old community surveys - feel stagnation is a bigger risk than breakage.
As for the OpenD policy (btw I don’t really want to be a new dictator, but it is basically me calling the shots rn; my proposal here is the same as what I have been telling upstream for years - some kind of representative steering committee should make the big decisions, then delegate the specifics to key maintainers - but at the moment, I’m a temporary president to usher in a transition to democracy… eventually… lol, but I always caveat that my opinion is not necessarily the final opinion of the project), it is at that same blog link above, following the GC section: you have to weigh the costs and benefits of each decision. Minimize breakage, but if something must be done for a greater good, go ahead and break it - just make sure the migration path is as frictionless and well-documented as possible, and try to keep to the principles of progressive enhancement and graceful degradation; if a user doesn’t update their compiler but the library author does, they should both be ok.
Remember, I personally maintain a quarter million lines of D code in my own open source library with my own long term commitment, including the library still working whether you use version 2.088 or 2.105 (or older sometimes, but my general policy is to test on a 3 year old compiler)… and with this fork, I’m now responsible for almost a full million lines including that… and I also have commercial support commitments… so I have a decent amount of personal incentive to favor stability. A bug we know how to work around might be better than a fix that brings unknown risks. But… sometimes we just plain want it fixed. Hence the case-by-case weighing.
Who is running this? There is a lot of talk about how they’re part of the D community and they’ve submitted patches, etc. etc., but who are they?
I started D back in 2006, getting more serious in 2007. I first subscribed to the D mailing lists and joined the chatroom in about 2010. Like I said in another comment here, since then I’ve kept up with almost everything. There are 2,642 all-time questions on Stack Overflow tagged D. About 704 of them were created prior to June 2012, when I joined SO. Since then, I answered 369 of them myself, so 19% of the total since I joined. (Note: 1,443 of the SO questions came between June 2012 and December 2016; only 495 total questions since 2017. Of course, Stack Overflow as a whole has been declining lately. I’m using it more because it is an easy database to query than anything else!) Both the #1 and #2 IRC users by post count are me: my old nick and my new nick. It is harder to say for sure on the other sites than on SO, but my activity on all of them is likely about proportionally the same; I’ve probably personally answered about 1/6 of all new-user D questions over the last decade. My “D Cookbook” started as a compilation of many of these answers, and I also frequently help other experienced D users with their problems as well.
In addition to supporting users of all skill levels, I have also taken an active part in a great many design discussions and implemented several things myself; most of these discussions have been a frustrating waste of time, but nevertheless, you’ll find some of my code in the compiler and standard library, among other places.
I’ve spoken at two in-person DConfs and given two online presentations. My blog also has over 400 articles on D (though many of them are not especially high effort, so that number is padded a bit; many of them are substantial, though).
I also maintain just shy of a quarter million lines of D library code (~50k of that is user contribution, the other 200k are my own creation since 2008) spanning 70 distinct areas - a significant fraction (again about 1/6th is a fair approximation) of the whole library ecosystem. Indeed, several other packages still have copy/pasted code of mine.
I didn’t expect to see my blog on this website lol, I mostly write it for the handful of regular readers who all know me pretty well already. But if you click down the other articles you can get an idea what I do.
@adam_d_ruppe
The name of the language is right there in his name! Such strong credentials!
Jokes aside, Adam’s been involved a long time and wrote a book on the language.