Lots of small language-level improvements, but there don’t seem to be that many fundamental compiler architecture issues mentioned in the changelog. I may be missing those since I haven’t used Nim for a long time, but I assume if the compiler was made 10x less janky they would’ve at least had a footnote about it. (For context, I’ve used Nim as my main language in 2019-2020.)
I don’t like that there still isn’t my #1 most awaited feature - incremental compilation. Nim isn’t a very fast language to compile and the slowness really showed with some of my projects. In particular rapid, my game engine, suffered from pretty terrible compile times, which made building games with it really difficult.
And I wonder how much generics jank is still present in 2.0. And I wonder when they’ll make generics into proper generics rather than C++-like templates. The error messages with those can get pretty abysmal and can appear out of nowhere. Like here. This particular cryptic error has appeared so many times while I was developing my game that I just gave up on the project eventually.
In addition to that I’m not a fan of how lame the module system is. Having used/seen module systems from other languages, eg. Rust or Go, Nim’s compiler not having a clue about the concept of packages feels pretty ancient. This particularly shows when you’re developing libraries made out of many small modules; the only way to typecheck such a library in its entirety is to create a library.nim file and import everything there. Text editors didn’t seem to understand that back in the 1.4 days and would regularly miss type errors that occurred when analyzing the library as a whole.
Oh, and the text editor situation… Nim’s compiler does not have incremental recompilation, so the autocomplete gets slower the larger your project is. And it can get really slow for even pretty small projects (~10k SLOC.)
And don’t get me started on the dialects. Endless --define switches and experimental features. And none of it is implemented in a robust way. Anyone can break your library by globally enabling a --define you did not anticipate. And the --defines are not even documented in one place.
So sad to see Nim’s leadership pursuing syntax sugar and small convenience features instead of fixing foundational problems. Really wish they had a more forward-looking vision for the project and its community, rather than focusing on fulfilling the main developer’s wishes and experiments.
The Nim leadership is the main developer, Andreas. He’s not interested in sharing responsibility or broadening the leadership, as he vehemently expressed a month ago:
I lost all interest in setting up a foundation because I lost faith in mankind. Every day I wake up in this clownworld where people replaced the “master” branch with “main”, sending a strong message “fuck you” to everybody who is older than 50 and has a hard time to change old habits.
Here is a hint: If you are obsessed with racism and sexism it’s because you’re a racist and sexist.
That was the point where I gave up on Nim. I don’t know where to start with this — it’s laughable that he pulls out that silly fight about master/main as his breaking point; he whines about society changing and it’s their fault he might have to “change old habits”, and that tired canard about racism/sexism. (He also appears to have deleted the comments objecting to his post, though not the supportive ones. Because of course he’s the forum moderator too.)
But my main takeaway is that the guy’s even more of an asshat than I previously thought, and he’s going to remain the gatekeeper to any change in Nim, and main source of truth on the forum. I just didn’t want to deal with that anymore. I’d rather fight with a borrow-checker.
I’ve seen his comment, yeah. It’s informative and unfortunate.
I’ve honestly been tempted to write a “why not Nim” blog post for a couple years now but never got around to doing so because a) I don’t like spreading negativity, and b) I’d rather not attract publicity to a project whose success I don’t exactly believe in.
Bad opinions aside, I believe Araq’s lack of care for the community is precisely the reason why the project is going in the wrong direction. I’ve heard horror stories from former compiler contributors about how hard to maintain the code is and how much it lacks documentation. No wonder it doesn’t attract very many outside contributions. Had he cared more for having other people work on the language alongside him, maybe things would have turned out different, but alas…
This sort of dictatorship is not the sort of leadership of a project I’d like to invest my time in. I much prefer the Rust RFC process over this.
Woah, I didn’t expect so much negativity in this thread… I was kind of hoping to see some interesting discussions and maybe even some praise for a language that reached its 2.0.0 milestone without the backing of any tech giant.
Sure, the language is probably still not perfect, and at least some of @liquidev’s remarks make sense… but it is a remarkable milestone nonetheless.
I have been using Nim for years mostly on my personal projects (I kinda built my own ecosystem on top of it), and I must say it is fun to use. And it is very well documented. Unfortunately it feels very much a fringe language because it didn’t see massive corporate adoption (yet) but I hope this can change sooner or later.
About Araq: the guy can be rude at times, maybe even borderline unprofessional in some of his replies but he did put a lot of energy into the project over the years, and I am grateful for that. I tend not to get too involved in politics or heated debates… I saw that reply and it struck me as “a bit odd” and definitely not good marketing for the language and the community. I just hope that doesn’t drive too many people away from the language; it would be a pity.
that doesn’t drive too many people away from the language
Well it’s one thing to wish for some features, it’s another to wish for a leadership that doesn’t have a personal mental breakdown in the official forums - attacking a multitude of people - and deletes any critical response. The second one can’t just be ignored.
And if Rust is already struggling with compile times, I wonder how bad this is with something that doesn’t even know about incremental compilation. You can’t just ignore a debugging round-trip time of minutes.
You can ask people to be less negative or strict, but first, don’t forget it’s v2.0; and second, the only alternative to complaining about real production problems is to say nothing and move on.
I’m sorry if my comment came off as rude or overly negative… I don’t mean to ruin the celebration; as a long time user I’m just trying to voice my concerns about the direction the language is taking, and I think it’s important to talk about these rather than keep quiet about them forever and instead create an atmosphere of toxic positivity. 2.0 is considered by many a huge milestone and seeing important issues which put me off from using the language not be addressed in a major version is pretty disappointing.
Perhaps this speaks of our expectations of major versions; I see them as something that should be pretty big, while in real life it’s often just some small but breaking changes. I’m of the firm belief that Nim went 1.0 too early for its own good, because inevitably there will be breaking changes (and with how unstable the compiler can be, there can be breakages even across minor versions.)
I’ll be that person and ask: why don’t you come to D? There is a foundation and the tone is very respectful.
It is a main inspiration for Nim; Araq actually spent many years reading and commenting on the D forums. D pioneered many things that went into Nim, but the core language is very stable and there is no compiler-switch explosion. In many ways D is further along than Nim, with its 3 compilers, and it tolerates internal controversy and, I’d say, sane debate inside its community. I do see a bit of FUD about D on the internet, often a single negative opinion echoed across a majority of the content programmers see. Sometimes I think it’s down to syntax (Python-like vs C-like).
Agree. I also use D and have since… looks at personal repos… 2014 or 2015, but maybe earlier, and started doing some toys in Nim around 2018. What D lacks is buzz. It’s mature, stable, and performant and, at least for me, doesn’t break between upgrades. Some corners of D like CTE and nested templates I find hard to debug (and this is true for other languages, but that’s not a free pass) but they work. I keep finding bits of Nim and core/former-core libraries where that’s not the case and they fail in odd ways, and that’s still true in 2.0.
I actually have a book on D that I got years ago. I’d forgotten about it.
Is the compiler still proprietary?
The DMD backend was the only proprietary bit, and that hasn’t been the case for years.
After seeing the Rust community extensively argue about the gender of philosophers in a silly analogy, I’m glad that Nim has a leader who is explicitly against such bullshit.
As bizarre as that is, Araq’s use of the phrase “clown world” is more indicative of future behaviour than random Rust community members talking about pronouns. Here’s another strange Araq post - I wouldn’t want to support a project with this kind of world view.
Look carefully at the date of the post you linked…
April Fool’s was an opportunity to make a joke, but the content of the so-called joke is all Araq.
Maybe also because that analogy argument was inside one issue, opened specifically to bikeshed it. The other one felt more like a dismissal of anything that isn’t in his view of the world - in a discussion about giving the community a chance to steer the direction of the language.
I’d happily take that over Araq’s bullshit, like when I pointed out that Nim’s null-checking was leading to bogus errors in a bunch of my code (after hours of debugging and creating a reduced test case) he dismissed it with “that’s because the control flow analysis doesn’t notice ‘return’ statements, and you shouldn’t be using return because it isn’t in Pascal.” Despite him having put both features in the language.
All else aside, I think there’s truth in this statement.
With enough sophistry any statement can be considered true.
Oh? I recall similar arguments being used against Jews.
It’s a fairly obvious logic fallacy, which anyone smart enough to be a programmer ought to see through pretty easily. (Hint: if you deny a > b, it does not follow you believe b > a.)
He also appears to have deleted the comments objecting to his post
Although I agree with almost all of your points and came to the same conclusion, I think it’s fair to say that not all critical comments were deleted. There are several in the thread that you linked.
The comments do show that at least one comment was removed. I don’t know if there were more now-removed comments because I read the thread only a while after it was closed.
After trying Nim for a little while some time ago, the module system is what soured me on the language. I don’t like that you can’t tell where a symbol comes from by looking at the source of the current file. It’s “clever” in that you automatically get things like the right $ function for whatever types get imported by your imports, but that’s less valuable than explicitness.
On the contrary, I actually don’t hold much grudge against the import system’s use of the global namespace. Static typing and procedure overloading ensure you get to call the procedure you wanted, and I’ve rarely had any ambiguity problems (and then the compiler gives you an error which you can resolve by explicitly qualifying the symbol.) While coding Rust, I almost never look at where a symbol comes from because I have the IDE for browsing code and can Ctrl+Click on the relevant thing to look at its sources.
My main grudge is that the module system has no concept of a package or main file, which would hugely simplify the logic that’s needed to discover a library’s main source file and run nim check on it. Right now text editors need to employ heuristics to discover which file should be nim check’d, which is arguably not ideal in a world where developers typically intend just a single main file.
I’ve kinda done something like this over the years, only less purposefully. And I thought I’d settled on Nim, but I rage-quit it a few months ago after a particularly egregious “it’s my language and I can put in any footguns I want*” decree by the Benevolent Dictator-For-Life on the forum.
Anyway for now I’ve ended up at Swift. (And yet I keep coding in C++ because Reasons. Sigh.)
I wonder why the Swift version is so slow. Arithmetic is overflow-checked by default, and arrays bounds-checked, but the OP said they turned that off. Maybe using (ref-counted) classes where structs would have sufficed? Or passing arrays around in a way that defeats the COW optimizations?
* BEGIN RANT There’s a newish Nim feature that lets you declare references as non-null, and uses control-flow analysis to prevent you from assigning/passing a null value, at compile time. Awesome! Only, the analysis was failing in lots of my code and complaining a variable might be null, when I had just checked it right above. After I made some reduced test cases and reported the problem, the BDFL told me (brusquely) that the control flow analysis ignores “return” statements. And that this is not a bug and will not be fixed … because you should be assigning to “result” instead of early-returning (praise Wirth!) This despite “return” being a perfectly cromulent part of the language that’s been around for ages. At this point I decided a single BDFL, esp. a cranky German one [I’m from a German family myself, I know the type] is probably a misfeature and I should look elsewhere. END RANT
Do you have a link to the forum discussion?
This must be what he means – but it comes up all the time. There must be a dozen github issues/RFCs/forum threads where it is referenced. Araq is so strongly opinionated about this one that it is surely a “backing” example to his line in the Zen of Nim: if it’s hard for the compiler to reason about, then it’s hard for people too, and so you should not do it.
While I love Nim, I disagree on this line item. I think people can be bad at reasoning in situations where computers can be good, and this can matter to program notation - like closing parentheses in Lisp vs. indentation where humans are just bad at “base-1/unary” after about 3..4 things (hence the 5th diagonal slash after ||||, as another example). Even adamant Lispers will say “Let your editor indent/do that for you!” - a paraphrase of saying “a program is better at counting” - they just recommend a different program from the compiler. ISTM, early return is a similar case (but more in the technical weeds).
That is really discouraging for a language I’ve had a lot of faith in. Thanks for sharing.
I’m sorry to hear that you fell out of love with Nim. I always enjoyed hearing your perspective on the language.
If you want to tinker, this looks to be the Swift source. I reproduced the 1.5 hour running time estimate by commenting out line 364’s call to the verbose printing function output_state_choice. Commit history suggests this was left out during the test. Despite some references to the number of cores in the code, I found it used just one core, though I don’t know how the other implementations behaved. Memory grows steadily for at least the first minute or two, so you could be onto something with copy-on-write behavior.
For me, the Nim version hits ~590% utilization (if 1 core = 100%). I boosted NUM_THREADS from 6 to 16 on a 16-core and that utilization didn’t change. So, making the work partitioning more fine-grained could maybe yield a 10x better time on a 64-core AMD 3990X – depending upon contention, of course. { Famous last words, I know! :-) }
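To make the fine-grained partitioning idea concrete, here is a minimal sketch using D’s std.parallelism (D being the language of the other code in this thread); the array size and the work-unit size of 1_000 are arbitrary values for illustration, not taken from the benchmark:

import std.parallelism, std.range;

void main() {
    auto results = new double[1_000_000];
    // A work unit of 1_000 indices gives the scheduler many small chunks
    // to balance across however many cores are available, instead of a
    // few large fixed slices that can leave cores idle near the end.
    foreach (i; parallel(iota(results.length), 1_000))
        results[i] = i * 0.5;
}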
The beauty of open source is that Nim can be forked.
Having and maintaining your own private language seems like a bad idea. And unless you have a LOT of free time and some very good ideas, trying to attract supporters away from an already-niche language seems like a bad idea, too.
I disagree. If the one or two central people who maintain an open source project are not easy to cooperate with, then it can be very fruitful over time if someone forks it.
Also, forking a project does not necessarily mean that a single maintainer needs to do all the work. Receiving pull requests does not need to be that time consuming.
In addition to this, some forks can be maintenance/“LTS” projects; they don’t have to keep the same pace of development to be useful. Sometimes a few selected patches going in can mean more to users than a continuous stream of features.
You’re welcome to D. The language is awesome. However, such “BDFLs” have teratons of focused complaints to answer, so it’s not necessarily a good idea to escalate the problem on the internet instead of being patient and helping fix it.
You’re dropping the whole language because of one extremely niche feature 99.9% of developers would never stumble on? You know it’s not the language that looks bad in this story right?
I don’t think you understand the feature he’s complaining about correctly, because it seems to me to be very common, as attested by @cblake’s comment that “it comes up all the time”.
There’s cross-talk here. The specific A) strictNotNil feature supporting early return is (likely) a small niche, and B) early return/structure in general is much bigger (& what I meant by occurring regularly, e.g. here). Lest quick/casual readers be confused, early return/break are absolutely accepted {“cromulent” :-) } parts of Nim - not going away. Araq pushed back on hard work for A) he feels other PLs skip while being overwhelmed to get Nim2 in shape (& did not write a misleading doc in dispute according to git).
@snej’s earlier comment (& that Forum thread) indicate he was ok with incompleteness & improving docs. Dropping Nim was more related to a “cranky single German BDFL” - a feature of a community, not a programming language. (I agree “completely and objectively wrong” was needlessly inflammatory rhetoric, but “put in footguns” is also.) Anyway, like @agent281 I am also sorry to see him leave!
These disagreements are often about division of labor & perspective taking, not “objectivity” (esp. given the current state of human psychology as a science). To connect to my prior example, Lispers also complain about offside rules making macro writing “objectively” harder at a cost of “objectively less readable” syntax. Both compiler writers & language users understandably guard their time, driving much emotion.
I honestly never went back to look at that thread after my last reply. I probably won’t.
Maybe I’ll try Nim again sometime. I turned to Swift on the rebound and wrote a few thousand lines over the holidays (not my first waltz with that language) and largely enjoyed it except for the incredibly awkward APIs for working with unsafe constructs. (I wasn’t bridging to C, just marshaling data for a binary network protocol.) Which is to say I’m not totally happy with Swift. I’m still waiting for the perfect language.
It was the straw that broke the camel’s back. And where did you get the idea that null checking or the “return” statement are extremely niche features? Ok, null checking isn’t on by default in Nim, but it’s basically table stakes in most new languages ever since Hoare’s “billion dollar mistake” went viral.
Posits break scale-independence by having numbers near 1.0 carry more precision than numbers further from 1.0.
All floating point formats break scale independence. Fixed point formats solve this problem, but have serious limitations for general computing, due to limited range and loss of precision. Based on the tradeoffs, floating point is justifiably more popular and more widely used.
All floating point formats break scale independence. Fixed point formats solve this problem
I don’t think this is true. I take scale independence to mean that if you scale everything by the same amount, true statements remain true. Floating point clearly doesn’t quite have this, since the relative error depends on one’s proximity to the next power of two, but it’s pretty close.
Meanwhile fixed point doesn’t even make an attempt. I love fixed point, and it does have many virtues, but scale independence ain’t one.
With the exception of denormals, IEEE 754 does have scale independence, though – you always have 53, 64, or 24 (or some such) bits of precision, regardless of magnitude.
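One way to see the contrast concretely is to round values of different magnitudes onto a fixed-point grid and watch the relative error grow as the values shrink. A small D sketch; the grid step of 2^-16 is an arbitrary choice for illustration:

import std.stdio;

void main() {
    enum double q = 1.0 / 65536.0; // fixed-point grid step, 2^-16
    // A double, by contrast, keeps relative rounding error <= 2^-53
    // at every one of these scales.
    foreach (x; [1.0, 1e-3, 1e-6]) {
        double onGrid = cast(long)(x / q) * q; // truncate onto the grid
        writefln("x=%g  fixed-point relative error = %g", x, (x - onGrid) / x);
    }
}

For x = 1.0 the error is tiny, for x = 1e-6 the grid rounds the value all the way to zero: constant absolute error, exploding relative error.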
Hare will not replace anything, just like Rust didn’t replace anything, and just like no language created with the goal of replacing another language has ever replaced anything.
I think the word “replace” here doesn’t necessarily mean complete and total replacement of one thing with another. That has… probably never happened with any language(?)
What I think it means in this context is that someone may choose to use it over some other existing language, so in a way it has “replaced” the older language in that particular instance. In other words, if I am writing a new tool, hare might “replace” C or something else in my list of possible languages to write it in.
I think that’s the right definition, and a newer language might “replace” another in proportion of how many projects pick it instead of the older language. As a concrete example, Zig replaced C in the ncdu project.
The only commercial product I’m aware of that used BCPL is TriPOS, which was ported to the Amiga to become AmigaDOS (the filesystem portion of the operating system).
Once you’re willing to accept a global GC, you open up a much larger space of languages that you might use. .NET languages can be AOT compiled and statically linked with Mono (not sure what the status of this is in the MS implementations; it seems to come and go), and Go supports only static linking.
If you are happy with lower performance, then embedded Lua or JavaScript implementations exist, and with something like Sol3 it’s easy to expose a few performance-critical bits from C++ and write the application logic in Lua.
I had a look at D a while ago and it seemed like a better C++ than C++98, but it wasn’t clear that it was a better C++ than C++11.
I’ve used C++ professionally for six years and D for six too (I’ve known both for the last 12 years…) and there is no contest: D is much simpler, more effective, and has much lower mental friction. I’m struggling to think of the kinds of projects where I’d use C++, perhaps ones with some exotic hardware.
I think that’s the kind of dichotomy that I have a problem with for D. If I’m in a scenario where I need to be able to escape from the type system and write low-level code, for example writing a memory allocator, a scheduler, or a language runtime, I definitely don’t want a language with a global GC. If I don’t have these constraints, I have the option of languages like TypeScript, F# / C#, Go, Lua, and so on. From what I’ve read about D, I don’t think I’d be more productive in D than in TypeScript or C# (especially factoring in the ecosystems of those languages).
Well if you don’t need native, then of course you have more choices. In my field, it’s all native languages.
If I’m in a scenario where I need to be able to escape from the type system and write low-level code, for example writing a memory allocator, a scheduler, or a language runtime
Every once in a while, I take a look again at D for my personal projects. It always comes down to three problems, though:
I deal with quite a bit of XML. The state of std.xml has not changed in years: deprecated for poor quality, but no alternative in the standard library.
I want to use a library for something, and it turns out to be a C++ library. Using C++ libraries is not really doable from within D.
No compatibility promise like Go has, or like C++ provides with the option to target any given version of the ISO standard. My personal projects tend to grow slowly, and I do not want to risk them falling to language incompatibilities all of a sudden.
I worked on a D project 6 years ago in a large company. I loved the language, and enjoyed working with it. I still want to love it. Every time I used it, it felt like a much faster Python, with the power of C++, but without the hassle of C/C++. The whole argument around GC that floated around at that time was valid for games and other low-latency applications, but otherwise was a bit misguided and unfair to D. I felt that positioning it as a C++ alternative didn’t work well (due to GC), when it should have been positioned as a Go/Python/etc. alternative.
In the end, I gave up on D. Why? The ecosystem. The language features and design didn’t make up for the lack of a strong ecosystem. There was just a bit too much missing: libraries (e.g. I had to develop my own FUSE wrapper), editor integration, plus compiler bugs (I had so many dmd crashes…) and library bugs (std.json used to be famously slow), so eventually I gave up and went back to Python and the likes. Nowadays, Rust covers most of what I need, albeit it still feels slower to develop in than D.
std.json is still slow, but std_data_json (https://github.com/dlang-community/std_data_json) is excellent. (It’s a stream parser. Can confirm that it rocks - low GC use, high throughput - though we’re using a fork.) And I like to think that critical compiler bugs have gotten a bit under control, though there’s lots remaining.
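For the curious, usage looks roughly like this - a hedged sketch reconstructed from memory of the project README, so treat the module and function names (stdx.data.json, toJSONValue, parseJSONStream) as assumptions to verify against the repo:

import stdx.data.json; // dub package std_data_json

void main() {
    // One-shot DOM-style parse...
    auto value = toJSONValue(`{"name": "D", "fast": true}`);
    // ...or the pull/stream parser praised above:
    auto nodes = parseJSONStream(`{"xs": [1, 2, 3]}`);
    foreach (node; nodes) {
        // react to object/array/value events without building a DOM
    }
}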
I started with D in late 2006 and it pretty quickly replaced all other programming languages (except when work required something else, which was on-and-off). It does have a big strength in familiarity with the same C-like syntax, but letting you expand in various directions incrementally. I replaced PHP for webdev with D just as easily as I replaced C with it and that’s super cool.
I agree with the author on the skill gap though; the D leadership don’t really know how to manage or market, and I also feel like they just kinda chase trends instead of having their own vision. But this isn’t necessarily a bad thing - if they fail at marketing, that doesn’t affect me since I’m already here :) And chasing trends is frustrating in the short term, but long term it actually does help with its all-purpose character as the essence of the strong ideas survive while the other stuff just kinda gets forgotten. All the while, the traditional core stays pretty strong.
Other people complain about libraries and garbage collection, but I see the GC as a major win (and D’s implementation is flexible enough for those cases where it doesn’t do as well), and I prefer to DIY so I ignore libraries anyway. So both features, not bugs :)
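As a sketch of what “flexible enough” means in practice: D lets you ban GC allocation per function and steer collections per scope. The GC calls below are the real core.memory API; the function name hot is just illustrative:

import core.memory : GC;

@nogc int hot(int x) {
    // the compiler rejects any GC allocation inside a @nogc function
    return x * 2;
}

void main() {
    GC.disable();              // suspend automatic collections for a critical span
    scope (exit) GC.enable();
    auto buf = new int[1024];  // still allocates from the GC heap; no collection triggers
    buf[0] = hot(21);
    GC.collect();              // or run a collection exactly when you choose
}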
Of course, I’ve been using it so long now that I’ve built up my own decent little library system and just instinctively know how to avoid the things that don’t work so well, so your mileage is likely to vary.
I also feel like they just kinda chase trends instead of having their own vision.
This is probably my second biggest gripe with D. Because they don’t have their own vision, or perhaps because they keep changing it, everything in D feels bolted on. Every time I start a function with @safe, or something similar, it feels like a janky afterthought.
My biggest problem with D is the marketing. They need to stop pretending D is going to, is trying to, or even wants to, replace C or C++. I’m sorry Walter, but that ship sailed a decade or more ago. No matter how much people say it, anytime someone who writes C sees that D has GC, they will immediately rule it out. And the number of devs who won’t are negligible. Instead D should focus on what everyone says who tries it. “Wow! This is like a fast python.”
D is a great language. It’s fast, it feels natural to code in as a former VB and C# dev and current Python dev, and the development experience, IMO, is top notch. dub is really nice and puts a lot of languages (looking at you, Python) to shame.
No matter how much people say it, anytime someone who writes C sees that D has GC, they will immediately rule it out.
This does happen of course for decision-makers or high-level people, but D tends to attract “bottom-up” types who like to check facts for themselves (for better or worse). The most common path is that people gradually open up to the GC being a good idea, as discovered by academia decades ago.
I’m not going to argue with your claims, but the size of the D community would suggest the number of “bottom-up” types attracted to D is minimal compared to the other languages stealing mind-share from C/C++.
Personally I care about whether I can do my job efficiently and correctly, have a fast result, a nice community, etc. Does it build fast? Does it spare me useless work day to day? I can verify these kinds of low-level properties.
I can also verify that the community is growing: http://erdani.com/d/downloads.daily.png
So I’m not sure what problem D really has; practice vs theory.
Your post is the exact response D enthusiasts always give, and IMO, it’s the wrong one. To quote myself, no one is going to switch to D after learning the “GC isn’t that bad” or the “GC can be tuned” or the “GC can be turned off”. They see GC and they leave.
D needs to stop fighting with people who care about GC in their language and instead embrace the people who are okay with it. D is never going to convert people from C, C++, or Rust. It has a real chance, in my opinion, of converting people who program in Ruby or Python or Elixir though. Those are the kinds of languages that D could actually compete with and win against in a lot of categories.
Yeah, I’m inclined to agree. It bugs me to see people get the technical facts wrong but the fact is correcting them is just going down a distracted path. It is implicitly ceding that GC was a mistake and you should avoid it and that’s the wrong message.
I like to call D an “all-purpose language”. It can do just about anything you want it to do, some things with more or less work than others, but virtually anything is possible. I’d rather emphasize that fact overall while keeping the focus on core strengths rather than letting online critics change the tone. We’re hurting ourselves by focusing on niche uses for new people. If you’re already a D user and want to apply your beloved language to the new niche, absolutely, let’s get into the details of when, where, and how to tweak/remove the GC or whatever else, I’m sure you’ll be pleased, maybe even elated that you don’t have to switch languages to explore this new area. We get 80% of the benefit with 20% of the pain thanks to still being the same language you know and love.
But if you’re not already a D user and specifically focused on this niche area? You’re not going to see those benefits. You still get the pain of using a new language, add on the pain of using it an esoteric manner so there’s not as much established lore out there to help you… and for all that pain, you only get an 80% solution, when the other languages are promising 99%. As D marketers, we’re just setting everyone up for disappointment.
When I switched to D, the two main things I started with, like I said in the other comment, were actually websites from PHP and little homemade games (and other assorted desktop app stuff) that I happened to write in C++ at the time, but could just as well have been done in just about any other language. (And actually, one of the things that put me over the edge was spending half a day trying to trace down a memory management issue in the C++. Turns out the memory was freed by the library but that wasn’t obvious… and I just felt it was wasted time figuring it out when a GC would have avoided the whole question. C++ has improved since then too, nowadays there’s better techniques to make it more obvious, but still, like why even bother messing with it in most cases?)
So if I was the “D”ictator, I’d probably focus on some relatively boring, mainstream thing and compete against Python like you said, or maybe Java (D started life as basically a Java/C hybrid!), just with the tantalizing possibility of it being the last language you ever need to learn since it will grow and adapt with you as you expand your horizons. That’s the way I see it.
But since I actually just use D every day instead of commenting on the internet (I say, in a comment on the Internet), I’m apparently not the target market. Alas.
What libraries are you using for web development in D? I would love to give it a shot and just dabble around in some REST backends that don’t require much beyond session management and database connection (preferable to Postgres).
What libraries are you using for web development in D?
Wrote it all myself; it lives in here: https://github.com/adamdruppe/arsd. Specifically, the module cgi.d in there does the web serving (it also includes an HTTP server if that’s your thing), then database.d is a generic interface and postgres.d is a libpq implementation of that interface. You can ignore the other files in there if you don’t need them.
cgi.d embeds its own little key/value store you can use for sessions on Linux - I never implemented these add-on features on Windows though (the core system works fine but the included session thing uses a linux-only addon server right now). And I have barely documented this at all since, again, I wrote it myself for myself and I just copy/paste out of my last project to get a new one started.
But it looks something like this
import arsd.cgi; // single-module web framework from the arsd repo

struct MySessionData {
    int userId;
    // and whatever else you want to persist between requests
}

void myRequestHandler(Cgi cgi) {
    auto session = cgi.getSessionObject!MySessionData;
    // now normal assignment saves and loads from the add-on session server
}

mixin GenericMain!myRequestHandler; // simplest way to get it going
The cgi object has request/response methods; see http://dpldocs.info/arsd.cgi. The basic functionality there is actually based on PHP, then there’s far more advanced stuff on top, like having it automatically wrap classes, generate HTML forms from parameters, and produce HTML/JSON from return values. But I’ve documented that even less than the PHP-style basics. One of my public projects uses it: https://github.com/adamdruppe/dupdates/blob/master/main.d#L157
And database is basically all
import arsd.postgres; // the PostgreSql class lives here

auto db = new PostgreSql("dbname=whatever");
foreach (row; db.query("select * from whatever where id = ?", x)) {
    // row["id"] or row[0] works; they are strings, convert to whatever else if needed
}
Again, my goal was to migrate from PHP when I originally wrote this, and I found it good enough, so I’m happy.
Aaaaanyway, like my stuff works for me, but no promises it will work for you especially given my partial (at best) documentation.
The popular D web lib (they have much better marketing and more docs, bigger community around it) is this one: https://vibed.org/ but by the time that came out mine was already working for me for years so I don’t know a whole lot about it.
Curious about why C is considered less harmful than D. My experiences with D were all pretty pleasant. In another thread someone mentioned how D takes a “kitchen sink” approach to programming, which I could see. Then there’s also the fact that it’s unsafe by default. So being a larger unsafe language, maybe one could argue that it’s like C but it gives you more rope?
It is just unsubstantiated nonsense, it doesn’t even try to provide justifications and has a lot of very iffy items.
But if I wanted to defend that position for argument’s sake, I might say C is more of a known quantity than D. Less rope perhaps, but also many of the problems of C are known to a lot of people, and there are processes to mitigate its risks, whereas D has more uncertainty, especially in areas where C has a lot of history, like kernel development. (I know Linus Torvalds prefers plain C, and I think it is because he finds it easier to follow what is going on; less syntax sugar, less potential for runtime surprises.) The hint at the end of the link saying “complexity” is the problem could just as well say that since D is more complex, it is worse.
I don’t believe that myself but it is the most reasonable argument I can think of. (The counter I like to the complexity thing is much of that is inherent to the problem. Simple languages attacking a complex problem still lead to complex code… it isn’t eliminated, it is just shifted.)
I swear I thought you guys were talking about options A and B which are really C and D but had little to do with programming languages until I hit Torvalds.
There is accidental complexity in D - things that didn’t work out and complicate the language - but honestly it’s not too bad, and it all fits in a few blog articles.
I think the author has a small point because (as seen with C++) using a complex tool puts you in the mood to write complex things :) but all in all software exists to make things possible.
This disclosure was made in coordination with Ledger, who explain how the latest firmware can avoid such attacks (two other attacks are described, and mitigated):
One unexpected challenge of CSV is the lack of explicit encoding. I think it’s common knowledge that CSV is often best avoided, but wanted to highlight this painful point.
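To illustrate the pain: the bytes of a CSV file say nothing about their own encoding, so a reader has to guess and transcode before parsing. A minimal D sketch; the Latin-1 guess and the sample data are, of course, assumptions:

import std.csv, std.encoding, std.stdio, std.typecons;

void main() {
    // Bytes off the wire: nothing in them says what the encoding is.
    auto raw = cast(Latin1String)"caf\xe9;3\n"; // 0xE9 is 'é' in Latin-1
    string utf8;
    transcode(raw, utf8); // we *assume* Latin-1; the file cannot tell us
    foreach (rec; csvReader!(Tuple!(string, int))(utf8, ';'))
        writeln(rec[0], " costs ", rec[1]); // prints: café costs 3
}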
The most important reason why .csv is so popular is that you can easily edit the data in Excel, or import data as .csv. About XML, I have not checked yet. Thanks for the point :)
If you are interested about D, I wrote d-idioms as a way to quickstart your understanding, starting with the edge cases :) => https://p0nce.github.io/d-idioms/
I have built my company with this language and it’s picking up steam in my local area. Usually people dismiss D immediately when they first hear of it; I find it nothing short of fascinating, and it has been quite usable for the last ten years.
I’d be very interested in chatting about strategies. My company is pivoting towards Rust, and while that might feel like competition, breaking the monoculture of C is to everyone’s gain in that area. (The same goes for, for example, Ada in the embedded space.)
I’m not sure why D isn’t more popular. My guess is some teething issues back in the day limited its growth:
std lib split with phobos/tango
D1 vs D2 changes
dmd compiler backend license
lack of a build/package manager (e.g. pre-dub)
From what I understand, these have all since been resolved. It will be interesting to see if D usage picks up, or if those early speed-bumps made too much room for other (newer) languages to pass it by (Go, Rust, Swift).
For me, D is known and perceived just as a “better C++” (Alexandrescu is one of the brightest evangelists of D, and his C++ past does not help the language’s image) and I do not want a better C++. I do not want another deeply imperative programming language that can do some functional programming accidentally. I do not want another language born from a culture of obsessive focus on performance at the expense of everything else.
What I want is a ML-ish language for any level of systems programming (sorry, D and Go: having GC is a non-starter) with safety and correctness without excessive rituals and bondage (like in Ada). Rust fits the bill: it’s explicitly not functional, but has strong safety/correctness culture.
Precisely because of the lack of GC and the focus on lifetimes, Rust is much more similar to (modern) C++ than D will ever be. Writing Rust is like writing correct C++.
D, having a GC, leads to different programs (than C++ or Rust) because of this global owner for resources that are only memory; e.g. slices have no ownership information in the type system. This makes scripting very frictionless, at the cost of some more problems with non-memory resources. But not at the cost of speed.
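A small sketch of the slices point: a D slice is just a (pointer, length) view with no owner recorded in its type, because the GC keeps the underlying array alive for any viewer. The function name tail is illustrative:

int[] tail(int[] xs) {
    return xs[1 .. $]; // returning a view is fine; no lifetime annotations needed
}

void main() {
    auto a = [1, 2, 3, 4];
    auto t = tail(a); // in Rust this would need a lifetime tying t to a
    assert(t == [2, 3, 4]);
}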
D has @safe which is machine-checked, opt-in memory safety.
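For concreteness, a minimal sketch of what that opt-in checking does; the function names are illustrative, and the rejected operations are shown commented out:

@safe int sum(const int[] xs) {
    int total = 0;
    foreach (x; xs)
        total += x; // bounds-checked iteration is fine in @safe
    return total;
}

@safe void rejected(int* p) {
    // ++p;                        // error: pointer arithmetic not allowed in @safe code
    // auto q = cast(int*) 0xDEAD; // error: integer-to-pointer cast is not @safe
}

void main() {
    assert(sum([1, 2, 3]) == 6);
}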
Thanks for the clarification, indeed I had a slightly wrong impression about the D programming style and ignored the profound influence of garbage collection on the programming style.
Still, everything that I learn about D irks me (native code with GC by default? metaprogramming with native code, without dynamic eval? opt-in @safe?!) and feels too much like the old C++ culture with its insensible defaults.
This is why Go appealed to so many people isn’t it. This is the “new normal”. (Of course, OCaml etc had this before)
dynamic eval
This is probably a bridge too far if you are appealing to people from a C++ background. (C++ programmers as a target audience is a bad market strategy for D, IMHO.)
Opt-in @safe.
I agree with you on this.
feels too much like the old C++ culture
Well, the two main D architects are old C++ hands after all!
Go – Too limited a language. It has good marketing behind it. Outside of having some libraries that are not available natively on D (or C), there is no good reason to use Go as a D programmer.
Rust – A very strong competitor. Not every program needs Rust’s level of thinking about memory (though some Rust-exclusive features do make it attractive over D).
Swift – Nice enough language. Is still an alien on Linux (Windows, what is even that?). D has first class support on both Linux and Windows.
I was more thinking about folks skipping over D in the past, and how that potentially limited its uptake, than from the perspective of a D programmer looking at the current popular trends. Certainly an interesting perspective though. Thanks for sharing!
Indeed all those points have since been resolved. What hasn’t been resolved is that there isn’t a simple message to give as a marketing motto, since D tends to check all the boxes in many areas.
Also, I must be one of the few people who learned C++ and never learned C. “C/C++” has always irked me, as if the two were easily exchangeable, or worse, the same language.
I’m a system programmer. I’m not sure if front-end is any easier, actually it seems to be more dynamic and fast-paced than I could handle.
Agreed with author, I never use auto anywhere.
Lots of small language-level improvements, but there don’t seem to be that many fundamental compiler architecture issues mentioned in the changelog. I may be missing those since I haven’t used Nim for a long time, but I assume if the compiler was made 10x less janky they would’ve at least had a footnote about it. (For context, I’ve used Nim as my main language in 2019-2020.)
I don’t like that there still isn’t my #1 most awaited feature - incremental compilation. Nim isn’t a very fast language to compile and the slowness really showed with some of my projects. In particular rapid, my game engine, suffered from pretty terrible compile times, which made building games with it really difficult.
And I wonder how much generics jank is still present in 2.0. And I wonder when they’ll make generics into proper generics rather than C++-like templates. The error messages with those can get pretty abysmal and can appear out of nowhere. Like here. This particular cryptic error has appeared so many times while I was developing my game that I just gave up on the project eventually.
In addition to that I’m not a fan of how lame the module system is. Having used/seen module systems from other languages, eg. Rust or Go, Nim’s compiler not having a clue about the concept of packages feels pretty ancient. This particularly shows when you’re developing libraries made out of many small modules; the only way to typecheck such a library in its entirety is to create a
library.nim
file and import everything there. Text editors didn’t seem to understand that back in the 1.4 days and would regularly miss type errors that occurred when analyzing the library as a whole.Oh, and the text editor situation… Nim’s compiler does not have incremental recompilation, so the autocomplete gets slower the larger your project is. And it can get really slow for even pretty small projects (~10k SLOC.)
And don’t get me started on the dialects. Endless
--define
switches andexperimental
features. And none of it is implemented in a robust way. Anyone can break your library by globally enabling a--define
you did not anticipate. And the--define
s are not even documented in one place.So sad to see Nim’s leadership pursuing syntax sugar and small convenience features instead of fixing foundational problems. Really wish they had a more forward-looking vision for the project and its community, rather than focusing on fulfilling the main developer’s wishes and experiments.
The Nim leadership is the main developer, Andreas. He’s not interested in sharing responsibility or broadening the leadership, as he vehemently expressed a month ago:
That was the point where I gave up on Nim. I don’t know where to start with this — it’s laughable that he pulls out that silly fight about master/main as his breaking point; he whines about society changing and it’s their fault he might have to “change old habits”, and that tired canard about racism/sexism. (He also appears to have deleted the comments objecting to his post, though not the supportive ones. Because of course he’s the forum moderator too.)
But my main takeaway is that the guy’s even more of an asshat than I previously thought, and he’s going to remain the gatekeeper to any change in Nim, and main source of truth on the forum. I just didn’t want to deal with that anymore. I’d rather fight with a borrow-checker.
I’ve seen his comment, yeah. It’s informative and unfortunate.
I’ve honestly been tempted to write a “why not Nim” blog post for a couple years now but never got around to doing so because a) I don’t like spreading negativity, and b) I’d rather not attract publicity to a project whose success I don’t exactly believe in.
Bad opinions aside, I believe Araq’s lack of care for the community is precisely the reason why the project is going in the wrong direction. I’ve heard horror stories from former compiler contributors about how hard to maintain the code is and how much it lacks documentation. No wonder it doesn’t attract very many outside contributions. Had he cared more for having other people work on the language alongside him, maybe things would have turned out different, but alas…
This sort of dictatorship is not the sort of leadership of a project I’d like to invest my time in. I much prefer the Rust RFC process over this.
Woah, I didn’t expect so much negativity in this thread… I was kind hoping to see some interesting discussions and maybe even some praises for a language that reached its 2.0.0 milestone without the backing of any tech giant.
Sure, the language is probably still not perfect, and at least some of @liquidev’s remarks make sense… but it is a remarkable milestone nonetheless.
I have been using Nim for years mostly on my personal projects (I kinda built my own ecosystem on top of it), and I must say it is fun to use. And it is very well documented. Unfortunately it feels very much a fringe language because it didn’t see massive corporate adoption (yet) but I hope this can change sooner or later.
About Araq: the guy can be rude at times, maybe even borderline unprofessional in some of his replies but he did put a lot of energy into the project over the years, and I am grateful for that. I tend not to get too involved in politics or heated debates… I saw that reply and it struck me as “a bit odd” and definitely not good marketing for the language and the community. I just hope that doesn’t drive too many people away from the language; it would be a pity.
Well it’s one thing to wish for some features, it’s another to wish for a leadership that doesn’t have a personal mental breakdown in the official forums - attacking a multitude of people - and deletes any critical response. The second one can’t just be ignored.
And if rust is already struggling with compile times, I wonder how bad this is with something that doesn’t even know about incremental compilation. You can’t just ignore a debugging round-trip time of minutes.
You can ask people for being less negative or strict, but first: don’t forget it’s v2.0 and second: the other way of not complaining about real production problems is to say nothing and move on.
I’m sorry if my comment came off as rude or overly negative… I don’t mean to ruin the celebration; as a long time user I’m just trying to voice my concerns about the direction the language is taking, and I think it’s important to talk about these rather than keep quiet about them forever and instead create an atmosphere of toxic positivity. 2.0 is considered by many a huge milestone and seeing important issues which put me off from using the language not be addressed in a major version is pretty disappointing.
Perhaps this speaks of our expectations of major versions; I see them as something that should be pretty big while in real life often it’s just some small but breaking changes. I’m of the firm belief that perhaps Nim went into 1.0 too early for its own good, because inevitably there will be breaking changes (and with how unstable the compiler can be, there can be breakages even across minor versions.)
I’ll be this person and ask you why you don’t come to D? There is a fundation and the tone is very respectful. It is a main inspiration for Nim, actually Araq spent many years reading and commenting on the D forums. D pioneered many things that went into Nim, but the core language is very stable and there is no compiler switch explosion. In many ways D is more further along than Nim with its 3 compilers and it supports internal controversy and I’d say sane debate inside its community. I do see a bit of FUD about D on the internet, often by echo’ing a single negative opinion in a majority of content programmers. Sometimes I think it’s down to syntax (Python-like vs C-like).
Agree. I also use D and have since… looks at personal repos… 2014 or 2015 but maybe earlier and started doing some toys in Nim around 2018. What D lacks is buzz. It’s mature, stable, and performant and, at least for me, doesn’t break between upgrades. Some corners of D like CTE and nested templates I find hard to debug (and this is true for other languages, but that’s not a free pass) but they work. I keep finding bits of Nim and core/former-core libraries where that’s not the case and they fail in odd ways and that’s still true in 2.0.
I actually have a book on D that I got years ago. I’d forgotten about it.
Is the compiler still proprietary?
DMD backend was the only bit proprietary and that’s not the case anymore since years.
After seeing the Rust community extensively argue about the gender of philosophers in a silly analogy, I’m glad that Nim has a leader who is explicitly against such bullshit.
As bizarre as that is, Araq’s use of the phrase “clown world” is more indicative of future behaviour than random Rust community members talking about pronouns. Here’s another strange Araq post - I wouldn’t want to support a project with this kind of world view.
Look carefully at the date of the post you linked…
April Fool’s was an opportunity to make a joke, but the content of the so-called joke is all Araq.
Maybe also because that analogy argument was inside one issue, opened specifically to bikeshed it. The other one felt more like a dismissal of anything that isn’t in his view of the world - in a discussion about giving the community a chance to steer the direction of the language.
I’d happily take that over Araq’s bullshit, like when I pointed out that Nim’s null-checking was leading to bogus errors in a bunch of my code (after hours of debugging and creating a reduced test case) he dismissed it with “that’s because the control flow analysis doesn’t notice ‘return’ statements, and you shouldn’t be using return because it isn’t in Pascal.” Despite him having put both features in the language.
All else aside, I think there’s truth in this statement.
With enough sophistry any statement can be considered true.
Oh? I recall similar arguments being used against Jews.
It’s a fairly obvious logic fallacy, which anyone smart enough to be a programmer ought to see through pretty easily. (Hint: if you deny
a > b
, it does not follow you believeb > a
.)Although I agree with almost all of your points and came to the same conclusion, I think it’s fair to say that not all critical comments were deleted. There are several in the thread that you linked.
The comments do show that at least one comment was removed. I don’t know if there were more now-removed comments because I read the thread only a while after it was closed.
After trying Nim for a little while some time ago, the module system is what soured me on the language. I don’t like that you can’t tell where a symbol comes from by looking at the source of the current file. It’s “clever” in that you automatically get things like the right
$
function for whatever types get imported by your imports, but that’s less valuable than explicitness.On the contrary, I actually don’t hold much grudge against the import system’s use of the global namespace. Static typing and procedure overloading ensures you get to call the procedure you wanted, and I’ve rarely had any ambiguity problems (and then the compiler gives you an error which you can resolve by explicitly qualifying the symbol.) While coding Rust, I almost never look at where a symbol comes from because I have the IDE for browsing code and can Ctrl+Click on the relevant thing to look at its sources.
My main grudge is that the module system has no concept of a package or main file, which would hugely simplify the logic that’s needed to discover a library’s main source file and run
nim check
on it. Right now text editors need to employ heuristics to discover which file should benim check
’d, which is arguably not ideal in a world where developers typically intend just a single main file.I’ve kinda done something like this over the years, only less purposefully. And I thought I’d settled on Nim, but I rage-quit it a few months ago after a particularly egregious “it’s my language and I can put in any footguns I want*” decree by the Benevolent Dictator-For-Life on the forum.
Anyway for now I’ve ended up at Swift. (And yet I keep coding in C++ because Reasons. Sigh.)
I wonder why the Swift version is so slow. Arithmetic is overflow-checked by default, and arrays bounds-checked, but the OP said they turned that off. Maybe using (ref-counted) classes where structs would have sufficed? Or passing arrays around in a way that defeats the COW optimizations?
* BEGIN RANT There’s a newish Nim feature that lets you declare references as non-null, and uses control-flow analysis to prevent you from assigning/passing a null value, at compile time. Awesome! Only, the analysis was failing in lots of my code and complaining a variable might be null, when I had just checked it right above. After I made some reduced test cases and reported the problem, the BDFL told me (brusquely) that the control flow analysis ignores “return” statements. And that this is not a bug and will not be fixed … because you should be assigning to “result” instead of early-returning (praise Wirth!) This despite “return” being a perfectly cromulent part of the language that’s been around for ages. At this point I decided a single BDFL, esp. a cranky German one [I’m from a German family myself, I know the type] is probably a misfeature and I should look elsewhere. END RANT
Do you have a link to the forum discussion?
This must be what he means – but it comes up all the time. There must be a dozen github issues/RFCs/forum threads where it is referenced. Araq is so strongly opinionated about this one that it is surely a “backing” example to his line in the Zen of Nim: If it’s hard for the compiler to reason about then it’s hard for people to and so you should not do it.
While I love Nim, I disagree on this line item. I think people can be bad at reasoning in situations where computers can be good, and this can matter to program notation - like closing parentheses in Lisp vs. indentation where humans are just bad at “base-1/unary” after about 3..4 things (hence the 5th diagonal slash after ||||, as another example). Even adamant Lispers will say “Let your editor indent/do that for you!” - a paraphrase of saying “a program is better at counting” - they just recommend a different program from the compiler. ISTM, early return is a similar case (but more in the technical weeds).
That is really discouraging for a language I’ve had a lot of faith in. Thanks for sharing.
I’m sorry to hear that you fell out of love with Nim. I always enjoyed hearing your perspective on the language.
If you want to tinker, this looks to be the Swift source. I reproduced the 1.5 hour running time estimate by commenting out line 364’s call to verbose printing function
output_state_choice
. Commit history suggests this was left out during the test. Despite some references to the number of cores in code, I found it used just one core, though I don’t know how the other implementations behaved. Memory grows steadily for the at least the first minute or two, so you could be onto something with copy-on-write behavior.For me, the Nim hits ~590% utilization (if 1core=100%). I boosted NUM_THREADS from 6 to 16 on a 16-core and that util didn’t change. So, making the work partitioning more fine-grained could maybe yield a 10x better time on a 64-core AMD 3990X – depending upon contention, of course. { Famous last words, I know! :-) }
The beauty of open source is that Nim can be forked.
Having and maintaining your own private language seems like a bad idea. And unless you have a LOT of free time and some very good ideas, trying to attract supporters away from an already-niche language seems like a bad idea, too.
I disagree. If one or two central people that maintains an open source project are not easy to cooperate with, then it can be very fruitful over time if someone forks it.
Also, forking a project does not necessarily mean that a single maintainer needs to do all the work. Receiving pull requests does not need to be that time consuming.
In addition to this, some forks can be maintenence/“LTS” projects, they don’t have to keep the same pace of development to be useful. Sometimes a few selected patches going in can mean more to users than a continous stream of features.
You’re welcome to try D. The language is awesome. However, such BDFLs have teratons of focused complaints to answer, so it’s not necessarily a good idea to escalate the problem on the internet instead of being patient and helping to fix it.
You’re dropping the whole language because of one extremely niche feature 99.9% of developers would never stumble on? You know it’s not the language that looks bad in this story, right?
I don’t think you understand the feature he’s complaining about correctly, because it seems to me to be very common, as attested by @cblake’s comment that “it comes up all the time”.
There’s cross-talk here. The specific A) strictNotNil feature supporting early return is (likely) a small niche, while B) early return/structure in general is much bigger (& what I meant by occurring regularly, e.g. here). Lest quick/casual readers be confused, early return/break are absolutely accepted {“cromulent” :-) } parts of Nim - not going away. Araq pushed back on hard work for A) that he feels other PLs skip, while being overwhelmed getting Nim 2 into shape (& did not write a misleading doc in the dispute, according to git).
@snej’s earlier comment (& that Forum thread) indicates he was ok with incompleteness & improving docs. Dropping Nim was more related to a “cranky single German BDFL” - a feature of a community, not a programming language. (I agree “completely and objectively wrong” was needlessly inflammatory rhetoric, but so is “put in footguns”.) Anyway, like @agent281 I am also sorry to see him leave!
These disagreements are often about division of labor & perspective taking, not “objectivity” (esp. given the current state of human psychology as a science). To connect to my prior example, Lispers also complain about offside rules making macro writing “objectively” harder at a cost of “objectively less readable” syntax. Both compiler writers & language users understandably guard their time, driving much emotion.
I honestly never went back to look at that thread after my last reply. I probably won’t.
Maybe I’ll try Nim again sometime. I turned to Swift on the rebound and wrote a few thousand lines over the holidays (not my first waltz with that language) and largely enjoyed it except for the incredibly awkward APIs for working with unsafe constructs. (I wasn’t bridging to C, just marshaling data for a binary network protocol.) Which is to say I’m not totally happy with Swift. I’m still waiting for the perfect language.
It was the straw that broke the camel’s back. And where did you get the idea that null checking or the “return” statement are extremely niche features? Ok, null checking isn’t on by default in Nim, but it’s basically table stakes in most new languages ever since Hoare’s “billion dollar mistake” went viral.
Posits break scale independence by having numbers near 1.0 carry more precision than numbers further from 1.0.
All floating point formats break scale independence. Fixed point formats solve this problem, but have serious limitations for general computing, due to limited range and loss of precision. Based on the tradeoffs, floating point is justifiably more popular and more widely used.
I don’t think this is true. I take scale independence to mean that if you scale everything by the same amount, true statements remain true. Floating point clearly doesn’t quite have this, since the relative error depends on one’s proximity to the next power of two, but it’s pretty close.
Meanwhile fixed point doesn’t even make an attempt. I love fixed point, and it does have many virtues, but scale independence ain’t one.
Why do you think IEEE 754 has scale independence? Obviously it doesn’t.
With the exception of denormals, IEEE 754 does have scale independence though: you always have 53, 64, or 24 (or some such) bits of significand precision, regardless of magnitude.
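To make that concrete, here is a small illustrative D sketch (my own toy example; the 32.32 fixed-point format is hypothetical): the relative step between adjacent doubles stays roughly constant across scales, which is exactly what a fixed-point grid loses.

    import std.stdio;
    import std.math : nextUp;

    void main()
    {
        // ulp(x)/x stays around 2^-53..2^-52 for doubles, whatever the scale
        foreach (x; [1.0e-10, 1.0, 1.0e10])
            writefln("x = %g  relative step = %g", x, (nextUp(x) - x) / x);

        // A hypothetical 32.32 fixed-point format steps by 2^-32 everywhere,
        // so near x = 1e-10 the grid spacing (~2.3e-10) exceeds the value
        // itself: essentially no precision is left at small magnitudes.
        enum double fixedStep = 1.0 / 4_294_967_296.0; // 2^-32
        writefln("fixed-point relative step at 1e-10 = %g", fixedStep / 1.0e-10);
    }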
Hare will not replace anything, just as Rust didn’t replace anything, and just as no other language created with the goal of replacing another language has ever replaced anything.
I think the word “replace” here doesn’t necessarily mean complete and total replacement of one thing with another. That has… probably never happened with any language(?)
What I think it means in this context is that someone may choose to use it over some other existing language, so in a way it has “replaced” the older language in that particular instance. In other words, if I am writing a new tool, Hare might “replace” C or something else in my list of possible languages to write it in.
I think that’s the right definition, and a newer language might “replace” another in proportion of how many projects pick it instead of the older language. As a concrete example, Zig replaced C in the ncdu project.
I don’t think replacing C/C++ was a goal of Rust. From the original presentation (http://venge.net/graydon/talks/intro-talk-2.pdf):
C has replaced BCPL
C++ largely replaced Pascal.
BCPL was never widely used.
The only commercial product I’m aware of that used BCPL is TriPOS, which was ported to the Amiga to become AmigaDOS (the filesystem portion of the operating system).
Is the 2010 “D Programming Language” book from Alexandrescu still largely relevant, or is it now mostly obsolete?
Consider https://p0nce.github.io/d-idioms/#Which-book-should-I-read?
Once you’re willing to accept a global GC, you open up a much larger space of languages that you might use. .NET languages can be AOT compiled and statically linked with Mono (not sure what the status of this is in the MS implementations; it seems to come and go), while Go supports only static linking.
If you are happy with lower performance then embedded Lua or JavaScript implementations exist and with something like Sol3 it’s easy to expose a few performance-critical bits from C++ and write the application logic in Lua.
I had a look at D a while ago and it seems like it’s a better C++ than C++98 but it wasn’t clear that it was a better C++ than C++11.
I’ve used C++ professionally for six years and D for six too (I’ve known both for the last 12 years...), and there is no contest: D is much simpler, more effective, and has much lower mental friction. I’m struggling to think of the kinds of projects where I’d use C++, perhaps ones with some exotic hardware.
I think that’s the kind of dichotomy that I have a problem with for D. If I’m in a scenario where I need to be able to escape from the type system and write low-level code, for example writing a memory allocator, a scheduler, or a language runtime, I definitely don’t want a language with a global GC. If I don’t have these constraints, I have the option of languages like TypeScript, F# / C#, Go, Lua, and so on. From what I’ve read about D, I don’t think I’d be more productive in D than in TypeScript or C# (especially factoring in the ecosystems of those languages).
Well if you don’t need native, then of course you have more choices. In my field, it’s all native languages.
D can do all of those.
There is also AOT compilation for Java.
Both gccgo and the Go compiler from Google support dynamic linking, and several other build modes.
Every once in a while, I take a look at D again for my personal projects. However, it always comes down to three problems:
I deal with quite a bit of XML. The state of std.xml has not changed in years: deprecated for poor quality, but no alternative in the standard library.
I want to use a library for something, and it turns out to be a C++ library. Using C++ libraries is not really doable from within D.
No compatibility promise like Go has, or like C++ provides with the option to force any given version of the ISO standard. My personal projects tend to grow slowly, and I do not want to risk them falling to language incompatibilities all of a sudden.
So, I continue using C++.
For XML, my recommendation is to use arsd.dom: https://p0nce.github.io/d-idioms/#DIID-#1---Parse-XML-file
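Something like this minimal, untested sketch (the file name and element names here are made up; see the d-idioms link for the real walkthrough):

    import std.file : readText;
    import std.stdio;
    import arsd.dom;

    void main()
    {
        // XmlDocument parses in strict, case-sensitive XML mode
        auto doc = new XmlDocument(readText("config.xml"));

        // CSS-style selectors work on the resulting tree
        foreach (node; doc.querySelectorAll("server > host"))
            writeln(node.innerText);
    }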
I worked on a D project 6 years ago in a large company. I loved the language, and enjoyed working with it. I still want to love it. Every time I used it, it felt like a much faster Python, with the power of C++, but without the hassle of C/C++. The whole argument around the GC that floated around at that time was valid for games and other low-latency applications, but otherwise was a bit misguided and unfair to D. I felt that positioning itself as a C++ alternative didn’t work well (due to the GC), when it should have been positioned as a Go/Python/etc. alternative.
In the end, I gave up on D. Why? The ecosystem. The language features and design didn’t make up for the lack of a strong ecosystem. There was just a bit too much missing: libraries (e.g. I had to develop my own FUSE wrapper) and editor integration, plus compiler bugs (I had so many dmd crashes…) and library bugs (std.json used to be famously slow). Eventually I gave up and went back to Python and the like. Nowadays Rust covers most of what I need, although it still feels slower to develop in than D.
std.json is still slow, but std_data_json (https://github.com/dlang-community/std_data_json) is excellent. (It’s a stream parser. I can confirm that it rocks: low GC use, high throughput, though we’re using a fork.) And I like to think that critical compiler bugs have gotten a bit under control, though lots remain.
I think the ecosystem has improved in 6 years.
I started with D in late 2006 and it pretty quickly replaced all other programming languages for me (except when work required something else, which was on-and-off). It does have a big strength in familiarity, with the same C-like syntax, while letting you expand in various directions incrementally. I replaced PHP for webdev with D just as easily as I replaced C with it, and that’s super cool.
I agree with the author on the skill gap though; the D leadership don’t really know how to manage or market, and I also feel like they just kinda chase trends instead of having their own vision. But this isn’t necessarily a bad thing - if they fail at marketing, that doesn’t affect me since I’m already here :) And chasing trends is frustrating in the short term, but long term it actually does help with the language’s all-purpose character, as the essence of the strong ideas survives while the other stuff just kinda gets forgotten. All the while, the traditional core stays pretty strong.
Other people complain about libraries and garbage collection, but I see the GC as a major win (and D’s implementation is flexible enough for those cases where it doesn’t do as well), and I prefer to DIY so I ignore libraries anyway. So both features, not bugs :)
Of course, I’ve been using it so long now that I’ve built up my own decent little library system and just instinctively know how to avoid the things that don’t work so well, so your mileage is likely to vary.
This is probably my second biggest gripe with D. Because they don’t have their own vision, or perhaps because they keep changing it, everything in D feels bolted on. Every time I start a function with @safe, or something similar, it feels like a janky afterthought.
My biggest problem with D is the marketing. They need to stop pretending D is going to, is trying to, or even wants to replace C or C++. I’m sorry Walter, but that ship sailed a decade or more ago. No matter how much people say it, anytime someone who writes C sees that D has a GC, they will immediately rule it out. And the number of devs who won’t is negligible. Instead D should focus on what everyone who tries it says: “Wow! This is like a fast Python.”
D is a great language. It’s fast, it feels natural to code in as a former VB and C# dev and current Python dev, and the development experience, IMO, is top notch. dub is really nice and puts a lot of languages (looking at you, Python) to shame.
This does happen of course for decision-makers or high-level people, but D tends to attract “bottom-up” types who like to check facts for themselves (for better or worse). The most common path is that people gradually open up to the GC being a good idea, as discovered by academia decades ago.
I’m not going to argue with your claims, but the size of the D community would suggest the number of “bottom-up” types attracted to D is minimal compared to other languages stealing mind-share from C/C++.
Yes, but this is still talking about popularity.
Personally I care about whether I can do my job efficiently and correctly, have a fast result, a nice community, etc. Does it build fast? Does it give me useless work day to day? I can verify these kinds of low-level properties. I can also verify that the community is growing: http://erdani.com/d/downloads.daily.png So I’m not sure what problem D really has. Practice vs. theory.
D has lots of options when it comes to memory, including the @nogc annotation.
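For the curious, a minimal sketch of what @nogc checks (my own toy example; the function is made up):

    // @nogc functions are statically checked to never allocate from the GC
    @nogc nothrow
    int sum(const(int)[] xs)
    {
        int total = 0;
        foreach (x; xs)
            total += x;
        return total;
    }

    void main()
    {
        int[3] buf = [1, 2, 3]; // fixed-size array on the stack, no GC involved
        auto s = sum(buf[]);    // slicing it allocates nothing
        // inside @nogc code, `new int[3]` or `[1, 2, 3].dup` would not compile
    }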
I mean this in the nicest way possible, but…
Your post is the exact response D enthusiasts always give, and IMO it’s the wrong one. To quote myself, no one is going to switch to D after learning that the “GC isn’t that bad”, or that the “GC can be tuned”, or that the “GC can be turned off”. They see GC and they leave.
D needs to stop fighting with people who care about the GC in their language and instead embrace the people who are okay with it. D is never going to convert people from C, C++, or Rust. It has a real chance, in my opinion, of converting people who program in Ruby or Python or Elixir though. Those are the kinds of languages that D could actually compete with and beat in a lot of categories.
Yeah, I’m inclined to agree. It bugs me to see people get the technical facts wrong, but the fact is that correcting them just leads down a distracting path. It implicitly cedes that the GC was a mistake and should be avoided, and that’s the wrong message.
I like to call D an “all-purpose language”. It can do just about anything you want it to do, some things with more or less work than others, but virtually anything is possible. I’d rather emphasize that fact overall while keeping the focus on core strengths rather than letting online critics change the tone. We’re hurting ourselves by focusing on niche uses for new people. If you’re already a D user and want to apply your beloved language to the new niche, absolutely, let’s get into the details of when, where, and how to tweak/remove the GC or whatever else, I’m sure you’ll be pleased, maybe even elated that you don’t have to switch languages to explore this new area. We get 80% of the benefit with 20% of the pain thanks to still being the same language you know and love.
But if you’re not already a D user and are specifically focused on this niche area? You’re not going to see those benefits. You still get the pain of using a new language, plus the pain of using it in an esoteric manner, so there’s not as much established lore out there to help you… and for all that pain, you only get an 80% solution when the other languages are promising 99%. As D marketers, we’re just setting everyone up for disappointment.
When I switched to D, the two main things I started with, like I said in the other comment, were actually websites (from PHP) and little homemade games (and other assorted desktop app stuff) that I happened to write in C++ at the time, but could just as well have been done in just about any other language. (And actually, one of the things that put me over the edge was spending half a day trying to trace down a memory management issue in the C++. It turned out the memory was freed by the library, but that wasn’t obvious… and I just felt it was wasted time figuring it out when a GC would have avoided the whole question. C++ has improved since then too; nowadays there are better techniques to make it more obvious, but still, why even bother messing with it in most cases?)
So if I was the “D”ictator, I’d probably focus on some relatively boring, mainstream thing and compete against Python like you said, or maybe Java (D started life as basically a Java/C hybrid!), just with the tantalizing possibility of it being the last language you ever need to learn since it will grow and adapt with you as you expand your horizons. That’s the way I see it.
But since I actually just use D every day instead of commenting on the internet (I say, in a comment on the Internet), I’m apparently not the target market. Alas.
What libraries are you using for web development in D? I would love to give it a shot and just dabble around in some REST backends that don’t require much beyond session management and a database connection (preferably Postgres).
Wrote it all myself; it lives in here: https://github.com/adamdruppe/arsd Specifically, the module cgi.d in there does the web serving (it also includes an http server if that’s your thing), then database.d is a generic interface and postgres.d is a libpq implementation of that interface. You can ignore the other files in there if you don’t need them.
cgi.d embeds its own little key/value store you can use for sessions on Linux. I never implemented these add-on features on Windows though (the core system works fine, but the included session thing uses a Linux-only add-on server right now). And I have barely documented this at all since, again, I wrote it myself for myself, and I just copy/paste out of my last project to get a new one started.
But it looks something like this:
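(Roughly; a minimal hello-world sketch, where the handler name and response text are just placeholders:)

    import arsd.cgi;

    // the handler receives a Cgi object carrying the request data
    // and the response methods
    void hello(Cgi cgi)
    {
        cgi.setResponseContentType("text/plain");
        cgi.write("Hello from cgi.d!", true); // true = this is the complete response
    }

    // generates a main() that can run as CGI, SCGI, or the embedded http server
    mixin GenericMain!hello;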
The cgi object has request/response methods, see: http://dpldocs.info/arsd.cgi The basic functionality there is actually based on PHP; then there’s far more advanced stuff on top, like having it automatically wrap classes, generate HTML forms from parameters, and produce html/json from return values. But I’ve documented that even less than the PHP-style basics. One of my public projects uses it: https://github.com/adamdruppe/dupdates/blob/master/main.d#L157
And the database side is basically all like this:
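(Another rough, untested sketch; the connection string, table, and columns are made up:)

    import std.stdio;
    import arsd.postgres;

    void main()
    {
        // PostgreSql implements the generic database.d interface over libpq
        auto db = new PostgreSql("dbname=test");

        // query() takes ? placeholders; rows are indexable by column name
        foreach (row; db.query("SELECT id, name FROM users WHERE id > ?", 10))
            writeln(row["id"], ": ", row["name"]);
    }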
Again, my goal was to migrate from PHP when I originally wrote this, and I found it good enough, so I’m happy.
Aaaaanyway, like, my stuff works for me, but no promises it will work for you, especially given my partial (at best) documentation.
The popular D web lib (they have much better marketing and more docs, bigger community around it) is this one: https://vibed.org/ but by the time that came out mine was already working for me for years so I don’t know a whole lot about it.
Curious about why C is considered less harmful than D. My experiences with D were all pretty pleasant. In another thread someone mentioned how D takes a “kitchen sink” approach to programming, which I could see. Then there’s also the fact that it’s unsafe by default. So being a larger unsafe language, maybe one could argue that it’s like C but it gives you more rope?
It is just unsubstantiated nonsense; it doesn’t even try to provide justifications, and it has a lot of very iffy items.
But if I wanted to defend that position for argument’s sake, I might say C is more of a known quantity than D. Less rope perhaps, but also many of the problems of C are known to a lot of people and there are processes to mitigate its risk, whereas D has more uncertainty, especially in some areas where C has a lot of history, like kernel development. (I know Linus Torvalds prefers plain C, and I think it is because he finds it easier to follow what is going on; less syntax sugar, less potential for runtime surprises.) The hint at the end of the link saying “complexity” is the problem could just be read as: since D is more complex, it is worse.
I don’t believe that myself, but it is the most reasonable argument I can think of. (The counter I like to the complexity thing is that much of it is inherent to the problem. Simple languages attacking a complex problem still lead to complex code… the complexity isn’t eliminated, it is just shifted.)
I swear I thought you guys were talking about options A and B which are really C and D but had little to do with programming languages until I hit Torvalds.
I think it’s because Pike designed Go and the author of the article was a big fan of Pike.
The point of view of Walter Bright (D’s creator) has always been this: https://forum.dlang.org/post/m19mc6$1798$1@digitalmars.com namely, that investment in a language is a one-time investment (vs. reading code).
There is accidental complexity in D, things that didn’t work out and complicate the language, but honestly it’s not too bad and it all fits in a few blog articles.
I think the author has a small point because (as seen with C++) using a complex tool puts you in the mood to write complex things :) but all in all software exists to make things possible.
In B2C it can help to have the same system as your users.
This WIP article explains the USP of D very well:
http://erdani.com/hopl2020-draft.pdf (section 3)
This article highlights the value of “design-by-introspection” applied to a particular problem space: https://blog.thecybershadow.net/2014/03/21/functional-image-…
This talk presents a concrete example of such: https://www.youtube.com/watch?v=LIb3L4vKZ7U (mind-bending stuff, but it feels quite natural too)
This disclosure was made in coordination with Ledger, who talk about how the latest firmware can avoid such attacks (two other attacks are described, and mitigated):
https://www.ledger.fr/2018/03/20/firmware-1-4-deep-dive-security-fixes/
Personally I’m very satisfied with Discord: the client is lightweight, the history is infinite, and it’s free.
I think all the gaming stuff will be a major barrier keeping businesses from trying Discord.
webshit is webshit; it’s just slack with a cringier aesthetic
One unexpected challenge of CSV is the lack of an explicit encoding. I think it’s common knowledge that CSV is often best avoided, but I wanted to highlight this painful point.
You are right. Do you know any alternatives?
XML if you can?
The most important reason why .csv is so popular is that you can easily edit the data in Excel or import data from .csv. About XML, I have not checked yet. Thanks for the tip :)
If you are interested in D, I wrote d-idioms as a way to quickstart your understanding, starting with the edge cases :) => https://p0nce.github.io/d-idioms/
I have built my company with this language, and it’s picking up steam in my local area. Usually people dismiss D immediately when they first hear of it; I find it nothing short of fascinating, and it has been quite usable for the last ten years.
I’d be very interested in chatting about strategies. My company is pivoting towards Rust, and while that might feel like competition, breaking the monoculture of C is to everyone’s gain in that area. (The same goes for, for example, Ada in the embedded space.)
(I’m based in Berlin)
Strategies? What do you mean?
Are you in Germany? For some reason, it seems that Germany is the only place where D has really picked up.
Not very far from Germany :)
I’m not sure why D isn’t more popular. My guess is some teething issues back in the day limited its growth:
From what I understand, these have all since been resolved. It will be interesting to see if D usage picks up, or if those early speed-bumps made too much room for other (newer) languages to pass it by (Go, Rust, Swift).
For me, D is known and perceived just as a “better C++” (Alexandrescu is one of the brightest evangelists of D, and his C++ past does not help the language’s image), and I do not want a better C++. I do not want another deeply imperative programming language that can do some functional programming accidentally. I do not want another language born from a culture of obsessive focus on performance at the expense of everything else.
What I want is an ML-ish language for any level of systems programming (sorry, D and Go: having a GC is a non-starter) with safety and correctness without excessive rituals and bondage (as in Ada). Rust fits the bill: it’s explicitly not functional, but it has a strong safety/correctness culture.
Precisely because of the lack of a GC and the focus on lifetimes, Rust is much more similar to (modern) C++ than D will ever be. Writing Rust is like writing correctly written C++.
D, having a GC, leads to different programs (than C++ or Rust) because there is a global owner for the resources that are only memory. E.g., slices carry no ownership information in the type system. This makes scripting very frictionless, at the cost of some more problems with non-memory resources, but not at the cost of speed.
D has @safe, which is machine-checked, opt-in memory safety.
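A tiny illustration (my own toy example) of what the checker accepts and rejects:

    @safe int deref(const(int)* p)
    {
        // return *(p + 1); // error: pointer arithmetic not allowed in @safe code
        return *p;          // a plain dereference is fine
    }

    @system int* bump(int* p)
    {
        return p + 1;       // @system code opts out of the checks
    }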
Thanks for the clarification; indeed, I had a slightly wrong impression of the D programming style and ignored the profound influence of garbage collection on it.
Still, everything that I learn about D irks me (native code with GC by default? metaprogramming with native code, but without a dynamic eval? opt-in @safe?!) and feels too much like the old C++ culture with its insensible defaults.
This is why Go appealed to so many people, isn’t it. This is the “new normal”. (Of course, OCaml etc. had this before.)
This is probably a bridge too far if you are appealing to people from a C++ background (C++ programmers as a target audience is a bad marketing strategy for D, IMHO).
I agree with you on this.
Well, the two main D architects are old C++ hands after all!
I think I addressed the current biggest obstacles to D adoption at the very top. I encounter them often when I try to excitedly discuss D with anyone.
I agree with all those points.
I was thinking more about folks skipping over D in the past, and how that potentially limited its uptake, than about the perspective of a D programmer looking at current popular trends. Certainly an interesting perspective though. Thanks for sharing!
Indeed all those points have since been resolved. What hasn’t been resolved is that there isn’t a simple message to give as a marketing motto, since D tends to check all the boxes in many areas.
I think it should go back to its roots, which is why I happen to like it: “D: the C++ you always wanted.”
But nowadays there are many more people who were never exposed to C++ in the first place.
Very true!
Also, I must be one of the few people who learned C++ and never learned C. “C/C++” has always irked me, as if the two were easily exchangeable, or worse, the same language.
Working towards the release of my first game on Steam (PC/Mac): https://www.youtube.com/watch?v=cBwMHSlFagg