What kind of bothers me about this series of back-and-forths is that so many of them are about syntax stuff and not about semantics. C has a lot of semantics we shouldn’t copy; the syntax is a less interesting problem, IMO.
I agree this seems pretty petty. For me the reason to be militantly anti-C is that the current culture of unsafe languages for high-performance work is what is giving governments and organized crime the keys to every single one of our servers, cars, toasters, web browsers, voting machines, etc. We have an obligation as engineers to build things that are not going to turn into weapons against the people who trust us to build the infrastructure they depend on. Governments are starting to push their limits and nation states are dramatically escalating their malicious use of our machines. Exploits don’t get fixed; they get sold, hoarded, and used against activists.

It’s reprehensible that we are standing by with our hands in the air saying “I can write safe C, aren’t I macho?” while our infrastructure is being weaponized at a rapidly increasing rate. The security industry is about stunt hacking, feeling like you’re an early-90s hacker, and churning out snake oil as fast as possible. I can’t wait for the eventual fruits of the Cyber Grand Challenge and languages like Rust to drive that vacuous industry out of the technology space, after it has abused our paranoia for so much blood money with vanishingly few real contributions to our security. They don’t make technology; they make magic totems against boogeymen that don’t work and can’t work.

As engineers we significantly shape the armatures upon which politics has to operate, and we’re making the world a shittier place by not fighting for memory safety in high-performance, widely deployed infrastructure.
To be fair, the author of the original post is mostly concerned with language usability and ergonomics, so syntax actually is of some importance.
This is not to say that you’re wrong; C sins badly in both domains. But I don’t think a focus on syntax is inappropriate, given Eevee’s concerns.
The author of the original post has been thinking about making a language, in my understanding, and so musing about syntax is very useful.
the syntax is a less interesting problem
I want to agree with this on principle, but in practice a bad syntax can make the editing experience suck. I’m pretty fed up with haskell-mode’s poor indentation support, and I hate how in Python you can’t copy/paste code without fiddling with the indentation. And in every non-Lispy language I miss structural editing à la paredit.
I think as long as the syntax is LL(1) the differences are mostly personal preference.
I’ve always thought we should leave the syntax flexible on a per-user basis while standardizing the semantics. There could be a default syntax that people can understand in general. The main idea, though, is that the syntax is a skin on the semantics, like the skins on media players, browsers, etc. They’d be pluggable in editors, so the language looks however you want it to no matter whose library you use. Some semantics might even be converted when a simple equivalent exists; recursion to iteration is potentially one of those.
So, syntax macros?
It can be that simple. Otherwise IDE support implemented however they choose.
I think there’s a cultural division here that invokes these “boy who cried wolf” headlines and the blowing out of meaningful, well-founded nitpicks, and it needs to be approached head-on. I also don’t think it’s as simple as many parties involved would put it, as “old school vs. new school”, or even one side being more enlightened or knowledgeable than the other.
One of the biggest issues I consistently find, putting myself on the new-age end of the argument, is that things that are error-prone for beginners can very often become error-prone for anyone, given a lapse in judgement. Take for example the writer’s response to a comment saying that having nearly identical postfix and prefix increment/decrement operators likely causes more off-by-one errors than anything:
I was sure someone was going to pick a quarrel about that? The only thing I can suggest you to do is to actually go program in C for some years, write some good software, and you will see what I mean.
To me, this elitism is a double-edged blade that manages to cut the dogmatic at each extreme.
On one hand, there are needlessly difficult syntaxes and semantics that impede education in a language. Time wasted dealing with these tirelessly defended quirks is time that could instead be spent turning a junior developer into someone who understands the systems underneath (allocation and memory management, interacting with the processor in a less indirect way). And when such a language is presented side by side with tremendously easier paths, it will simply be ignored, and with it the semantics the programmer would otherwise have learned by using it.
On the other hand, complex, obtuse, and obscure semantics and syntaxes are huge costs for experienced developers too. Unless they are fairly strict, or not easily abused or mistaken, they create bugs down the road when a mistake inevitably leaks through a project’s merge process; at best they are a massive drag on those developers’ productivity. As such, they should be very carefully considered, not simply justified by convenience.
Two things people on Twitter pointed out:
a) It’s a bit ironic that a blog themed “H2CO3’s tech rants” calls for an end to bashing something ☺️
b) From the comments: “The only thing I can suggest you to do is to actually go program in C for some years, write some good software, and you will see what I mean.”
Also, I think it’s unfair to class the original post as bashing: it was an interesting tour through C, with a lot of appreciation for how the language came to be what it is and why that makes it a bad model to copy.
This said a lot of what I wanted to say before. The whitespace vs. braces thing really got to me - I’ve always lived in a world where whitespace doesn’t matter beyond splitting up tokens. To suggest merging together the “useful for humans” element with the “useful for computers” element just makes me incredibly uneasy.
I realize there’s a difference there, but the issue is related. It’s not like using newlines as a statement ender didn’t exist before C; it did. They purposely chose semicolons instead because they offer distinct advantages.
Yes, a semicolon is a clear delineator, and if you want to make a quick-and-dirty lexer that doesn’t care about whitespace, while keeping life as simple as possible for the implementer, it’s a great choice that makes sense. But arguing that it’s better to have that sort of delineator than to handle newlines or other methods seems naive to me.