1. 38

  2. [Comment removed by author]

    1. 14

      JavaScript, Haskell, and even Rust also have a bunch of these ‘features’ that need to be learnt. It’s just the nature of the beast, nothing specific to C.

      1. 5

        What are some examples from Haskell and Rust?

        1. 8

          Using Haskell for even moderately complex systems usually requires you to use (and learn) several language extensions that are GHC-specific and can add complexity to the language. It’s not uncommon to see a file with 6-10 language extensions.

          This isn’t necessarily a bad thing. The core language has had a conservative evolution and most of the extensions that you’ll actually use are safe and well-understood. It gives the programmer the ability to customize the language, which is neat. It’s not beginner-friendly, though. This isn’t a major problem for intermediate or advanced Haskell programmers, but it puts people off the language, especially if no one tells them that they can use :set -XLanguageExtension in ghci to bring the language extension in and explore its effects.

          Rust, like C++, is going to seem impossibly baroque if you’re learning it because you have to (i.e. because you were put on a project that uses it) and don’t understand the reasons why certain decisions were made. It makes explicit a lot of the rules that are implicit in sound C++, and those just take time to learn. If you get into it because you heard that it was like Haskell, you’re going to be disappointed, because it’s designed to be much lower-level.

      2. 5

        Yep. C’s just from a different era, where there was much less of a gap between the designer and the user. Stuff like this was par for the course – software and computers in general were more arcane and it was just sort of an accepted fact of life.

        1. 6

          I dug out C’s history in detail. The design specifics of C were done the way they were mostly because (a) the authors liked BCPL, which forced the programmer to micro-manage everything; and (b) they didn’t think their weak hardware could do anything better, and it occasionally dictated things. BCPL was actually made due to (b), too. It wasn’t about design or arcana so much as (a) and (b). It then got too popular and fast-moving to redo everything, as people wanted to add stuff instead of rewriting and fixing stuff.

          Revisionist history followed to make nicer justifications for continuing to use an approach that was really chosen for personal and economic reasons on ancient hardware. It’s that simple.

          1. 6

            I would argue, however, that if C had not been such a strong fit for a certain (rather low, compared to what most of us do) level of abstraction, it wouldn’t have been successful. If C had been less micromanage-y, then the lingua franca for low/mid-level system programming would be some other language from the thousands that we’ve never heard of. Maybe it would be better than C, and maybe not; it’s hard to say.

            1. 6

              Modula-2 and Edison were both safer and were done on the same machine. They were easier to parse and easy enough to compile. Just two examples from people who cared about safety in language design.


              Modula-2 was designed for their custom Lilith machine:


              These developments led to the Oberon language family and operating system:


              Also note that LISP 1.5 and Smalltalk-80 were ultra-powerful languages operating on weak hardware. I’m not saying they had to go with them so much as the design space for safety vs efficiency vs power tradeoffs was huge with everyone finding something better than BCPL except the pair that preferred it. ;)

              EDIT: Check your Lobsters messages as I put something better in the inbox.

        2. 4

          C was less designed than organically grown over the past 40 years. Even if it were removed, you’re going to need to learn it to be able to read C.

          Once you learn this, it’s not that big of a deal.

          1. 2

            I think that not having a better macro syntax built into the language is just a byproduct of the fact that C fills a niche between usability and control. Speaking generally, if one were to standardize too many of these ‘shortcuts’, C might become more usable, but it might also become more bloated and infringe upon access to low-level control. I think people want access to some low-level features without being forced to use assembly.

            I’m not necessarily saying that this applies to do {...} while (0) (because IMO C should offer a better way to do this), but I think there’s a need to recognize a slippery slope of making higher level/black box things part of a language geared towards granular control.
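            For anyone unfamiliar with the idiom, here’s a minimal, self-contained sketch (macro names are made up for illustration) of why multi-statement macros get wrapped in do {...} while (0):

            ```c
            #include <stdio.h>

            /* Naive multi-statement macro: it expands to three statements,
               so an unbraced `if` only guards the first one. */
            #define SWAP_BAD(a, b) tmp = (a); (a) = (b); (b) = tmp

            /* Wrapping the body in do { ... } while (0) turns it into a single
               statement that also swallows the trailing semicolon cleanly. */
            #define SWAP_OK(a, b) do { tmp = (a); (a) = (b); (b) = tmp; } while (0)

            int main(void) {
                int x = 1, y = 2, tmp = 0;

                if (0)
                    SWAP_BAD(x, y);  /* the last two statements run unconditionally */
                printf("bad: x=%d y=%d\n", x, y);

                x = 1; y = 2;
                if (0)
                    SWAP_OK(x, y);   /* nothing runs, as intended */
                printf("ok: x=%d y=%d\n", x, y);

                return 0;
            }
            ```

            A bare { ... } block in the macro body would fix the `if` case above, but `if (...) SWAP(x, y); else ...` would then fail to parse because of the semicolon after the closing brace; the do-while form handles both.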

            1. 4

              | I think that not having a better macro syntax built into the language is just a byproduct of the fact that C fills a niche between usability and control.

              The designers had a PDP-11 with tiny memory/CPU, optimized for space/performance, preferred tedious BCPL, and didn’t believe in high-level languages or features like code as data. That combination, plus maybe backward compatibility, led to the preprocessor hack. It was really that simple. What you’re posting is revisionist, although probably unintentionally so.

              1. [Comment removed by author]

                1. 2

                  I noticed all the people doing the more secure stuff intentionally went with a PDP-11/45. The difference must have been significant. In any case, they could’ve still done basic type-checking and such on the other one. My main counterpoint was that they could have done Modula-2-style safety by default, with checks turned off where necessary on a per-module, per-function, or per-app basis. All sorts of competing language designers did this. It’s hard to tell what would’ve been obvious in the past, but it seems to me they could’ve seen it and just didn’t care. Personal preference.

                  1. [Comment removed by author]

                    1. 2

                      Thanks for the details! I think you’re right about using the earlier model to boost credibility. My memory is unreliable, so I can’t remember specifically, but I know I read something along those lines in one of the historical papers.

              2. 2

                Not really; it was more bolted on from some things that were floating around Bell Labs at the time. The original language designers had little to do with it.

                To quote dmr:

                | Many other changes occurred around 1972-3, but the most important was the introduction of the preprocessor, partly at the urging of Alan Snyder [Snyder 74], but also in recognition of the utility of the file-inclusion mechanisms available in BCPL and PL/I. Its original version was exceedingly simple, and provided only included files and simple string replacements: #include and #define of parameterless macros. Soon thereafter, it was extended, mostly by Mike Lesk and then by John Reiser, to incorporate macros with arguments and conditional compilation. The preprocessor was originally considered an optional adjunct to the language itself. Indeed, for some years, it was not even invoked unless the source program contained a special signal at its beginning. This attitude persisted, and explains both the incomplete integration of the syntax of the preprocessor with the rest of the language and the imprecision of its description in early reference manuals.
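                As a rough illustration (the example itself is mine, not from the quote), the stages dmr describes correspond to preprocessor features still in everyday C: file inclusion, parameterless #define, macros with arguments, and conditional compilation:

                ```c
                #include <stdio.h>  /* file inclusion: the original feature */

                /* Parameterless macro: a simple string replacement. */
                #define BUFSIZE 512

                /* Macro with arguments: one of the later extensions. */
                #define MAX(a, b) ((a) > (b) ? (a) : (b))

                /* Conditional compilation: also a later extension. */
                #define DEBUG 0

                int main(void) {
                    printf("%d\n", MAX(BUFSIZE, 1024));
                #if DEBUG
                    printf("debug build\n");
                #endif
                    return 0;
                }
                ```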

            2. 7

              Like everything else, this was fixed in the non-ANSI C version that comes with Plan 9. Unfortunately, it didn’t catch on. Plan 9 doesn’t use a preprocessor: only a few very basic #directives, like #include, are supported, and they are implemented in the compiler itself.

              Plan 9 also fixed the awful C standard library and came up with something actually nice to use.

              1. 5

                This is one of my biggest grievances with C. Actually I really like C, but with a bit of tweaking the language could be so much more beautiful. But it just won’t evolve in that direction when backwards compatibility is king. And new standard versions aren’t really cleaning it up, just adding more stuff, and not necessarily stuff I agree with.

                Same problem with POSIX and other standards. We’ve got a bunch of baggage that got in there because it was popular somewhere, not necessarily because it was good.

                After seeing Plan 9’s dial(3) & co, I’ve cursed Berkeley sockets every time I’ve had to use them.