1. 102
  1.  

    1. 33

      There’s no building things up from basic concepts, just piling feature atop feature until you have an endless pile of one-off special cases.

      I disagree completely with this assessment. In my experience, Ada is very well designed specifically in that it takes the big, monolithic features of other languages and breaks them down into their constituent parts, so you can choose which portions of those features you want. As a concrete example, I never truly understood object-oriented programming until I learned Ada, which breaks object-oriented programming down into the separate features of

      • encapsulation,
      • reuse,
      • inheritance, and
      • dynamic dispatch.

      In Ada, you can opt into each of those things separately, depending on what you need “object-oriented programming” for. This is in big contrast to Java, where you type the keyword “class” and then all of that comes along for the ride.
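
      To make that opt-in nature concrete, here is a minimal sketch (a hypothetical Counters package, Ada 2012): a private type gives you encapsulation on its own, and nothing else comes along for the ride until you ask for it.

      package Counters is
         --  Encapsulation only: callers cannot see or touch the record fields.
         type Counter is private;
         procedure Increment (C : in out Counter);
         function Value (C : Counter) return Natural;
         --  Writing "type Counter is tagged private;" would additionally opt in
         --  to inheritance, and a Counter'Class parameter would opt in to
         --  dynamic dispatch.
      private
         type Counter is record
            Count : Natural := 0;
         end record;
      end Counters;

      package body Counters is
         procedure Increment (C : in out Counter) is
         begin
            C.Count := C.Count + 1;
         end Increment;

         function Value (C : Counter) return Natural is (C.Count);
      end Counters;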

      I much prefer Ada’s fine-grained control over which language features you use, and it’s a bit sad to see someone look at it and think of it as a random hodge-podge of features. No. They are designed incredibly well to fit together and provide a whole that is much more than the sum of its parts.

      I think that much like Modula-2 there could be a fairly useful, pleasant language somewhere in there waiting to be discovered.

      Indeed, I think that useful, pleasant language is Ada, once one takes the time to discover and become familiar with it.

      1. 26

        Ada really came out unfairly poorly here. Two points:

        1. “Basic type system: 6.5/10”: Is this a joke, given that the author gives Rust a 10/10? By comparison, Rust is still in the stone age, allowing only explicit-bit-width integers (8, 16, 32, etc.) rather than ranges. On top of ranges, Ada has type predicates (static and dynamic) that leave you with practically no limits on what a type definition can express (see the sketch after this list).
        2. “Spatial safety: 1/10”/“Temporal safety: 3/10”: This is very unfair, given Ada allows you to prove so many things at compile-time, even more so when you’re using SPARK, the safe subset. Program correctness doesn’t end with correct memory management, and from what I can tell Rust doesn’t offer much beyond that, whereas Ada does.
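
        To illustrate point 1, here is a minimal sketch (hypothetical Celsius types) of a ranged integer plus a predicate, neither of which a fixed-bit-width integer can express directly:

        --  The compiler picks a representation large enough for the range;
        --  out-of-range values are rejected at compile time where possible,
        --  and raise Constraint_Error at run time otherwise.
        type Celsius is range -273 .. 1_000;

        --  A predicate narrows the set of valid values further than any
        --  bit width could.
        subtype Even_Celsius is Celsius
          with Dynamic_Predicate => Even_Celsius mod 2 = 0;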

        I do agree with the author’s criticism of Ada’s packages/standard library, though, but projects like Alire (Cargo for Ada, basically) bring a breath of fresh air here.

        In the end, though, I see this survey as an opinion piece, which has merits on its own.

        1. 8

          Program correctness doesn’t end with correct memory management, and from what I can tell Rust doesn’t offer much beyond that, whereas Ada does.

          People use Rust’s type system to enforce all sorts of correctness, so this isn’t accurate. The standard library prevents data races. Fuchsia’s netstack3 uses the type system to prevent mutex deadlocks. Encoding state machines into the type system, so that only correct operations and transitions can be performed for any given state, is supremely common. Can you do that sort of stuff in Ada?

          1. 12

            Yeah, and even more. You can even prove the absence of data races and runtime errors at compile time, given Ada/SPARK has a very strong contracts system and prover. While Rust has many things going for it, there is just no contest in regard to the type system.

            Not only can you specify integer types with explicit ranges (which I’ve learned to love when programming VHDL, which is heavily inspired by Ada, as it prevents a lot of bugs following the “make illegal states unrepresentable” principle), you can also specify arbitrary static and dynamic type predicates. My favourite example is that you can define a type that holds a prime number as

            type Prime is new Positive with
              Dynamic_Predicate => (for all Divisor in 2 .. Prime / 2 => Prime mod Divisor /= 0);
            

            To give another example, with function contracts you embed pre- and postconditions and side effects in the function’s declaration, allowing strict proofs despite opaque interfaces. If you implement a stack, and a pop() function in particular, you usually first check whether the stack is empty. In Ada, however, you can simply define a precondition expecting a non-empty stack. Going further, you can then actually prove that this precondition is never violated. Once that proof has been made, all those checks can be removed, making it theoretically faster than other languages that perform the precondition checks regardless.
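
            A minimal sketch of that stack (a hypothetical Stacks package; the actual proof that callers respect the precondition would be done with SPARK’s tools):

            package Stacks is
               type Stack is private;

               function Is_Empty (S : Stack) return Boolean;
               function Is_Full  (S : Stack) return Boolean;

               procedure Push (S : in out Stack; Value : Integer)
                 with Pre => not Is_Full (S);

               --  The emptiness check becomes the caller's obligation; no
               --  defensive "if empty then ..." inside Pop itself.
               procedure Pop (S : in out Stack; Value : out Integer)
                 with Pre => not Is_Empty (S);
            private
               type Int_Array is array (1 .. 16) of Integer;
               type Stack is record
                  Data : Int_Array := (others => 0);
                  Top  : Natural   := 0;
               end record;
            end Stacks;

            package body Stacks is
               function Is_Empty (S : Stack) return Boolean is (S.Top = 0);
               function Is_Full  (S : Stack) return Boolean is (S.Top = S.Data'Last);

               procedure Push (S : in out Stack; Value : Integer) is
               begin
                  S.Top := S.Top + 1;
                  S.Data (S.Top) := Value;
               end Push;

               procedure Pop (S : in out Stack; Value : out Integer) is
               begin
                  Value := S.Data (S.Top);
                  S.Top := S.Top - 1;
               end Pop;
            end Stacks;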

            If you ask me, this is unprecedented for any systems language, and in my opinion Rust would be pretty much perfect with a stronger type and contract system. I get where you are coming from, but I really think critique is due here towards Rust, and I think such a type system should be part of any modern programming language.

            1. 9

              Do you have any concrete examples? I tried finding examples of statically preventing race conditions and deadlocks in Ada or SPARK, but Google failed me. All I found were references to Protected Objects (which I gather use locks to prevent concurrent access) and one LWN comment that claimed that SPARK is not capable of preventing deadlocks. Oh, and a couple of people who said that it’s easy to never have issues if you just never use pointers :)

              (SPARK, of course, has adopted borrow checking from Rust to make working with pointers safer, though I think it unfortunately didn’t make it into Ada itself?)

              If you ask me, this is unprecedented for any systems language, and in my opinion Rust would be pretty much perfect with a stronger type and contract system. I get where you are coming from, but I really think critique is due here towards Rust, and I think such a type system should be part of any modern programming language.

              The good news is that a lot of people are working on formal verification for Rust. There’s no agreed-upon syntax for contracts yet, but one is being created, so hopefully the dozen or so formal verification tools currently in development will all support it once it’s ready. The current work to formally verify the Rust standard library will hopefully accelerate adoption of contracts and formal verification in the Rust community.

              My favourite example is that you can define a type that holds a prime number as

              You’ve given this example in the past, and honestly I don’t really like it. With such a potentially slow predicate, you’re probably going to turn the checking off in production, and you’re probably not proving statically that the predicate will always hold.

              1. 5

                I wrote a post a while back comparing Ada/SPARK to ATS that you might find useful. It uses an example from an Ada book using preconditions. While ATS isn’t Rust, it does show Ada’s imperative approach alongside a typed ML-style language.

          2. 6

            For #2, are… are you looking at the right language?

            1. 5

              You are completely right, what a stupid mistake of mine. Then I retract #2, sorry for the confusion.

              1. 2

                All good, thanks. :-)

          3. 20

            Clarifying some Ada here; if you want a language breakdown from a non-Adaist (?) perspective, there’s a talk on that.

            some of the pointer lifetime shenanigans described below probably require a nontrivial amount of bookkeeping though, and will result in more restrictive designs to get around the limitations on pointer lifetimes and aliasing.

            This is done at compile time, is called “accessibility”, and is informally known as the “Heart of Darkness.” Ada has nominally typed “pointers” whose reach is bounded by their lexical scope, since you can declare new pointer types in packages, in functions or procedures, and inline in a declare block.
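
            A small sketch (hypothetical names) of that check in action; GNAT rejects the marked assignment at compile time because X’s scope is more deeply nested than the access type’s:

            procedure Accessibility_Demo is
               type Int_Access is access all Integer;  --  declared at Demo's level
               P : Int_Access;
            begin
               declare
                  X : aliased Integer := 42;  --  lives only inside this block
               begin
                  P := X'Access;  --  compile-time error: X is more deeply
                                  --  nested than Int_Access ("accessibility")
               end;
            end Accessibility_Demo;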

            Along with structs there’s “aggregates”, which are more or less tuples

            Aggregates are a syntax for assigning to structures, like a C designated initializer; Ada has no tuples. You can hook aggregate syntax up to initialize your own custom containers and such with the Aggregate aspect.
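
            For example (a hypothetical Point record):

            type Point is record
               X, Y : Float;
            end record;

            Origin : constant Point := (X => 0.0, Y => 0.0);  --  named, like a C designated initializer
            Corner : constant Point := (1.0, 2.0);            --  positional form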

            procedures and functions are different things

            Procedure calls are statements and function calls are expressions. This prevents inadvertently discarding return values – C++ has had the whole [[nodiscard]] struggle trying to deal with this problem.
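
            A small sketch (hypothetical names) of the consequence: a function result simply cannot be dropped on the floor, because a function call is not a statement.

            procedure Demo is
               function Compute return Integer is (42);
               procedure Log (Value : Integer) is null;
               X : Integer;
            begin
               Log (Compute);  --  fine: the result is consumed
               X := Compute;   --  fine: the result is stored
               --  Compute;    --  would not compile: a function call is not a statement
               Log (X);
            end Demo;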

            There’s no building things up from basic concepts, just piling feature atop feature until you have an endless pile of one-off special cases

            I very much disagree with this. There are types (really subtypes, but waves hands), subprograms (functions/procedures), packages, tasks, and protected objects. Aspects and generics layer on top of that, though the syntax does get a bit broad across all the forms these can take.
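
            To pick one of those building blocks, a protected object (a hypothetical Shared_Counter here) gives you compiler-managed mutual exclusion without touching raw locks:

            protected type Shared_Counter is
               procedure Increment;            --  exclusive (read-write) access
               function Value return Natural;  --  concurrent read-only access
            private
               Count : Natural := 0;
            end Shared_Counter;

            protected body Shared_Counter is
               procedure Increment is
               begin
                  Count := Count + 1;
               end Increment;

               function Value return Natural is
               begin
                  return Count;
               end Value;
            end Shared_Counter;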

            In fact you don’t define integers with particular sizes, you define them by ranges and let the compiler decide how big they need to be

            The range of something is usually more important, but you can force an integer to be a specific size by declaring it with a bit width via the Size aspect (e.g. Size => 16), or by creating a new semantic type from a sized integer type in the Interfaces package:

            type Bar is range 1 .. 10 with Size => 64;
            type My_Integer is new Interfaces.Integer_64;
            

            there’s basically no type inference

            I know I’m probably one of the few people who consider this a feature. Hindley-Milner makes it super easy to write and refactor code, but for readability it’s nice to have the types there. Unless I’m using a plugin while writing Rust, C++, or F#, I find it difficult to understand what’s going on when types are omitted (here’s looking at you, auto&).

            There’s a notion of “definite” vs “indefinite” types, which appear to be types whose size is vs. is not known at compile time; the latter can only be touched via pointers.

            You can use indefinite types without pointers. There’s a bunch of hand-waving where you can return dynamically sized arrays and use these types on the stack, provided the size is available when the variable is assigned. GNAT uses a secondary stack to do a bunch of heavy lifting where you’d otherwise have to heap-allocate to return something with an unknown size.
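
            A sketch (hypothetical Repeat function) of using an indefinite type, String, with no access types involved:

            with Ada.Text_IO;

            procedure Indefinite_Demo is
               --  String is an indefinite (unconstrained) array type.
               function Repeat (C : Character; N : Natural) return String is
                  Result : constant String (1 .. N) := (others => C);  --  size known only at run time
               begin
                  return Result;  --  GNAT hands this back via its secondary stack
               end Repeat;

               S : constant String := Repeat ('x', 8);  --  no heap allocation in user code
            begin
               Ada.Text_IO.Put_Line (S);
            end Indefinite_Demo;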

            Pointers/access types can usually only point to dynamic memory

            There’s a separation between “plain” access types, which point to heap-allocated memory, and access all types, which can also point to the stack. This helps prevent accidentally holding the address of something on the stack.

            Freeing a single pointer is possible, but it modifies the metadata attached to the access type so a double-free doesn’t corrupt the heap.

            The standard is fuzzy about whether the language uses fat pointers or not.

            Freeing an access type (“pointer-like”) sets it to null automatically, which mirrors a common C/C++ pattern. It doesn’t prevent use-after-free if you’re holding a second access value to the same thing. A lot of “can this be deleted?” gets avoided since you have to explicitly instantiate a generic function (Ada.Unchecked_Deallocation) to do the free.
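
            The explicit instantiation looks roughly like this (hypothetical Node type); note that the freed access variable is null afterwards:

            with Ada.Unchecked_Deallocation;

            procedure Free_Demo is
               type Node is record
                  Value : Integer;
               end record;
               type Node_Access is access Node;

               --  A free operation must be instantiated per access type.
               procedure Free is new Ada.Unchecked_Deallocation (Node, Node_Access);

               P : Node_Access := new Node'(Value => 1);
            begin
               Free (P);  --  deallocates and sets P to null
               pragma Assert (P = null);
            end Free_Demo;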

            Aliasing where multiple pointers point to the same thing is also mostly not allowed.

            This is not true.

            This is basically what OO inheritance or Rust traits is for

            You can do dynamic dispatching by declaring a parameter type with 'Class, but otherwise you don’t get dispatching behavior. Like many other things in Ada, 'Class here doesn’t mean class as in OOP; it’s a set of types closed under derivation, where “derivation” also means something slightly different in Ada than in OOP (sigh). Other than that, generic packages and functions are probably the closest thing to existential types.
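
            A small sketch (a hypothetical Shapes package) of the difference: only the class-wide parameter dispatches.

            package Shapes is
               type Shape is abstract tagged null record;
               function Area (S : Shape) return Float is abstract;

               type Circle is new Shape with record
                  Radius : Float;
               end record;
               overriding function Area (C : Circle) return Float;

               procedure Describe (S : Shape'Class);  --  class-wide: calls inside dispatch
               procedure Report   (C : Circle);       --  specific type: statically bound
            end Shapes;

            with Ada.Text_IO;
            package body Shapes is
               overriding function Area (C : Circle) return Float is
                 (3.14159 * C.Radius ** 2);

               procedure Describe (S : Shape'Class) is
                  A : constant Float := Area (S);  --  dispatches on S's run-time type
               begin
                  Ada.Text_IO.Put_Line ("Area =" & Float'Image (A));
               end Describe;

               procedure Report (C : Circle) is
                  A : constant Float := Area (C);  --  resolved at compile time
               begin
                  Ada.Text_IO.Put_Line ("Circle area =" & Float'Image (A));
               end Report;
            end Shapes;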

            “controlled types” which loooook like C++-style RAII, complete with objects and copy constructors

            Yes, controlled types are C++ RAII. There are no actual copy constructors, but you can prevent objects from being copied by making them limited.

            You can write interface files by hand but it seems like you usually don’t have to, perhaps unless you want to declare private vs public functions and types? Oh, public types might also require interface files. Forward declarations are necessary if you need mutual recursion, but mutual recursion is pretty rare in general. There’s more to explore but it’s frankly so tedious and complicated that I have a hard time wanting to.

            There’s a declaration for procedures/functions and packages, and then a body definition. In GNAT a .ads file is like a .h file, and a .adb file is like a .cpp file, but since this is at the language level, there isn’t a preprocessor stitching things together. This assists separate compilation: unless the specification is changed, you don’t need to recompile the other parts of the program that use your code, though you do have to re-link.

            You can write interface files by hand but it seems like you usually don’t have to,

            If your program has more than a single entry body, you need to write a spec for each package you make.

            Figuring out how the heck to put multiple functions into the same file is unironically difficult.

            --  example.ads
            package Example is
                procedure Foo;
                procedure Bar;
            end Example;
            
            --  example.adb
            package body Example is
                --  empty procedures as example
                procedure Foo is begin
                    null;  --  we have to have a statement
                end Foo;
            
                -- shorthand for an empty procedure
                procedure Bar is null;
            end Example;
            

            I have a really hard time imagining myself getting good at it and thinking “boy this really DOES make reading code so much easier”

            It took me less than a week or two from knowing none of the language to reading the standard library and a bunch of code samples from other people.

            you need to jump around a lot to actually figure out what the hell anything

            Pretty much you use package specs like mini-docs while you’re working.

            The amount of work it takes to arrange and declare various functions and types probably makes refactoring hell

            Quite the opposite. All declaration sections operate the same way, so you can move functions and types around the code super easily. You also don’t need to rewrite your functions or procedures if you turn a struct into a class-like (tagged) type, for example, since they’re all declared like procedure (Self : My_Class; P1 : Param).

            pointers default to non-aliasing, so you can declare a pointer type with an annotation as aliasing

            Not sure what is intended here. The aliased keyword is to indicate a thing on the stack can be pointed to.

            1. 6

              Thank you for your very elaborate comment clarifying these points so well! I couldn’t have written it down nearly as well.

            2. 17

              One aspect I tend to over-optimize for is “will this language still be around (and popular) in 10+ years?” But based on how infrequently I actually reuse (or even build) my old code, this might be foolish.

              Having said that, other than C and C++, which languages will stick around and continue to be popular, cross-platform?

              Rust seems to have achieved critical mindshare and Zig’s tooling makes me optimistic about its future. Any others?

              1. 33

                Ada will still be around, and cross-platform…and probably about as popular as it is today. 😅

                1. 4

                  Ada seems like a niche language these days (maybe due to perceptions of it being old and difficult to pick up?), but that niche is apparently much bigger than I realized! It’s even in the top 25 on TIOBE. Thanks!

                  For what it’s worth, when I sampled Ada, I struggled to find approachable documentation.

                  1. 9

                    “Programming in Ada 2012” by John Barnes is really good.

                    1. 9

                      TIOBE is one of the worst language rankings; its methodology leads to high volatility and absurd results.

                      Better sources are the github PRs (Ada 194th), github committers (Ada unlisted), SO survey (Ada 40th), JB survey (Ada unlisted), or /DATA report (Ada unlisted).

                      To be fair, the github data is biased toward open source (probably underestimating Ada), and SO/JB respondents are self-selecting (hard to say which direction that bias pulls). But any of those gives a more reliable picture than TIOBE/Pypl.

                      1. 1

                        Another good metric I like is the Stackoverflow Data Explorer, where you can query all of the questions asked about Ada. There’s no more than a couple hundred questions a year.

                        1. 3

                          Another: languish

                          btw hi! I still haven’t tried your beginner chocolate recipe but I think about it every month

                          1. 1

                            Obviously it is because Ada programmers already know everything.

                        2. 4

                          Ada+SPARK did seem pretty cool when I checked it out! I wish the Ada -> C compiler (‘GNAT Pro Common Code Generator’) were freely available. My main use case for this stuff is Wasm on the web and calling into libraries like Raylib and whatnot, and compiling to C is great for that.

                          1. 2

                            Ada and Raylib? Are you doing gamedev in Ada? If so (and even if not), is there anything you can share?

                            1. 6

                              The Twitch streamer “tsoding” developed a game with Raylib in Ada earlier this year. There’s a long thread on the Ada forum about it: https://forum.ada-lang.io/t/making-a-game-in-ada-with-raylib/704

                              1. 3

                                Oh, I was making the point that I wasn’t using Ada because of not having (access to) C codegen. What I do have to share is my own C hacks that I use for codegen (to get generic types and static reflection) and something like basic borrow checks / linear types (this still has a couple of soundness bugs, I gotta work on it more). But yeah, I just meant to say that Ada would be great for this for me if the toolchain was compatible with the platform(s) I want to run it on.

                                Video of the game – has procedurally-generated bosses.

                                Regarding Ada documentation – https://learn.adacore.com/courses/intro-to-ada/index.html and the “Building High Integrity Applications with SPARK” book seemed decent?

                              2. 2

                                GNAT is free and part of GCC. It’s in the Debian and NixOS repos, and probably the repos for most other Linux distros. It can link with other object files created by gcc, and can use normal shared libraries pretty easily. The only reason I can think of to compile Ada to C first would be to support platforms not already supported by GCC.

                                The GNAT section of the GCC manual has a short overview of mixing Ada with other languages.

                                SPARK is really focused on hard real-time and embedded systems, and even if the compiler were free, you probably wouldn’t want to use it to write desktop or web software. It’d be along the lines of using MISRA C for those use cases - you can do it, but it’s probably just making your life more difficult than it needs to be. AdaCore even considers it a different language entirely.

                              3. 4

                                In college I wrote most of my assignments in Ada, when I had a choice of language. The only code I still have is a ray tracer I wrote for my graphics class. It uses simple space partitioning (the “grid method”) to speed up the intersection tests, but it’s not multi-threaded or parallel at all, though it easily could be.

                                I wrote my compilers class project in Ada too, and I’d love to find that code, but I think it’s lost to time.

                                I remember “Ada Distilled” being really good, and there’s now an Ada 2005 version.

                                A few people have linked to Ada Core already, but adaic.org also has good resources.

                                The only Ada resource I’ve read recently was Concurrent and Real-Time Programming in Ada and it’s pretty good, but it doesn’t cover the basics of the language.

                                1. 2

                                  I struggled to find approachable documentation.

                                  https://learn.adacore.com/ is probably the best free stuff out there right now, but yeah there’s a lot of doc work to be done.

                                  1. 2

                                    On Arch Linux specifically, Ada has a chronic problem of the tooling being extremely difficult to keep up to date due to circular self-hosted dependencies in the compiler toolchain. If you’re on something like Nix you’ll probably find it much easier to get started.

                                    1. [Comment removed by author]

                                  2. 15

                                    Rust will still have to prove itself. There are many examples of languages that had a lot of attention and even industry mindshare, but ended up in irrelevance 10-15 years later (Ruby, Objective-C, Perl/Raku) for different reasons.

                                    A key aspect, I think, is the complexity of the infrastructure, language stability and availability of at least two reference implementations. These are weak points with Rust and might (!) kill it in the long run.

                                    1. 32

                                      I wouldn’t say that Ruby ended up in irrelevance.

                                      But of particular note, Rust is absolutely novel in being the first memory-safe, non-managed language, and it has taken the industry by storm. The only bigger wave in PL popularity I know of came from Java, which made tracing GCs popular / basically the default choice in the first place. These are technological advancements; most other languages are usually not novel from this perspective.

                                      1. 14

                                        Far from it, the site we are writing on is written in Ruby :-) There are also a bazillion jobs in Ruby, companies built on it, and active development on the interpreter

                                      2. 20

                                        Rust is already proven. It’s been nearly 10 years since the v1.0.0 release, and it’s doing better than ever. It’s still growing exponentially. Has major industry backing. NSA and DARPA prefer it over C and C++.

                                        There are languages more popular than Rust that have only one implementation, or only one that matters (Rust has a second minor one — mrustc). It’s possible that there will never be another fully-featured Rust front-end, but the costs of not having one are already known and largely mitigated. The GCC back-end doesn’t need another front-end implementation. There is a safety-certified Rust compiler, and even support for some niche/retro architectures.

                                        There are also upsides to not following C/C++‘s path: Rust isn’t slowed down by ISO’s bureaucracy, and doesn’t get dragged down by an uncooperative vendor.

                                        1. 2

                                          These are very good examples, thank you! It is still no guarantee, though, and Java is a good example. Languages can be industry-defining for 10-20 years and then just completely fall out of relevance (except in corporate contexts where legacy codebases are maintained much longer).

                                          Still, thanks for compiling these examples. I will refer back to them when needed.

                                          1. 24

                                            How do you measure relevance?

                                            Java is the primary application language of Android, the most used OS in the world. It’s generally one of the most used languages, ahead of Kotlin, Scala, Swift and Go.

                                            It’s not cool, and it’s not what Sun has hyped it to be, but IMHO still an incredibly successful language. It’s not the most exciting or innovative language, but neither is C.

                                            1. 3

                                              I thought Kotlin was the primary application language of Android?

                                              1. 3

                                                It’s the official recommended approach, and the new Compose UI pretty much requires kotlin (e.g. the @Composable annotation is like a macro, and requires kotlin source code).

                                                But the official APIs and a significant chunk of the Android ecosystem are still Java, or at least the bastard son of the real OpenJDK Java that lags behind way too much, unfortunately. Google really hasn’t been a good steward of the language, and there is almost a rift between dependencies that only use the tiny Android standard-lib subset vs the much larger OpenJDK ecosystem.

                                            2. 17

                                              With all due respect, you could have hardly chosen a worse example than this one.

                                              Java is more alive and active than ever, it’s one of the top 3 languages based on any ranking worth their salt, and has had significant improvements and investments over the last 5-10 years (pattern matching, algebraic data types, virtual threads, just to name a few).

                                              Basically every FAANG/top-100 company has a significant portion, if not all, of its business-critical backend systems written in Java, and guest languages haven’t made a significant dent in its market share, even among greenfield projects. Apple’s backends, the whole Amazon cloud, Alibaba, and a significant chunk of Google, just to name a few, are all huge Java shops.

                                              The sibling comment mentioned Android, but even with it being a relatively big market, it’s still tiny compared to the giant that is the “regular” java ecosystem.

                                              It’s quite easy to get sucked into our favourite ecosystem’s bubble, and from within, objects may look bigger than they actually are.

                                              1. 7

                                                I think anyone would be overjoyed if something they made nearly 30 years ago were as “irrelevant” as Java is today.

                                        2. 16

                                          There’s nothing super groundbreaking [about rust modules], it just pretty much works.

                                          I disagree :P Rust doesn’t have a single global shared namespace, and that is a pretty big-brained idea! When a crate participates in a compilation graph, the crate doesn’t have a name. The name is a property of the dependency edge between two crates (so the same thing can be known under different names to different users).

                                          This in turn unlocks the killer feature of “you can link foo 1.2.3 and foo 2.0.0 together”, without which we simply wouldn’t have sprawling dependency trees.

                                          1. 12

                                            This in turn unlocks the killer feature of “you can link foo 1.2.3 and foo 2.0.0 together”, without which we simply wouldn’t have sprawling dependency trees.

                                            Some may argue that that would be a good thing :P

                                            1. 3

                                              And Ruby is considering adding Namespaces which would allow Ruby apps to have giant dependency trees with multiple versions too, looking forward to that disasterfun!

                                              1. 10

                                                disasterfun should be a word to describe the general Ruby experience.

                                          2. 13

                                            Stripping the author’s nebulous “joy” and “dread” dimensions and focusing on the pure aspects considered, the results are:

                                            1. Zig 8.3 up from 7.9
                                            2. Rust 7.6 down from 7.8
                                            3. Odin 6.8 up from 6.4
                                            4. Ada 6.6 up from 5.7
                                            5. Hare 6.4 up from 6.2
                                            6. Jai 5.6 up from 4.75
                                            7. C 3.8 up from 3.7

                                            I think it’s interesting that excluding these dimensions does not change the ordering by much. The only change is that Hare and Ada swap places, because Ada was punished harshly on these je ne sais quoi dimensions. (Granted, some of this was obvious given the high correlation between a mean of ten values and the same mean but with two variables removed!)

                                            I also find it interesting that the joy and dread axes generally serve to bring a language’s score down, with the exception of Rust which is – apparently – good nebulously but in a way that’s hard to capture on features alone.

                                            1. 15

                                              One very good aspect of Rust that is not captured by this ranking (and indeed isn’t specific to systems languages) is the quality of the developer tooling. Rust’s compiler generally has good error messages, and the functionality cargo gives you for creating new projects from scratch, importing external libraries, etc. is good enough that even non-systems languages have taken inspiration from it.

                                              1. 2

                                                Was cargo revolutionary or was there something like that before it? It’s amazing.

                                                1. 23

                                                  Cargo was originally authored by the folks who had built Bundler, in Ruby, and had done a lot of JavaScript development and had seen what worked about npm.

                                                  It’s a combination of experience and being a third or fourth system.

                                                  1. 4

                                                    The go and npm tools are the closest/best predecessors, afaik. There’s been a wide variety of other such things, cargo mostly just learned from their mistakes. IIRC several core crates.io devs early on used to be npm developers.

                                                    1. 4

                                                      You’re directionally correct but incorrect in the details, see my sibling comment.

                                                      1. 6

                                                        Thanks for the corrections. “Directionally correct but incorrect in the details” is a great summary of my life. :-)

                                                2. 2

                                                  mmmm, statistics! Other fun filters to smooth out outliers a bit are “drop lowest and highest score”, or “use median instead of average” which I tend to prefer hence why I put it in there. Good observation that removing those brings scores up on average; I guess I’m just hard to please.

                                                3. 9

                                                  lots of things with no better home just get added to the list of over 120 compiler built-in functions,

                                                  I got a similar impression initially, but it flipped to the opposite a couple of months in. The big thing is that std in Zig is not special. There are no lang items. So builtins are the entire interface between compiler and user code; it’s just that this interface isn’t hidden away in the guts of the standard library, but is right there to be used.

                                                  In a similar vein, the fact that @import is a reverse syntax sugar is cute. Import is syntax in the sense that the argument must be a string literal, and you can’t alias the import “function”, even at comptime. It fully behaves as custom import “path” syntax, except that there’s no dedicated concrete syntax for it and a function call is reused.

                                                  1. 3

                                                    reverse syntax sugar is cute

                                                    consider this term yoinked, I had been wondering what to call this lang-item-with-a-mustache.

                                                    1. 5

                                                      Not to be confused with syntax salt, which is about having dedicated syntax which deliberately induces friction!

                                                  2. 9

                                                    I love it! Minor minor nit in the Zig section:

                                                    Zig generics are just functions that can take and return types the same way as any other value. They’re just all evaluated at compile time, and RTTI fills in some gaps to let you do a bit of inspection of types at runtime as well.

                                                    I don’t think there’s any RTTI in Zig! (as evidenced by type not being something you can handle at runtime whatsoever.) All the fun type stuff happens at comptime, and you’re left with some pretty clean codegen.

                                                    1. 8

                                                      One of the complaints about Rust I’ve heard from people is that writing low-level code in Rust actually kinda sucks, and from my modest experiences with it I have to agree. Unsafe pointers have shitty ergonomics. Lots of small but necessary features tend to be locked behind unstable compiler features, which may never become stable. It’s real easy to accidentally cause UB or otherwise screw up in unsafe code. And it’s hard to understand the compiler’s assumptions about how things fit together under the hood. The docs help, and the story in general has slowly gotten better as the stdlib has improved and evolved, but it still often feels a lot like a second-class citizen. The feeling a lot of the time is that Rust is not really for writing OS and embedded code, it’s for writing high-performance applications.

                                                      I’ve also written a modest amount of low-level code in Rust, and I’m not as pessimistic about its suitability for low-level work. Unsafe pointers are more complicated and verbose to use than safe references are, but this is kind of a virtue I think - it encourages you to very quickly write safe abstractions around them so you don’t have to think about e.g. doing manual pointer arithmetic anymore, which is exactly the design pattern you want your embedded OS or similar to have.

                                                      1. 2

                                                        Depends. There are certain “common” low-level access patterns that can’t be safely abstracted out in Rust:

                                                        • non-owning and/or self-referential data structures (e.g. intrusive linked lists)
                                                        • non-linear borrows (e.g. a ref/ptr passed to or returned by callbacks, without a heap/owning allocation)
                                                        • interior mutability through an interface that forces transitive mutability (e.g. a runtime-shared UnsafeCell inside an Iter or Future, whose methods require &mut self) (this one’s Rust specific)
                                                        1. 2

                                                          Eh… sort of.

                                                          The generic access pattern can’t (always) be safely abstracted out in Rust. The specific version can almost always (in every case I’ve ever seen) be wrapped in a safe API though.

                                                          1. 2

                                                            Have had the opposite experience:

                                                            I frequently make and use intrusive compute/IO systems which take in user-supplied Task/Context nodes that hold callbacks. This pretty much requires either unsafe or heap allocation. io_uring is a good example.

                                                            The last bullet point seems to still be an issue for all intrusive Futures (most async sync-primitives). ATM, the effects are hacked around during codegen by disabling noalias or ignoring it if it trips the sanitizer. These attempt to address Pin<&mut Self> in Future, but not &mut Self in Iter.

                                                      2. 7

                                                        However, it doesn’t loooook like there’s any good way to do what type theory goons call “existential types”? This is basically what OO inheritance or Rust traits is for; “existential types” are basically any feature that lets you write fn foo(x: impl Something) and then call it with any type that has a Something trait which defines some interesting operations. The examples I can find for generics in Ada do things like swap values or concatenate arrays where the only operation done on the generic type is “copy”

                                                        Type theory goon nit: I think it would be better to describe this as “bounded” or “ad-hoc polymorphism” rather than “existential types”, although existential types are likely relevant in the implementation. A lot more languages support bounded polymorphism than explicit encodings of existential types.

                                                        Hence, for me, a language not having “existential types” is unfortunate, but less important than not having “bounded universally-quantified types” as an explicit language feature.

                                                        The provided Rust syntax is essentially sugar for a bounded universally-quantified type fn foo<T: Something>(x: T). Note that a signature with impl in output position instead (like fn foo() -> impl Something) is an existential type, and doesn’t have a simple encoding in terms of other language syntax.

                                                        As another example, you might consider wildcards (?) in Java as existential types, but you typically use bounded universally-quantified types in Java without needing ?.

                                                        It’s true that, from inside the function definition, the input universal type can be typechecked as an existential type (and an output existential type can be typechecked as if it were a universal type), but I don’t think it’s the standard way of naming the feature in question.

                                                        1. 2

                                                          Thanks. :D I still do not have a great idea of how the various zoos of type-theory concepts overlap and fit together, but all I can do is keep working on learning.

                                                        2. 6

                                                          The paragraph on D has a number of common misconceptions mixed up in it:

                                                          Both of these languages started off roughly in the “C#/Java/Go” level of abstraction and slowly worked their way downward.

                                                          D has had its full set of C-like features from the beginning, plus various common extensions like pragmas and inline asm. (This is why I don’t really like the term “system languages”; its definition is too vague…) So, if C counts, D should too. Many of the languages list C interop as a selling point, which D has also had from the beginning.

                                                          D, of course, offers more than C, things like the easy strings and garbage collector and such, also from day one, so I think it can also be called an “application language” if you wanted - the original D release was really recycled Java and C compiler components, and I think this hybrid nature is its big strength - but since you could choose not to use those parts and play with your pointer arithmetic, inline asm, etc. from the first release, I can’t agree that it started as one thing and slowly worked its way downward.

                                                          Heck, if anything, I’d say it started low, and then tried to build more abstraction on top of it - the original release rejected most of what C++ did, no templates, for example, no references, no const - then those were tacked back on over time.

                                                          If I understand correctly it started with a GC, then made it optional, then realized that to make it optional you needed an entirely different stdlib, then had a community split over the different stdlib

                                                          D has always had a GC, from day one to present day. You don’t have to use it, like previously said, it also has all the same C functions (you can call malloc and free and cast the pointers yada yada yada, exactly as in C - this is where the marketing claim “optional GC” comes from), but it is there.

                                                          The stdlib fork had nothing to do with optional GC though. The forked stdlib (something that happened in 2004 and was officially reconciled in 2007… yet still gets talked about in basically every D thread ever) was over a (mostly) closed door to outside contributions, not memory management philosophy. Both stdlibs embraced the garbage collector and were awkward to use without it.

                                                          1. 4

                                                            D, of course, offers more than C, things like the easy strings…

                                                            Both stdlibs embraced the garbage collector and were awkward to use without it.

                                                            …it also has all the same C functions (you can call malloc and free and cast the pointers yada yada yada, exactly as in C…

                                                            Hence why I don’t quiiiiiite consider D a “system language” on par with something like Rust or Zig. Sure you can mongle pointers if you want, but writing system-level stuff in it is no better than C. If I wanted to use hand-wrapped pointers and the C stdlib while having to dodge the language features that use the GC under the hood, I could also get that in C#, Java, Common Lisp, Python and Lua. The lines in these artificial categories we use are definitely blurry though, I just had to draw them somewhere.

                                                            I’ll update the info about the stdlib though, thanks. My vague recollection was it was still a minor problem in like 2015 or something, but I’m not an expert. I can sympathize with bad press hounding one forever, and wouldn’t want to contribute to it. …more than I already have.

                                                            1. 2

                                                              My vague recollection was it was still a minor problem in like 2015 or something, but I’m not an expert.

                                                              OK, yeah, around that time there was a lot of talk of making a new stdlib that worked better freestanding, with memory allocation being pluggable. It never significantly materialized though; at most, there were a couple of isolated projects that worked with the ideas, and they shunned any library that used the default GC, but users who didn’t care about this could mix and match anything anyway, so it didn’t really cause a community split. (My big criticism is that all this talk - along with several other issues - paralyzed development; D stdlib activity went down through this period and sat at essentially zero for years. There’s an attempt at revival now but I’m still skeptical it will matter.)

                                                              1. 2

                                                                It’s an interesting take to call a language with a quite OK type system, slices, generics, destructors AND defer no better than C.

                                                                Sure, you can say the no-GC D ecosystem is not as developed because the community converged on using a GC, but if for whatever reason you can’t have a GC, what you’re left with is still miles better than C IMO.

                                                                1. 2

                                                                  In theory one does not have to dodge the language features that use the GC if one uses the “betterC” subset, that being a compiler flag.

                                                                  https://dlang.org/spec/betterc.html

                                                                  (look near the end of the page for retained vs disabled features)

                                                                  That turns off those features. If one then makes use of @safe, it turns off a bunch of other things (pointer arithmetic).

                                                                  The issues I see with use in this mode are largely that not many folks use it that way, that it can be difficult to determine which portions of the D library will be available, and that @safe isn’t the default (due to backward compatibility).

                                                                  So an extra linting tool would be required to ensure the safe subset was being used.

                                                                  However one could certainly write e.g. an OS kernel in the betterC subset, while having some advantages over doing so in C.

                                                                  1. 2

                                                                    However one could certainly write e.g. an OS kernel in the betterC subset, while having some advantages over doing so in C.

                                                                    You can also write an OS kernel without that betterC switch… it isn’t really hard to dodge language features (or even just to use them; most of the features actually do work just fine in a bare-metal environment anyway!), but I’d probably agree with the author that D doesn’t specialize in this stuff, just that it can do it.

                                                                    1. 2

                                                                      My point was mainly that the switch ensures one can not accidentally use anything outwith that subset.

                                                              2. 11

                                                                I think

                                                                Spatial safety: 10/10

                                                                 for Zig is unfortunately a bit cope: not only does Zig still have a few cases of miscompilation that users hit practically, but the runtime check behavior they mention is only in debug mode. If you use any of the other modes, then you still have a lot of potential undefined behavior, and even worse the ReleaseFast mode has some wild footguns like assert becoming assume and introducing UB. As far as I know you can also still write stack use-after-frees even in debug mode.

                                                                 Practically, hitting an assert due to a use-after-free that debug mode “catches” is still bad ergonomics, even if it is no longer exploitable as memory unsafety. And “memory safe only in debug mode” is worth a lot less to me than Rust: I wouldn’t claim that a C compiler that defaults to -fsanitize=address is a “10/10” spatial safety language, because I doubt a lot of people are running with that in production and the presence of it in debug doesn’t magically catch everything, and I likewise don’t think that claiming Zig is 10/10 spatial safety is accurate.

                                                                1. 10

                                                                  but the runtime check behavior they mention is only in debug mode

                                                                  This is incorrect. There are four modes:

                                                                  • Debug
                                                                  • ReleaseSafe
                                                                  • ReleaseFast
                                                                  • ReleaseSmall

                                                                  Both Debug and ReleaseSafe come with safety-checks enabled.

                                                                  1. 3

                                                                    OOB checks are also present in ReleaseSafe mode, so no, it’s not debug-only. Same with integer overflow and other checks.

                                                                    and even worse the ReleaseFast mode has some wild footguns like assert becoming assume and introducing UB.

                                                                    That’s because that’s what assertions are designed to be in Zig. An assertion provides external knowledge to the compiler. If you want a crash when a given condition happens, you use @panic().

                                                                    As far as I know you can also still write stack use-after-frees even in debug mode.

                                                                    Yep, there is currently no checking for that. Valgrind will reliably catch those things though.

                                                                  2. [Comment removed by author]

                                                                    1. 1

                                                                      I’ve heard, and I won’t say who from 😉, that Garnet is exploring this space

                                                                      1. [Comment removed by author]

                                                                    2. 5

                                                                      I don’t really do much low-level programming, I’m a garbage collection enthusiast (my main professional language is C#, and my main hobby language is Common Lisp). So these comments come from that POV.

                                                                      1. I really like that Hare is designed around interoperability with other languages both ways, not just having a C FFI. So if someone writes some really amazing library in Hare, like a cross-platform GUI toolkit, it’s pretty easy for me to write a Common Lisp wrapper. The number 2 thing I dislike about Rust is that it feels like a graveyard for good ideas: people write interesting Rust libraries that serve some function that’s not Rust-specific, and then they’re unusable from anything but Rust.

                                                                      2. Overall, it seems like Zig is the closest to what I would want in a system language. Safer than Hare, not an exploding mess of accidental complexity like Rust. And from the point of view of a Lisp user, the fact that compile-time programming is done in the language itself is sweet, and the fact that so many other features are implemented in terms of comptime is elegant.

                                                                      All that said, I also feel like I’m not qualified to have opinions about this stuff, it’s just that Rust gives me The Fear in a way C doesn’t, even though I know how many footguns there are in C.

                                                                      1. 3

                                                                        I very much agree with 1. I’ve Bought In to the Rust ecosystem so it doesn’t affect me much in practice right now, but I have a looooong tradition of looking at various cool libraries and then going “oh but it’s C++ so I’ll never be able to use it”. You can certainly write a Rust lib that exposes a C FFI, and it’s not too tricky, but it’s very much a second-class citizen and would be very nice if it were the default.

                                                                        Also, frankly, a Lisper with experience wrapping non-Lisp libraries sounds like someone with a useful set of opinions! Much of the point of these languages is to more easily write nicer-to-use tools like Lisps or libraries for Lisp programs, after all.

                                                                        1. 2

                                                                          It will hopefully be easier to expose Rust libraries in a nice way to other languages once crABI becomes available.

                                                                      2. 4

                                                                        Forgive my ignorance if this is a dumb question, but I don’t see golang at all in this article aside from a passing reference to its mascot. Is that an intentional omission? Why?

                                                                        1. 10

                                                                          My guess would be that the author doesn’t consider it a ‘systems programming language’, perhaps due the mandatory garbage collection and consequences for performance or writing low-latency/high-throughput systems. It looks like the majority (all?) of languages under consideration don’t have GC.

                                                                          1. 4

                                                                            “Systems programming language” is essentially a completely undefined category that includes whatever you feel like. The author in this case felt that garbage-collected languages aren’t systems languages, and thus excluded them.

                                                                            1. 3

                                                                              The author describes their honorable mentions as consisting of “languages that aim for ‘lower-level than C#, Java or Go’ but ‘higher-level than C, Zig or Jai’”. In other words, the author thinks Go is too high-level to discuss in this post about low-level programming languages.

                                                                              I think Go’s main reason for being too high-level would be that it manages memory through garbage collection.

                                                                              1. 8

                                                                                It also requires a threading runtime and scheduler for goroutines. Not really a problem if you already have a GC, just more details.

                                                                            2. 4

                                                                              Sad that ATS doesn’t even make the “weird Indie shit” part of the page. I feel like it was ahead of its time, before the ideas it explores became popular. I hope development of it continues.

                                                                              1. 2

                                                                                I’ve spent most of this year playing with non-mainstream (and historical) programming languages and, I’m surprised to admit, this is the first time I’ve heard of ATS. I think the closest I came is maybe Idris. Anyway, thanks for bringing this one up!

                                                                                1. 3

                                                                                  I’ve written a few posts about it that might help give an overview of its wheelhouse, if you’re interested.

                                                                                  1. 1

                                                                                    Thanks, and: wow, you’ve written about a lot of different programming languages!

                                                                                2. 1

                                                                                  I wish ATS were a little less impenetrable, because it looks like a really cool language with a lot of interesting ideas. It’s just that after having trouble getting even Hello World to compile, and being confronted with keywords such as t@ype, I would often get a bit too frustrated to continue. I think I did manage to get a few toy programs working, but nothing too meaningful.

                                                                                  At some point I wanted to write a wrapper for the compiler that would transform the errors into something a bit more legible, in the style of Elm… Maybe I’ll give that a go next time I feel like trying it out.

                                                                                  1. 1

                                                                                    The error wrapper would be very useful; the errors are a bit difficult to read at first.

                                                                                3. 4

                                                                                  While we acknowledge that C’s score can’t be very high, I feel that assigning “Spatial safety: 1/10” to C while giving 10/10 to Zig isn’t very fair. C gives implementations a lot of choices. An implementation could pick -ftrivial-auto-var-init=zero, either -fwrapv or -fsanitize=signed-integer-overflow, a hardened allocator like -fsanitize=scudo, and, in the future, the bounds-safety annotations described at https://clang.llvm.org/docs/BoundsSafetyImplPlans.html.

                                                                                  1. 5

                                                                                    None of those options do anything about spatial memory safety. FORTIFY_SOURCE=3 adds some limited bounds checking but it’s still very weak and can’t do much better without fat pointers. There’s limited support for associating bounds with a pointer in function arguments in standard C (but I have never seen it used) and there are extensions that add associated bounds to pointers in structures, but it’s all still halfarsed at best.
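
                                                                                    To make that concrete, here’s a small illustrative sketch (assuming glibc, -O2 and -D_FORTIFY_SOURCE=3): the memcpy below can be checked because the compiler knows the destination’s size, while the ordinary store through a raw pointer carries no bounds information at all, so nothing checks it.

                                                                                        #include <string.h>

                                                                                        char fortify_can_check(const char *src, size_t n) {
                                                                                            char buf[16] = {0};
                                                                                            /* sizeof buf is known here, so this call is rewritten to a
                                                                                               checked variant and aborts at runtime if n > 16 */
                                                                                            memcpy(buf, src, n);
                                                                                            return buf[0];
                                                                                        }

                                                                                        void fortify_cannot_check(char *p, size_t i, char c) {
                                                                                            p[i] = c; /* no object size is associated with p, so no check */
                                                                                        }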

                                                                                  2. 4

                                                                                    Somehow C gets by without a single mention of integer overflows… which are extra special given the implicit conversions between int types and signed overflow being undefined. I guess the author was including that under spatial safety, and gave C a 1/10 on that anyway, but not all overflows are a spatial problem.

                                                                                    Random anecdote: earlier today I was reading some sqlite code and was very confused about how there wasn’t trivial undefined behavior in the parser, until I tried to exploit it and discovered that they (partially) solved the problem not by checking individual cases but by limiting the size of allocations to 2^31 - 255, so that simple counts over (a subset of) a single array/string can’t accidentally overflow. Still not a perfect solution, of course.
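
                                                                                    Roughly the pattern, as I understood it (this is not sqlite’s actual code; the cap and names here are made up, but the idea is the same): keep every allocation comfortably below INT_MAX, and then “length + small constant” arithmetic on a signed 32-bit int can never overflow.

                                                                                        #include <limits.h>
                                                                                        #include <stdlib.h>

                                                                                        /* Every allocation stays at least 255 bytes below INT_MAX, so adding a
                                                                                           small constant to a length stored in an int cannot overflow. */
                                                                                        #define MAX_ALLOC ((size_t)INT_MAX - 255)

                                                                                        static void *bounded_alloc(size_t n) {
                                                                                            if (n == 0 || n > MAX_ALLOC)
                                                                                                return NULL; /* refuse oversized requests up front */
                                                                                            return malloc(n);
                                                                                        }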

                                                                                    1. 4

                                                                                      If I mentioned every misfeature of C this article would be a whole book. Overflows are not exactly a spatial problem but I usually notice them the most when they result in an array index becoming invalid, so I grouped them in with that. All categories are artificial.

                                                                                    2. 4

                                                                                      Good stuff. I think this is the first time I’ve seen someone actually take a detailed look at Ada.

                                                                                      Maybe you could have “fearlessness” instead of “dread” as the name of the last item so you could have a more intuitive “bigger number means more of the thing” meaning for the score?

                                                                                      1. 4

                                                                                        You’re the second person to suggest that, so you’re probably right!

                                                                                        1. 1

                                                                                          Wait, best practices (tm) say you need one more call for it before refactoring it into a generalized solution :)

                                                                                          Seriously, this is very interesting and inspiring, and it confirms to me that I don’t wanna drag with rust, and that I do want to start playing with zig, and perhaps get back to learning more ocaml.

                                                                                      2. 3

                                                                                        @icefox,

                                                                                        Could you elaborate on the poor design choices Scala made that it will never shake off?

                                                                                        As far as I’ve seen Swift Seems Fine(tm) but it seems to have the Scala Problem where they made some bad design choices that they will never be able to shake off.

                                                                                        1. 2

                                                                                          Unfortunately, I don’t specifically recall what I was thinking of when I wrote that. :C I think someone had just complained at me about how Scala had like four different kinds of implicit arguments, but I really don’t know Scala as well as I should if I’m going to cite it as a source of problems. Another question I have is how much of Scala got cleaned up going from Scala 2 to 3, ’cause it appears that it did have some breaking changes. Maybe my next language survey should be “application languages”… Scala, Swift, etc.

                                                                                          1. 2

                                                                                            Unfortunately, I don’t specifically recall what I was thinking of when I wrote that. :C

                                                                                            No worries! My impression of Scala is that it has more of a process problem than a fundamental design problem: the language was (is?) attached to a university, so it gets a lot of doctoral students adding features as part of their programs. As a result, the features tend to be more complex, more academic, more narrowly defined, and not especially well documented, because they’re scoped to fit within a doctoral program.

                                                                                            One problem Scala had until recently was binary incompatibility. Minor versions were binary-incompatible with each other, so upgrades required big chunks of the ecosystem to move in sync. Unfortunately, long chains of dependencies made adoption of new versions slow, since every package had to wait for its upstream to migrate, and some projects didn’t have a lot of staffing (e.g., ScalaJS was only a couple of folks and had a lot of downstream consumers). None of that is really a big deal if you are an academic, but it’s a nightmare if you are a large company.


                                                                                            I do know that there are some design issues with Swift that are pretty baked in. For example, it has pervasive type inference, function overloading, and protocols which make the type inference computationally complex. This forces people to break up longer lines of code to appease type inference. I think Chris Lattner even said it was a mistake on Software Unscripted.

                                                                                            Mostly, I was wondering if there was anything you knew about like that in Scala. I just get the impression that it has a death by a thousand cuts problem.

                                                                                        2. 3

                                                                                          Another one for the honorable mention pile is Chapel, a language designed specifically for supercomputers. I thought it would technically count as a systems language, but turns out it has automatic memory management. It also fails your other goal of “being good for gamedev”. Chapel is a weird language, but it’s got a lot of interesting ideas that might also apply to systems programming.

                                                                                            Another fun fact: the Lobster developer also made the very first esoteric programming language, False.

                                                                                          1. 4

                                                                                            Not important, but I thought Intercal was generally considered the first esoteric programming language.

                                                                                            1. 1

                                                                                              Oops, yeah I forgot about Intercal for some reason. False is the language that inspired Befunge and Brainfuck.

                                                                                            2. 1

                                                                                              the very first esoteric programming language

                                                                                              I’ve been wondering about this recently. There are clearly at least two kinds of esoteric programming language: those designed for some other purpose that turned out to be esoteric, and those intended to be esoteric for its own sake.

                                                                                              In the first few decades of computers, every programming language was esoteric to a large extent. Algol 60 led the way to non-esoteric programming and after about 1970 there was little need to use esoteric languages just to be able to make a computer do anything useful at all.

                                                                                              I’m not sure how to classify the systems that were invented in the study of the fundamentals of computation, things like combinators, Minsky machines, FRACTRAN, the Game of Life.

                                                                                              I like Christopher Strachey’s criticism of his own general-purpose macrogenerator (1965) which is very much, whoops I made an esoteric language, sorry! But judging by his showoff examples, he clearly enjoyed it. (Strachey wrote some of the earliest fun and frivolous software in 1951.)

                                                                                              It has been our experience that the GPM, while a very powerful tool in the hands of a ruthless programmer, is something of a trap for the sophisticated one. It contains in itself all the undesirable features of every possible machine code — in the sense of inviting endless tricks and time-wasting though fascinating exercises in ingenuity — without any of the irritating ad hoc features of real machines. It can also be almost impenetrably opaque, and even very experienced programmers indeed tend to spend hours simulating its action when one of their macro definitions goes wrong. Furthermore, it is remarkably good at using up machine time — fortunately the programs written for it are usually rather short.

                                                                                            3. 3

                                                                                              I think C needs to be thought of more as ‘building material for a language / toolset’ than as a language / toolset in itself. I’ve had a decent amount of fun doing custom codegen (with a quasiquotation-like thing) for the specific cases of generics I’ve wanted (basically just … dynamic arrays – but then also reflection for serialization etc., more generally than just ‘type parameters’) and trying to write a simple borrow checker for C. Codegen has honestly been much nicer than other languages’ generics implementations, where the compiler internally does codegen but … hides the generated code from you. With generated code that is actually materialized, you can jump into it with the LSP or the debugger and just read it, the profiler shows decent function names, etc. Not having namespaces, so that function names are globally unique, makes my LSP symbol search actually pretty usable. The whole ‘method syntax’ thing in other languages ends up distracting me with the question of whether something should be a method or not. Overall that’s how I’ve felt about other langs – lots of cases where I end up having to make a decision that is orthogonal to what I’m actually designing, because there are so many different ways of expressing the same thing.
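
                                                                                              For a rough idea of the materialized codegen I mean, this is the kind of thing it might emit for a dynamic array of one element type (names and layout are invented for illustration); since it’s ordinary C sitting in a real file, the debugger, profiler, and LSP all see plain functions you can jump into:

                                                                                                  #include <stdlib.h>

                                                                                                  /* Generated for element type float; stamped out again for other types. */
                                                                                                  typedef struct {
                                                                                                      float  *data;
                                                                                                      size_t  len;
                                                                                                      size_t  cap;
                                                                                                  } FloatArray;

                                                                                                  /* Returns 1 on success, 0 on allocation failure. */
                                                                                                  static int FloatArray_push(FloatArray *a, float value) {
                                                                                                      if (a->len == a->cap) {
                                                                                                          size_t new_cap = a->cap ? a->cap * 2 : 8;
                                                                                                          float *p = realloc(a->data, new_cap * sizeof *p);
                                                                                                          if (!p) return 0;
                                                                                                          a->data = p;
                                                                                                          a->cap = new_cap;
                                                                                                      }
                                                                                                      a->data[a->len++] = value;
                                                                                                      return 1;
                                                                                                  }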

                                                                                              Beyond C, what I actually want is functional correctness (verification through proofs or model checking or such), which none of these other languages provide. I looked at Frama-C and also VeriFast, but they didn’t stick for various reasons. So I’ve been exploring hacking on the Dafny C++ backend (it doesn’t support reals and generates a lot of shared_ptr) or trying some custom embedded DSL in Coq that extracts to C… There’s also RefinedC and VST, etc. The verification route may also work out as a way of implementing a formally verified borrow checker for C, at least… (similar to the Rust stacked-borrows model on Iris).
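
                                                                                              For anyone curious about the flavor of this, here’s roughly what Frama-C’s ACSL contracts look like on plain C (just a sketch I typed up to illustrate; I haven’t actually pushed this one through the WP plugin):

                                                                                                  #include <stddef.h>

                                                                                                  /*@ requires n > 0;
                                                                                                      requires \valid_read(a + (0 .. n-1));
                                                                                                      assigns \nothing;
                                                                                                      ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
                                                                                                      ensures \exists integer k; 0 <= k < n && \result == a[k];
                                                                                                  */
                                                                                                  int max_of(const int *a, size_t n) {
                                                                                                      int best = a[0];
                                                                                                      size_t i = 1;
                                                                                                      /*@ loop invariant 1 <= i <= n;
                                                                                                          loop invariant \forall integer k; 0 <= k < i ==> best >= a[k];
                                                                                                          loop invariant \exists integer k; 0 <= k < i && best == a[k];
                                                                                                          loop assigns i, best;
                                                                                                          loop variant n - i;
                                                                                                      */
                                                                                                      while (i < n) {
                                                                                                          if (a[i] > best) best = a[i];
                                                                                                          i++;
                                                                                                      }
                                                                                                      return best;
                                                                                                  }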

                                                                                              Would be cool to see a ‘system language’ that includes some decently expressive separation logic that helps you reason about functional correctness.

                                                                                              1. 5

                                                                                                ATS has proofs, linear + refinement types, & GADTs while compiling to C. It can be used to write “safer C” with an ML flavor. It interleaves proof-, type-, and term-level code in a way that can be more expressive than trying to separate out the “functional correctness” part.

                                                                                                1. 2

                                                                                                  Ah yes, I did look at ATS and read a bit of the book, and forgot to mention it here. It does seem to have a lot of the features I’m interested in, but it just felt too weird to me in terms of the keyword names etc., and things like LSP support and other tooling also matter. The ergonomics of explicit proof objects felt a bit 🤔 (vs. the Floyd-Hoare / predicate-transformer style assertions in Dafny). But none of these reasons are super concrete; I should give it more of a try soon.

                                                                                                  There’s also F* with its Karamel compiler, Ada/SPARK, … A decent amount of choices in this space TBH.

                                                                                              2. 1

                                                                                                What about Fortran?