1. 83
  1.  

  2. 14

    For a related but slightly different perspective on PL history, see also @hwayne’s 10 Most(ly dead) Influential Programming Languages.

    1. 5

      I have no idea why COBOL is listed as mostly dead. There are way more COBOL jobs than Elm or OCaml ones, and as many COBOL jobs as there are Clojure ones. COBOL is dead on GitHub, but it’s very much alive in the real world.

      1. 7

        The entire thing was spite-inspired by the original article leaving off Pascal because it was “mostly dead”, so I tried to fit everything into the “mostly dead” gimmick.

        1. 1

          Of the four mother languages, ALGOL is the most “dead”: everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN. But I’ve met plenty of programmers who haven’t even heard of ALGOL. You’d think it’d be the least important of the mother languages, but it’s the opposite: of the four, only LISP comes anywhere close to the pervasive importance of ALGOL.

      2. 10

        Two more important ur-languages!

        • Something that represents spreadsheet programming (one of the most popular programming paradigms in the world)
        • SQL

        Misc thoughts:

        • Arguably there’s enough difference between constraint-solving languages and logic languages to count them as separate families, but I think Picat is the only constraint language that’s also a full programming language.
        • XML based languages? HTML is the dominant one, but there are a few other niche ones out there, mostly mediated through a GUI abstraction.
        • I’ve often wondered which problem most clearly shows the difference between paradigms. My best guess right now is “given a list of integers, return the squares of the odds”. At the very least it showcases the differences between the imperative, functional, and array paradigms, maybe also logic?
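
        To make that concrete, here’s a sketch of the squares-of-the-odds problem in two of those styles, both rendered in Ruby (the array and logic versions don’t translate directly, since Ruby lacks those primitives; the input list is made up):

        ```ruby
        nums = [1, 2, 3, 4, 5, 6]

        # Imperative: explicit traversal and mutation of an accumulator
        imperative = []
        nums.each do |n|
          imperative << n * n if n.odd?
        end

        # Functional: a filter/map pipeline with no mutation
        functional = nums.select(&:odd?).map { |n| n * n }

        imperative  # => [1, 9, 25]
        functional  # => [1, 9, 25]
        ```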
        1. 4

          Spreadsheets: programming model is reactive programming, related to dataflow as mentioned elsethread. Data model is flat tables, basically commensurate with SQL.


          SQL I think is definitely prolog-esque. @Sophistifunk says in the sibling:

          the value in SQL is getting data you put in back out

          That’s true in a sense, but not really. Part of the value provided by relational databases is a high degree of reliability; but you might as well say that the value in ZFS is getting data you put in back out. Which is somewhat true, but ZFS is not a programming language but a storage model.

          The value of Prolog is not in deriving novel facts, but fairly boring ones derived from those you input. The same is true of SQL: any time you do something like perform a join, you are deriving new facts from the data you input.
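
          A minimal Ruby sketch of that point, with made-up family facts: the grandparent tuple below was never inserted, yet a self-join derives it, the same way a Prolog rule would.

          ```ruby
          # Base facts we put in: [parent, child] pairs (hypothetical data)
          parents = [["alice", "bob"], ["bob", "carol"]]

          # A self-join derives a fact we never entered: grandparent(alice, carol)
          grandparents = parents.flat_map do |gp, p|
            parents.select { |p2, _| p2 == p }.map { |_, c| [gp, c] }
          end

          grandparents  # => [["alice", "carol"]]
          ```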

          1. 1

            fwiw I meant “novel” as in “you didn’t enter them”, not as any sort of value judgement. Joins in SQL are mostly just a convenience to reverse the act of normalisation, and I think the search-engine-esque aspect of Prolog is what separates SQL tuple stores from it, despite a shallow resemblance in that they both aim to be declarative in nature. But it’s not something I’m adamant about; I think there’s a reasonable case to be made either way, and we’re probably arguing about the colour of the shed.

          2. 3

            I agree on both of these, and would lump dataflow in with Spreadsheets. Somebody suggested SQL would go in with Prolog, but I’m a hard disagree on that. The essence of SQL is defining relations and selecting subsets of them, whereas Prolog is about defining facts and then rules from which novel facts may be derived; the value in SQL is getting data you put in back out, whereas the value in Prolog is surfacing the new data implied by the facts and rules you put in.

            1. 2

              Came here to say this. Note, you probably mean SGML-based rather than XML-based. And egads, SQL is super important!

              That said.. What is the lineage of SQL? …Oh, Prolog, by way of “Datalog”, according to Wikipedia. So, this one is covered by one of the ur-languages. Neat.

              1. 3

                I don’t think being either SGML based or XML based is likely to affect the language category. That’s just syntax.

            2. 10

              Super weird to see Ruby listed as an ALGOL when it is clearly a Self…

              1. 12

                Ruby is often misunderstood because of its syntax. It looks like ALGOL but also has lots of sugar borrowed from Perl, so you’ll have people claiming it’s either of those when, underneath all the sugar, it’s Smalltalk/Self.

                1. 2

                  The one thing Ruby doesn’t have is true.if_true { beh }-style conditionals, but you’re supposed to avoid conditionals anyway, and it does use message form for loops (usually).

                  1. 4

                    I remember watching a talk by Sandi Metz that mentioned it and how to implement it; it’s actually very easy. Then she told people not to do it, because Ruby has an if expression and you should just use that.

                    I’m also pretty sure that the for statement is just sugar for .each, and I can’t think of a reason why I’d use it over .each, except maybe combined with ranges or .times.
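
                    For the curious, here’s a sketch of how the message-style conditional can be implemented, roughly as described in that talk (monkey-patching the boolean classes; ordinary if remains the idiomatic choice):

                    ```ruby
                    # Smalltalk-style conditionals: the receiver decides which branch runs
                    class TrueClass
                      def if_true; yield; end
                      def if_false; nil; end
                    end

                    class FalseClass
                      def if_true; nil; end
                      def if_false; yield; end
                    end

                    (2 > 1).if_true { "taken" }    # => "taken"
                    (2 > 1).if_false { "taken" }   # => nil

                    # And `for` really is near-sugar over #each (it differs only in scoping)
                    squares = []
                    for n in [1, 2, 3] do squares << n * n end
                    squares  # => [1, 4, 9]
                    ```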

                2. 1

                  I only did one project in production with Ruby (the Splunk SDK for Ruby). Its core might let it operate as a Self, but in practice I found it to be a clunky ALGOL.

                  1. 5

                    You can of course write code in any style in any language. I just pulled up some of the Splunk SDK code, and the first thing I see is parameterless messages being sent using x.y() and defined using def foo(), so something tells me this wasn’t built by Rubyists…

                    On further inspection I definitely found some rather long and deeply nested method definitions. It’s not the Java-in-Ruby I feared from your comment (I’ve worked in too many of those…), but it’s definitely not taking advantage of small objects or dynamic dispatch to solve problems as much as Ruby lends itself to.

                    But I could write similar code in Smalltalk if I desired. At a co-worker’s old company they had a component written in Smalltalk where array iteration was done by mapping over a range of indices and indexing into the array, because the person who had written the Smalltalk was a C programmer. This Ruby code is much better than that Smalltalk code ;)
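
                    That anecdote, sketched with throwaway data: both loops compute the same thing, but only the second leans on the collection protocol the way a Smalltalker or Rubyist would.

                    ```ruby
                    arr = %w[a b c]

                    # Index-style iteration, as the C programmer wrote it in Smalltalk
                    index_style = (0...arr.length).map { |i| arr[i].upcase }

                    # Message-style iteration over the collection itself
                    message_style = arr.map { |s| s.upcase }

                    index_style    # => ["A", "B", "C"]
                    message_style  # => ["A", "B", "C"]
                    ```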

                  2. 0

                    Ruby is a Smalltalk, and Smalltalk is an ur-language.

                    The author lists Smalltalk as an example of a Self language, but Smalltalk predates Self by a couple of decades. The author argues that this is logically sensible, because Self is the purer form.

                    The distinguishing difference between Smalltalk and Self is “everything is an object” vs “everything comes from a prototype”. They are sufficiently different that someone who knows a class-based dynamically typed language (e.g. Ruby) will find a prototype-based language (older versions of Javascript) very foreign.

                  3. 10

                    Make is the Prolog language most journeyman programmers know.

                    Weird to call the Smalltalk group “Self”. That’d be like calling the Algol group C or the ML group Haskell.

                    I would add assembly as an 8th kind.

                    1. 9

                      I chose Self because I regard classes as vestigial. Prototype based systems are more in your face about making you give up on patterns from other families, so I chose it as the type specimen.

                      1. 5

                        SQL also belongs to the Prolog family, though you’re right that Make is closer than SQL is.

                        What makes you say assembly doesn’t belong in the ALGOL bucket? asm and the other ALGOLs are imperative: there’s an instruction pointer which runs commands in sequence and jumps around. The instruction pointer has a lot more freedom to jump around in asm than it does in most other ALGOL languages but the fundamentals are identical.

                        1. 2

                          Right, but assembler is both highly dependent on what system you are using and, on the whole, has only one or two ideas that also happened to be in ALGOL. Both were influenced by how computers used to be built: a memory drum with a current position.

                          1. 1

                            ALGOL introduced structured programming, which is not something assembly lends itself to.

                        2. 9

                          I’d really recommend Haskell rather than ML for learning about functional programming (though I’d probably make the opposite recommendation for actually getting work done). Haskell has no mutable state except that exposed via monads, whereas ML just encourages a programming style that reduces mutable state, which often ends up feeling like it’s just an imperative language with a bunch of annoying constraints.

                          I’m curious why Self was recommended over Smalltalk, but either is probably fine.

                          When I wrote a list like this, I also included Prolog, but I’d be tempted to replace that now with Z3. Prolog is often given as the exemplar of ‘logic programming’ languages, but it’s almost the only member of that set. Very few general purpose languages live in this tree but a lot of proof-assistant and solver systems fit broadly into the same category as Prolog. Of these, Z3 is probably the easiest to pick up and play with and is also trivial to embed in other languages when it’s actually useful.

                          I’d also be tempted to put Idris on the list. Like Haskell, it’s mentioned in the ML section, but learning a dependently typed language makes you a much better C++ programmer even if you never write any serious Idris or Agda code. Dependent types are another one of the abstractions that aren’t that common in general purpose languages but are very common in proof assistants. I’m almost tempted to suggest that Coq should be on the list. You can spot the real theoreticians because they’re the only people that don’t smirk when someone says ‘I’d like to show you all my Coq’ in a meeting.

                          1. 5

                            For logic programming languages, the author mentions Mercury and Kanren. Oz is another good example. Kanren has its own progeny; miniKanren is a popular DSL and the µKanren pattern can be used to quickly embed miniKanren into arbitrary irreversible programming languages.

                            1. 3

                              I think you misunderstood the post. It does not recommend ML or Self, but rather uses them as names for categories/families of languages, and suggests that, so long as you experience one of each family, it does not matter which one: you will still get the core benefit of exposure to the family.

                              1. 3

                                It does in fact specifically recommend learning Haskell rather than ML at the bottom of the page.

                                1. 1

                                  In that case I find it even more curious that they chose a direct Smalltalk descendant as the ur-language for Smalltalk-family languages.

                                  1. 2

                                    Smalltalk-72 is not object-based like Smalltalk-80; there’s no single dialect of Smalltalk which would be a good candidate for the ur-language. Additionally, the design of Self went hand-in-hand with the invention of JIT technology, a hallmark of many modern Self descendants like Java and ECMAScript.

                                    1. 5

                                      Smalltalk-72 is not object-based like Smalltalk-80; there’s no single dialect of Smalltalk which would be a good candidate for the ur-language

                                      Smalltalk-80 is generally regarded as the canonical version of the language, with the prior ones as prototypes. Although a lot of the ideas were present in ’76, I think ‘80 is a good candidate here. It’s the one that the authors wrote books about, including the Blue Book, which is still a fantastic reference today for how to implement an interpreted language with dynamic dispatch and garbage collection.

                                      Additionally, the design of Self went hand-in-hand with the invention of JIT technology, a hallmark of many modern Self descendants like Java and ECMAScript

                                      An interesting claim, given that the most popular JavaScript JIT is V8, which is a fork of Animorphic Smalltalk, a Smalltalk JIT. Although a lot of the ideas were cross-pollinated between Smalltalk and Self, the most popular modern JavaScript JIT is a descendant not just of the ideas of early Smalltalk JITs but of the implementation as well.

                                      1. 3

                                        Java is hardly a Self-descendant; it falls pretty clearly in the Algol camp. Classes in Java are only superficially similar to those of Self.

                                      2. 1

                                        The author did cover that point too:

                                        Smalltalk inherited the notion of a value and its type from earlier languages, and implemented the idea of a class. All objects had a class that gave their type, and the class was used to construct objects of that type. Self disposed of the notion of class and worked solely with objects. As this is a purer form, I have chosen Self as the type specimen for this ur-language.

                                        He also recommended learning Haskell as a first language in the ML family.

                                        1. 7

                                          I think he’s slightly missing something key from Smalltalk: Smalltalk classes are just objects. They’re objects that provide a recipe for defining another object but, unlike a lot of later Smalltalk- and Simula-family languages, Smalltalk is purely imperative. Java and C++, for example, have a declarative sub-language for defining the shape of classes. In Smalltalk you create a class by sending a #subclass: message to an existing one and then you send more messages to add instance variables (fields) and methods to it. You create instances by sending another message.

                                          You can run Smalltalk on a Self VM and vice versa. V8 takes a prototype-based language and then does a hidden class transform to define classes for objects that have the same layout and set of methods and actually implements everything on something that looks more like a Smalltalk VM (and began life as one).

                                          The change from Self to allow any object to take the role of a class was fairly small. The bigger changes in Self were the idea of differential inheritance (where you inherited values of fields, not just methods) and multiple inheritance (Self allows multiple prototype chains). JavaScript provides the first of these (though with recent ECMAScript versions it also provides classes because most people tended to use prototypes to implement classes) but I’m not aware of any Smalltalk-family language that implements the latter. Note that you can implement both of these abstractions in libraries in Smalltalk, if you use accessors for ivars then you can delegate to another object if you don’t have an implementation and you can use #doesNotUnderstand: to implement second- and third-chance dispatch mechanisms.
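
                                          Ruby inherits both halves of this. The sketch below (the names Point and Delegator are illustrative, not from any library) builds a class purely by sending messages, and uses method_missing, Ruby’s analogue of #doesNotUnderstand:, to implement the delegation described above as a plain library:

                                          ```ruby
                                          # A class built by sending messages, not by a declarative sub-language
                                          Point = Class.new do
                                            attr_reader :x, :y
                                            define_method(:initialize) { |x, y| @x, @y = x, y }
                                          end

                                          # Delegation via second-chance dispatch (method_missing)
                                          class Delegator
                                            def initialize(target)
                                              @target = target
                                            end

                                            def method_missing(name, *args, &blk)
                                              @target.send(name, *args, &blk)
                                            end

                                            def respond_to_missing?(name, include_private = false)
                                              @target.respond_to?(name, include_private)
                                            end
                                          end

                                          pt = Point.new(1, 2)
                                          Delegator.new(pt).y  # => 2, forwarded to the wrapped object
                                          ```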

                                          1. 2

                                            Oh, that’s really interesting - thank you.

                                            Even if the author were correct, i.e. Self exemplifies a “purer form” of the Smalltalk/Self family of languages, it would still be a little inconsistent to choose Self as the ur-language in that case but ML rather than Haskell for the functional-programming family. Surely Haskell exemplifies a “purer form” of that? I can see why you found the choices curious.

                                            Still, I think the main point is that it doesn’t really matter what you call them: they are just labels. The important thing is that there are different families of programming languages and, if you want to broaden your programming experience, it is better to try languages from multiple different families rather than multiple languages from the same family.

                                            Personally, I am far more concerned by this bit:

                                            I’ll name them for a type specimen, the way a species in archaeology is named for a particular fossil that defines it.

                                            Surely he means palaeontology?

                                  2. 8

                                    Missing: dataflow languages, represented by LabVIEW, Lustre, etc.

                                    1. 1

                                      Interesting suggestion. This may just be a gap in my background. What would you say the key mental structures you get from them are?

                                      1. 3

                                        Dataflow programming is about modelling continuous systems. The focus is on how the data flows rather than the control flow in other languages. This answer explains it very well.

                                        1. 1

                                          Is there something there that isn’t already present in, say, generators in Python or laziness and composition of functions in Haskell?

                                          1. 2

                                             The entire concept is orthogonal to them, despite the way everything coalesces in the metal. Dataflow programming is much closer to Excel than anything else in this list.
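
                                             A toy pull-based version of that idea in Ruby (nothing like how LabVIEW or Excel actually implement it, but it shows the shape): a cell holds either a value or a formula over other cells, and changes flow through on read.

                                             ```ruby
                                             # A toy dataflow "cell": a stored value or a formula over other cells
                                             class Cell
                                               def initialize(value = nil, &formula)
                                                 @value = value
                                                 @formula = formula
                                               end

                                               def value
                                                 @formula ? @formula.call : @value
                                               end

                                               def set(v)
                                                 @value = v
                                               end
                                             end

                                             a = Cell.new(2)
                                             b = Cell.new(3)
                                             sum = Cell.new { a.value + b.value }

                                             sum.value  # => 5
                                             a.set(10)
                                             sum.value  # => 13, the change flows through without re-wiring anything
                                             ```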

                                            1. 1

                                              Thank you. I’ll do some more studying and consider how to update the list.

                                        2. 2

                                          LabVIEW is the predecessor of Unreal’s Blueprints, if you want to try these ideas out in a modern setting. It’s also the origin of reactive programming.

                                      2. 6

                                      To have a “complete” programming education, it’s good to learn a language from each of the important families, described as ur-languages here. Later on it becomes easy to switch between languages in the same family, depending on the application.

                                      I have written a bit of OCaml, Common Lisp, Raku, and Clojure, with brief excursions into Prolog and K (recently BQN has become a notable new addition to the array family).

                                      Some languages, like Common Lisp, Clojure, and Raku, are multi-paradigm, which allows me to pick and choose the paradigm depending on the problem at hand.

                                      At some point I would like to try out a stack-oriented language, but it doesn’t seem like any of them are actively being used to build software, even though implementations are being created all the time.

                                        1. 3

                                        Ladder logic is an esoteric family of languages which doesn’t quite fit into any other family here. Ladder logic allows you to specify state machines and all the transitions to other states, but in a way so different from ML that it doesn’t really belong in the ML family.

                                          1. 3

                                            I would add assembly (and various IRs such as LLVM IR, .NET IL, and Java bytecode). You probably don’t need to write it, but it’s useful to be able to read it.

                                            1. 1

                                              Is there something in assembly that you wouldn’t get in the ALGOL family at this point? I agree that it’s useful to be able to read it, and it forces you to learn how things like the stack and heap and the like are implemented, but is there a mental structure of computation that you learn from it that’s unique?

                                              1. 4

                                              Assembly is a GOTO soup in a way that ALGOLs aren’t, which is a mode of thinking that most ALGOL languages, outside of BASIC, don’t really help you learn.

                                                1. 1

                                                  Essentially, how to program without structured programming. At one point in time I would have said “the model of computation used by your CPU,” although that’s less true nowadays. Still the closest thing conveniently available to us, though.

                                              2. 2

                                              I think it’s a mistake to group object orientation completely under Self (although the commentary for this section discusses Smalltalk a lot). Smalltalk and Self are message-passing / evented object-oriented systems, polymorphic through shared method interfaces, and, as noted, built on a persistent system state; they clearly represent a distinct family.

                                              The bulk of what people consider to be ‘object-oriented’ programming after that inflection point, though, is the C++/Java style, where objects are composite static types with associated methods, polymorphic through inheritance hierarchies. I think this comes from Simula, and I think this approach to types and subtypes could be important enough to add to the list as an 8th base case.

                                                1. 4

                                                  I wouldn’t group C++ and Java like that. Java is a Smalltalk-family language, C++’s OO subset is a Simula-family language (though modern C++ is far more a generic programming language than an object-oriented programming language).

                                                  You can implement Smalltalk on the original JVM by treating every selector as a separate interface (you can use invokedynamic on newer ones), and Redline Smalltalk does exactly this. You can’t do the same on the C++ object model without implementing an entirely new dispatch mechanism.

                                                  Some newer OO languages that use strong structural and algebraic typing blur the traditional lines between the static and dynamic a lot. There are really two axes that often get conflated:

                                                  • Static versus dynamic dispatch.
                                                  • Structural versus nominal typing.

                                                  Smalltalk / Self / JavaScript have purely dynamic dispatch and structural typing. C++ has nominal typing and both static and dynamic dispatch and it also (via templates) has structural typing but with only static dispatch, though you can just about fudge it with wrapper templates to almost do dynamic over structural types. Java has only dynamic dispatch and nominal typing.

                                                  Newer languages, such as Go / Pony / Verona, have static and dynamic dispatch and structural typing. This category captures, to me, the best set of tradeoffs: you can do inlining and efficient dispatch when you know the concrete type, but you can also write completely generic code, and the decision whether to do static or dynamic dispatch depends on the type information available at the call site. Your code feels more like Smalltalk to write, but can perform more like C++ (assuming your compiler does a moderately good job of reification and inlining, which Go doesn’t but Pony does).

                                                  1. 4

                                                    From the implementation side yes, the JVM definitely feels more like Smalltalk. But is Java really used in the same dynamic fashion to such an extent that you could say it too is Smalltalk? Just because it’s possible, doesn’t mean it’s idiomatic. I’d argue that most code in Java, including the standard library/classpath, is written in a more Simula-like fashion, the same as C++, and would place it in that same category.

                                                    1. 6

                                                      Interfaces, which permit dynamic dispatch orthogonal to the implementation hierarchy, are a first-class part of Java and the core libraries. Idiomatic Java makes extensive use of them. The equivalent in C++ would be abstract classes with pure virtual methods, and these are very rarely part of an idiomatic C++ codebase.

                                                      Java was created as a version of Smalltalk for the average programmer, dropping just enough of the dynamic bits of Smalltalk to allow efficient implementation in both an interpreter and a compiler. C++ was designed to bring concepts from Simula to C.

                                                      1. 3

                                                        Interesting replies, thanks. The point about Java dispatch is interesting and suggests it is not as good an example as I thought it was (I’ve not really used it extensively for a very long time). The point I was trying to make was for the inclusion of Simula, based on its introduction of classes and inheritance, itself an influence on Smalltalk. I accept that Simula is built on ALGOL, and maybe that means it’s not distinct enough for a branch within this taxonomy. I would note that both Stroustrup and Gosling nominate Simula as a direct influence.

                                                        (NB: I always thought of java as an attempt to write an objective-C with a more C++ syntax myself, but that’s just based on what seemed to be influential at the time. Sun were quite invested in OpenStep shortly before they pivoted everything into Java)

                                                        1. 4

                                                          (NB: I always thought of java as an attempt to write an objective-C with a more C++ syntax myself, but that’s just based on what seemed to be influential at the time. Sun were quite invested in OpenStep shortly before they pivoted everything into Java)

                                                          And Objective-C was an attempt to embed Smalltalk in C. A lot of the folks that worked on OpenStep went on to work on Java and you can see OpenStep footprints in a lot of the Java standard library. As I understand it, explicit interfaces were added to Java largely based on experience with performance difficulties implementing Objective-C with efficient duck typing. In Smalltalk and Objective-C, every object logically implements every method (though it may implement it by calling #doesNotUnderstand: or -forwardInvocation:), so you need an NxM matrix to implement (class, selector) -> method lookups. GNU family runtimes implement this as a tree for each object that contains every method, with copy-on-write to reduce memory overhead for inheritance and with a leaf not-implemented node that’s referenced for large runs of missing selectors. The NeXT family runtimes implement it with a per-object hash table that grows as methods are referenced. Neither is great for performance.

                                                          The problem is worse in Objective-C than in some other languages for two reasons:

                                                        • Categories and reflection APIs mean that methods can be added to a class after it’s created. Replacing a method is easy (you already have a key->value pair for it in whatever your lookup structure is), but adding a new valid selector means that you can’t optimise the layout easily.
                                                          • The fallback dispatch mechanisms (-forwardInvocation: and friends) mean that you really do have the complete matrix, though you can optimise for long runs of not-currently-implemented selectors.

                                                          Requiring nominal interfaces rather than simple structural equality for dynamic dispatch meant that Java could use vtables for dispatch (like C++). Each class just has an array of methods it implements, indexed by a stable ordering of the method names. Each interface has a similar vtable and nominal interfaces mean that you can generate the interfaces up-front. It’s more expensive to do an interface-to-interface cast, but that’s possible to optimise quite a lot.

                                                        Languages that do dynamic dispatch but don’t allow the reflection or fallback dispatch mechanism, but still do structural typing, can use selector colouring. This lets you have a vtable-like dispatch table, where every selector is a fixed index into an array, but where many selectors will share the same vtable index because you know that no two classes implement both selectors. The key change that makes this possible is that the class-to-interface cast will fail at compile time if the class doesn’t implement the interface and an interface-to-interface cast will fail at run time. This means that once you have an interface, you never need any kind of fallback dispatch mechanism: it is guaranteed to implement the methods it claims.

                                                        Interfaces in such a language can be completely erased during the compilation process: the class has a dispatch table that lays out selectors in such a way that selector foo in any class that is ever converted to interface X is at index N, so given an object x of interface type X you can dispatch foo by just doing x.dtable[N](args...). If foo appears in multiple interfaces that are all implemented by an overlapping set of classes, then foo will map to the same N. If one class implements bar and another implements baz, but these two methods don’t ever show up in the same interfaces then they can be mapped to the same index.
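
                                                        A toy rendition of that scheme with hand-assigned slot numbers (a real compiler computes the colouring globally; all names and numbers here are illustrative): :area and :name never appear together in an interface implemented by one class, so they can safely share an index.

                                                        ```ruby
                                                        # Hand-coloured selector -> slot table (a compiler would derive this)
                                                        SLOTS = { area: 0, name: 0, speed: 1 }

                                                        # Per-class dispatch tables laid out according to the colouring
                                                        circle_dtable = []
                                                        circle_dtable[SLOTS[:area]] = ->(r) { 3 * r * r }  # crude pi

                                                        car_dtable = []
                                                        car_dtable[SLOTS[:name]]  = -> { "car" }
                                                        car_dtable[SLOTS[:speed]] = -> { 120 }

                                                        # Interface dispatch is a fixed array index: no hash lookup at the call site
                                                        circle_dtable[SLOTS[:area]].call(2)  # => 12
                                                        car_dtable[SLOTS[:speed]].call       # => 120
                                                        ```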

                                                          Smalltalk has been one of the big influences on Verona too. I would say that we’re trying to do for the 21st century what Objective-C tried to do for the ‘80s: provide a language that captures the programmer flexibility of Smalltalk but is amenable to efficient implementation on modern hardware and modern programming problems. Doing it today means that we care as much about scalability to manycore heterogeneous systems as Objective-C cared about linear execution speed (which we also care about). We want the same level of fine-grained interoperability with C[++] that Objective-C[++] has but with the extra constraint that we don’t trust C anymore and so we want to be able to sandbox all of our C libraries. We also care about things like live updates more than we care about things like shared libraries because we’re targeting systems that typically do static linking (or fake static linking with containers) but have 5+ 9s of uptime requirements.

                                                          1. 1

                                                            Fascinating reading again, thanks. I had not previously heard of Verona, it sounds very interesting. Objective-C was always one of my favourite developer experiences, the balance of C interoperability with such a dynamic runtime was a sweet spot, but the early systems were noticeably slow, as you say.

                                                  2. 1

                                                    It’s because Java and C++ are both ALGOL-family with something called “objects” in it. Neither has enough unique features to warrant a family, or to be part of anything but the ALGOL group.