1. 2

    It would be interesting to see a follow-up to this study, given that Rust was only in the second tier of all the energy/time/memory groupings. Rust has moved in the CLBG rankings a bit since this study was done, so things might be shuffled around a bit.

    1. 2

      This paper was published by the group in early 2021: http://repositorium.uminho.pt/bitstream/1822/69044/1/paper.pdf. It includes new measurements and evaluations based on Rosetta Code (in addition to CLBG). According to the paper, the measurement results of both code sources are comparable. However, when I did some additional evaluations based on the geometric mean of the time data normalized to C, I got a different order than the one in Table 4 of the paper; see http://software.rochus-keller.ch/Ranking_programming_languages_by_energy_efficiency_evaluation.ods. The paper seems to have used arithmetic instead of geometric means, which is questionable given http://ece.uprm.edu/~nayda/Courses/Icom5047F06/Papers/paper4.pdf.
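
      To illustrate the difference with made-up numbers (these ratios are purely illustrative, not data from the paper): suppose a language needs 0.5x, 2x and 2x the C time on three benchmarks.

        -- purely illustrative ratios (time relative to C), not data from the paper
        local ratios = { 0.5, 2.0, 2.0 }

        local sum, product = 0, 1
        for _, r in ipairs(ratios) do
          sum = sum + r
          product = product * r
        end

        print(sum / #ratios)            --> 1.5   (arithmetic mean)
        print(product ^ (1 / #ratios))  --> ~1.26 (geometric mean)

      As far as I understand the linked paper, the main argument is that the geometric mean of such normalized values gives the same ranking regardless of which language you normalize to, whereas the arithmetic mean does not.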

      1. 1

        It seems like this is an extension of the previous paper. I’m not sure if they made any changes to the CLBG sections. Couple of tells:

        • both papers group Ruby with functional languages, but not with object oriented languages
        • the tables don’t seem to be changed (on casual inspection)

        Still, it is extended with Rosetta examples so this is interesting. Thanks for posting it!

    1. 4

      As it stands, Nim is well on its way to becoming as complex as Ada or C++. I guess the price for “one programming language for everything” is that you then have to satisfy everyone, which is only possible if the language becomes continually more extensive.

      1. 3

        That’s a fair criticism, but I came to Nim from C++ because I thought most of the C++ complexity was ad hoc and unjustified.

        1. 2

          It’s worth taking a look at how C++ came to be. Stroustrup added concepts to the existing C language that he knew from Simula and had found useful. In this sense, “C with classes” was minimal and complete. Ada, on the other hand, started with the goal of supporting all applications of the DoD of that time with a single language (i.e. “one programming language for everything”); accordingly, already the first version of Ada was large and complex. In the meantime, C++ has also reached an almost incomprehensible size and complexity. With C++11, a lot of “syntactic sugar” was introduced, i.e. things that could already be done before, but perhaps somewhat less elegantly; this trend continued, and we see the result in C++17 and 20; the price is an ever larger language scope. How much is enough (i.e. optimal) is a difficult question. At the moment I am trying to find an answer to this with http://oberon-lang.ch.

          1. 1

            I believe that Stroustrup planned for C++ to be multi-paradigm from the beginning, at least that’s the take I got from his book. C++ just happened to luck into the OO craze, and I guess that paradigm became dominant.

            1. 2

              “multi-paradigm” is not the same as “one programming language for everything”. C++ was “multi-paradigm” by construction in that OO features were added to a procedural language without removing anything. But the new features were not just useful for OO, but also e.g. for better modularization and resource management.

            2. 1

              Oberon+ looks neat (and reminds me of Ada, but presumably a lot less complex). But I can’t find any resources for learning it or any information about any standard libraries it has. Do the standard libraries use camelCase (as shown in the examples)? This would also be a blocker for me.

              1. 2

                Do the standard libraries use camelCase (as shown in the examples)? This would also be a blocker for me.

                honest question, just curious: why is casing so important for you that you basically ignore all other technical merits of a language?

                1. 1

                  I find it hard and aesthetically displeasing to read. (Also worth noting that languages which like to rely on camel case often pick the wrong side of XMLHttpRequest.) I find it hard to type on a keyboard. Given the choice of learning a language or not dealing with camelCase I simply pick not to learn the language. There are in fact so many languages out there that it is difficult to “miss out” on much by making such an arbitrary choice. There is likely at least one more language out there with mostly overlapping technical merits which does not force camelCase upon me.

                  That being said, I have recently thought about investigating using something like treesitter to basically place a thin layer over the top of a language which can translate snake_case to camelCase using a variety of rules (or even by also communicating with an LSP server) so that I can learn a language like Haskell comfortably.
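
                  As a rough sketch of the kind of rewrite rule I have in mind (string-level only, and the function name is made up; a real version would work on the syntax tree via treesitter):

                    -- toy rule: rewrite a snake_case identifier as camelCase
                    -- (illustrative only; ignores ALL_CAPS, leading underscores, etc.)
                    local function to_camel(name)
                      return (name:gsub("_(%l)", string.upper))
                    end

                    print(to_camel("read_http_request"))  --> readHttpRequest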

                2. 1

                  Oberon+ is a union and extension of the existing Oberon 90, Oberon-2 and Oberon-07 dialects (see https://en.wikipedia.org/wiki/Oberon_(programming_language)). Historically there is no “standard library” for Oberon, but there is the Oberon System, which is a full operating system with a lot of modules that are also available for custom applications. There was an initiative to define Oberon System independent standard libraries (see http://www.edm2.com/index.php/The_Oakwood_Guidelines_for_Oberon-2_Compiler_Developers), which my compiler supports. But I will eventually implement my own standard libraries; until then you can use any C library via the foreign function interface built into Oberon+. At http://oberon-lang.ch there is a language specification and some other articles; see also https://github.com/rochus-keller/Oberon.

            3. 2

              From what I can tell, the feeping creaturism is a hell of a lot more integrated than in other languages I’ve seen, and all of the language grammar in the above article is pretty well thought-out.

              1. 3

                Looking forward to that Oberon+ compared to Oberon-2 blog post, and hopefully some comparisons to Modula-3 as well.

                1. 2

                  Here is the link: https://oberon-lang.github.io/2021/07/16/comparing-oberon+-with-oberon-2-and-07.html; there are also other posts about design details. Modula-3 is a quite different and more complex (and more old-fashioned) language.

                1. 10

                  This is a pretty remarkable feat! The author has also created a compiler for the Oberon-07 language which also targets the same VM: https://github.com/rochus-keller/Oberon/ (This one is especially interesting because it’s compiling a language that supports pointers onto a runtime built for a language which does not have pointers.)

                  A full run of Benchmark testStandardTests takes 142 seconds with the C++ version and 161 seconds with the LuaJIT version. It is fair to conclude that the same program takes only a factor of 1.13 more time when run on LuaJIT compared to the native version.

                  Very impressive numbers.

                  1. 2

                    There’s an awful lot of C++ in these ostensibly lua projects, so I wonder if it makes heavy use of lightuserdata & uses FFI + native code for performance-critical pieces (but I haven’t looked too closely). It may be that the native code is just for the GUI stuff though…

                    1. 2

                      ostensibly lua

                      There’s no real reason that the compiler implementation and the compiler target need to match, but I agree that it’s weird to use so much C++ when a much more pleasant language is just … sitting right there.

                      1. 2

                        Well, while lua is very nice to use, it has various limitations that you’d naturally hit in a project like this. (For instance, there’s a hardcoded limit for how many string keys can exist in a table – which I tend to hit whenever I do markov models.) Also, a stock lua install isn’t going to have built-in support for things like sockets or graphics (and until luarocks became really common, it was easier to just roll your own bindings than to install a third party one).

                        LuaJit has standardized on 5.1 support last I heard, which means that it’s lost compatibility with libraries that have moved on to 5.2 and 5.3, or libraries that targeted mainline lua after the 5.2 release. Meanwhile, LuaJit has extended plain lua with a much nicer FFI system that makes writing C-side bindings largely unnecessary.
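
                        For example, calling straight into libc from LuaJIT looks roughly like this, with no C glue code to compile at all (a minimal sketch, LuaJIT only):

                          local ffi = require("ffi")

                          -- declare the C function we want to call; the declaration is
                          -- just copied from the C header, no binding layer is written
                          ffi.cdef[[
                          int printf(const char *fmt, ...);
                          ]]

                          ffi.C.printf("hello from %s\n", "LuaJIT")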

                        If I were doing a project like this, I would write as much as I could in pure lua & then write really minimal C or C++ for whatever was leftover (either to make bindings to graphics libraries and such, or to create lightuserdata wrappers around pointers and routines for pointer manipulation on those values). This appears to be what the author did, more or less, but I’m not sure because I haven’t looked closely.

                        (One downside to using FFI is that you’re then stuck with LuaJit, & it makes it harder to target other systems. In my last lua-based language project, I wanted to support LuaJit, plain Lua5.1, and the NodeMCU lua implementation for the ESP microcontrollers, and so I did a lot of polyfills – not that I ever got around to testing on NodeMCU, so that code may be completely broken.)
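
                        The polyfills are mostly tiny shims along these lines (a sketch of the general idea, not the actual code from that project):

                          -- paper over builtins that were renamed or moved between 5.1 and 5.2+
                          local unpack = table.unpack or unpack    -- unpack moved into table in 5.2
                          local loadstring = loadstring or load    -- loadstring was folded into load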

                        1. 1

                          to use so much C++ when a much more pleasant language is just

                          Lua was primarily designed as an embedded scripting language. There is an excellent API for embedding the VM into a host application and accessing its objects from the script. LuaJIT adds another powerful dimension to this with the FFI. From my point of view it is therefore logical to build the application architecture on this. I usually use C++ and Qt for my applications. For the Lua variant of the VM I replaced the interpreter and ObjectMemory classes, the actual core of the Smalltalk VM, with Lua code. The window the Smalltalk VM uses is still C++ (as are the IDE, if it is opened, and the parser for the VirtualImage). So most of the code is in Lua. In the meantime I have even migrated the BitBlt to Lua, which has only a small effect on the execution time.

                          Lua is a remarkable language, but in my opinion, like any dynamic, weakly typed language, it has clear limits. In Interpreter.lua you can find some hints about what I missed most in the language.

                      2. 2

                        Also, check out their OberonSystem which was compiled with this Oberon-07 compiler.

                        1. 2

                          This one is especially interesting because it’s compiling a language that supports pointers onto a runtime built for a language which does not have pointers.

                          I’m not as familiar with Oberon as I’d like to be, but does it actually support pointer arithmetic? If not, then I’d imagine it makes it a lot easier to implement on top of lua.

                          1. 1

                            Good point! It’s hard to find details on the web, but according to “Oberon-2 Programming for Windows”, arithmetic isn’t allowed:

                            From now on we will use the term “pointer” for variables as used in the assignment temp := p1. But it should be noted that no arithmetic operations are possible with variables of type POINTER. The only operations allowed are to check for equality, inequality, or NIL and to assign them.

                            https://books.google.com/books?id=YsGoCAAAQBAJ&pg=PA127&lpg=PA127&dq=oberon+pointer+arithmetic&source=bl&ots=J64QT6kS-l&sig=ACfU3U0_o2QgJnStCizLGmCl93F-bj5RQA&hl=en&sa=X&ved=2ahUKEwjGr97O4b7qAhWjNn0KHeBpC80Q6AEwCnoECAgQAQ#v=onepage&q&f=false

                            1. 1

                              It’s hard to find details on the web

                              Here are some interesting documents by the language’s inventor: https://people.inf.ethz.ch/wirth/Oberon/. There is a specification and a tutorial. And here is a free copy of a textbook: https://people.inf.ethz.ch/wirth/ProgInOberonWR.pdf

                              1. 1

                                Thanks. The people.inf.ethz.ch site was not resolving earlier this week when I went to look things up, but it’s back now. It doesn’t seem like a very good sign when the only non-Wikipedia source of information about a language can disappear like that, though. It’s a shame, because Oberon looks much more pleasant for low-level coding than C or C++.

                                1. 2

                                  It’s an interesting language, but it is also quite limited, which is of course an advantage for compiler writers. Keywords in capital letters only are not everyone’s cup of tea. Personally, I miss generic types too.

                            2. 1

                              It supports pointer arithmetic, but not in the regular language. If you want to do pointer arithmetic, you have to import the module SYSTEM, which offers some facilities for low-level programming. But my compiler doesn’t support most of them, because they cannot be implemented on top of LuaJIT.

                            3. 1

                              it’s compiling a language that supports pointers onto a runtime built for a language which does not have pointers

                              Pointers were actually not that difficult to implement, because Lua has a similar concept: a variable can reference a table, which is similar to a pointer. Trickier was call by reference, which is not natively supported by Lua. I wrote an article about this, which you can find here: https://medium.com/@rochus.keller/implementing-call-by-reference-and-call-by-name-in-lua-47b9d1003cc2
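
                              A very simplified sketch of one way to do it (boxing the value in a table; see the article for the actual approach and the call-by-name case):

                                -- one generic way to emulate a VAR (by-reference) parameter in Lua
                                local function inc(box)
                                  box[1] = box[1] + 1    -- the callee writes through the box
                                end

                                local x = { 41 }         -- the variable lives inside the box
                                inc(x)
                                print(x[1])              --> 42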

                              Very impressive numbers.

                              This is primarily thanks to LuaJIT. I made a few more changes and achieved even better results. But the bottleneck seems to be the drawing pipeline and thread synchronization of Qt.