1. 11
  1. 3

    See also AnySL, prior work from Saarland University’s Compiler Design Lab. The “SL” stands for Shading Language, so the naming connection to AnyDSL is obvious.

    1. 3

      Ah, and Leißa’s diploma thesis was on automatic SIMD code generation: http://www.cdl.uni-saarland.de/publications/theses/leissa_dipl.pdf

      1. 1

        I didn’t see that one, and it explains why AnyDSL has renderers as examples and GPU support from the get-go. Auto-vectorization is a tough technical issue, but AnyDSL seems to put the emphasis on partial evaluation (i.e. compile-time evaluation) and on collapsing the layers of abstraction.

      2. 3

        The language looked a lot like Rust at first. It turns out the “Impala” tab on the sidebar refers to this language, which mixes Rust syntax with functional flavors, like sane function types (as in f: fn(int) -> int), plus an @ operator for partial evaluation (as in let fxy = @f(x, y)).

        I was pretty confused at first since I mixed up partial application and partial evaluation, which are two different things. Partial application means applying some parameters to a function to get a new function closed over those parameters. Partial evaluation, which I had never heard of before, is a compiler optimization that precomputes everything known at compile time at the call site. It’s more than just inlining – it specializes the code of the function based on its static inputs, the way a programmer might simplify code they had inlined by hand. For example, factorial(5) becomes 5*4*3*2*1, which, as you might notice, strips not only the recursive calls but also the base-case check.
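
        To make that concrete, here’s a toy Python sketch of what a partial evaluator does to factorial(5) – this is just an illustration of the idea, not AnyDSL’s actual machinery, and all the names are made up:

        ```python
        # Ordinary recursive factorial: has a base-case check and recursion.
        def factorial(n):
            return 1 if n == 0 else n * factorial(n - 1)

        # A toy "partial evaluator": given the static input n, it unrolls the
        # recursion into a straight-line multiplication chain, so both the
        # recursive calls and the base-case check disappear from the residuum.
        def specialize_factorial(n):
            terms = [str(k) for k in range(n, 0, -1)] or ["1"]
            src = f"def factorial_{n}():\n    return {'*'.join(terms)}\n"
            namespace = {}
            exec(src, namespace)  # compile the specialized function
            return namespace[f"factorial_{n}"], src

        f5, src = specialize_factorial(5)
        print(src)   # def factorial_5(): return 5*4*3*2*1
        print(f5())  # 120
        ```

        The specialized function computes the same result as factorial(5), but its body is just the constant expression 5*4*3*2*1.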

        Although the docs say “More advanced examples can be found in the AnyDSL GitHub organization.”, I wasn’t able to find many examples except for this one.

        1. 2

          I also found the partial evaluation a bit mysterious, but it’s actually presented clearly in the AnyDSL paper: “[Partial Evaluation] partitions the input of the program and its execution into two stages. In the first stage, the program is partially evaluated on the static inputs (algorithmic variants, parameter values, target-machine dependent code) to produce the residuum. In the second stage, the residuum is fully evaluated on the actual, so-called dynamic inputs to produce the actual outputs.” Zig, I think, has a similar concept where expressions can be evaluated and expanded at compile time, but Impala (and Thorin) go quite far in producing a minimal residuum.
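
          The two-stage idea from that quote can be sketched in a few lines of Python – again just an illustration of staging with closures, not how Thorin actually works, and the names here are invented:

          ```python
          # Stage 1: consume the static input (the exponent n) and build the
          # residuum, a chain of multiplications with the recursion unrolled.
          def make_pow(n):
              if n == 0:
                  return lambda x: 1
              rest = make_pow(n - 1)
              return lambda x: x * rest(x)

          # Stage 2: the residuum runs on the dynamic input (the base x).
          pow3 = make_pow(3)   # residuum: x -> x * x * x * 1
          print(pow3(2))       # 8
          print(pow3(5))       # 125
          ```

          The point is that the static input is gone by the time the residuum runs: pow3 no longer checks or recurses on n at all.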

          If you haven’t looked at them, here are the advanced examples that the docs were most probably referring to: Rodent, Stincilla and Ratrace.

          1. 1

            Really nice explanation of the explanation! My learning is accelerating.

          2. 1

            Impala looks like OCaml + Lua + Rust, aka wunderbar.