1. 10
    1. 2

      Note: I do not mean to discount this research. I am asking why we don’t have this yet.

      If you write a lexer in a functional style, and you write a parser in at least a little bit of a functional style, can you get this for free? This is all a bit handwavy, but I thought compilers like GHC and maybe others could do stream fusion or loop fusion or something like that. It would be really nice to automatically be able to skip all the overhead in lexing/parsing, or operating over compressed data (reading structures out of a zip file, for example), or …
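      To make the hoped-for payoff concrete: what fusion would buy here is deforestation of the intermediate token list. A sketch of the idea (in Python rather than Haskell, with a toy grammar invented for the example — this illustrates the effect, not GHC's actual stream-fusion machinery):

      ```python
      # Toy grammar: expr ::= INT ('+' INT)*
      # Two-pass version: the lexer materializes a token list that the
      # parser immediately consumes -- the allocation fusion would remove.

      def lex(src):
          tokens, i = [], 0
          while i < len(src):
              if src[i] == '+':
                  tokens.append(('PLUS', '+')); i += 1
              elif src[i].isdigit():
                  j = i
                  while j < len(src) and src[j].isdigit():
                      j += 1
                  tokens.append(('INT', src[i:j])); i = j
              else:
                  raise ValueError(f"unexpected {src[i]!r}")
          return tokens

      def parse(tokens):
          total = int(tokens[0][1])
          for k in range(1, len(tokens), 2):
              assert tokens[k][0] == 'PLUS'
              total += int(tokens[k + 1][1])
          return total

      # "Fused" version: lexing and parsing interleaved in one pass over
      # the characters; no token list is ever built. This hand-written
      # form is what an automatic fusion pass would ideally produce.
      def parse_fused(src):
          total, i = 0, 0
          expect_int = True
          while i < len(src):
              if expect_int:
                  j = i
                  while j < len(src) and src[j].isdigit():
                      j += 1
                  total += int(src[i:j]); i = j
              else:
                  assert src[i] == '+'; i += 1
              expect_int = not expect_int
          return total
      ```

      Both versions compute the same result; the fused one skips the token allocations entirely, which is the overhead the comment is asking the compiler to eliminate automatically.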

      1. 2

        We propose a deterministic variant of Greibach Normal Form that ensures deterministic parsing with a single token of lookahead and makes fusion strikingly simple.

        I don’t expect a generic compiler optimization to produce an effect as specific as what’s in the paper. You generally need to write code in a specific form (a DSL), and use optimizations specific to the DSL.

        In this case, the lexer and parser must be converted to DGNF, a format they just invented, before their new optimization algorithm can do its work.
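        For intuition about why a Greibach-style form helps: when every alternative of a production begins with a terminal, one symbol of lookahead is enough to pick the production, which is the determinism property DGNF guarantees. A minimal sketch, with a grammar invented for illustration (this is ordinary recursive descent, not the paper's fusion algorithm):

        ```python
        # Hypothetical grammar, already in a Greibach-like form where
        # every alternative starts with a terminal:
        #
        #   S -> 'a' S | 'b'
        #
        # The leading terminal of each alternative is distinct, so the
        # next input symbol alone selects the production.

        def parse_S(src, i=0):
            """Return the index just past one S, or raise on failure."""
            if i >= len(src):
                raise ValueError("unexpected end of input")
            if src[i] == 'a':          # production S -> 'a' S
                return parse_S(src, i + 1)
            if src[i] == 'b':          # production S -> 'b'
                return i + 1
            raise ValueError(f"no production for {src[i]!r}")

        def accepts(src):
            try:
                return parse_S(src) == len(src)
            except ValueError:
                return False
        ```

        Because the dispatch is on a single terminal, a fusion pass can splice the lexer's character-level automaton directly into each branch, which is what makes the conversion to DGNF the precondition for their optimization.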

        1. 1

          In particular, they wouldn’t have seen the 2-7x speedups they report if the compiler were already applying their optimization.

      2. [Comment removed by author]