1. 10

    There are numerous packages on Hackage which provide Big O annotations; for example, see the Data.Map.Strict module (persistent maps).

    I find it useful both as a rough baseline for my own code and as a signal that the author has thought through some of the higher-level API.

    Unfortunately it’s not completely pervasive, and Haskell documentation tends to vary wildly in quality, but a lot of the more popular base-level libraries do have these annotations.
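    As a concrete sketch of what those annotations buy you, the haddocks for Data.Map.Strict state a cost for nearly every function; the comments below paraphrase the documented bounds:

    ```haskell
    import qualified Data.Map.Strict as Map

    main :: IO ()
    main = do
      -- fromList is documented as O(n*log n) for unsorted input
      let m = Map.fromList [(1, "a"), (2, "b")]
      -- insert and lookup are each documented as O(log n)
      print (Map.lookup 2 (Map.insert 3 "c" m))  -- Just "b"
    ```

    Having the bound next to the type signature means you can reason about a composite operation’s cost without reading the implementation.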

    1. 2

      This is very well written and is worth reading for the section about prior work alone. I would have liked a closer look at LambdaPi since I am not familiar with it.

      1. 3
        1. Henk: A Typed Intermediate Language by Erik Meijer and Simon Peyton Jones contains a tutorial covering the Lambda Cube and can serve as a brief introduction to Pure Type Systems.
        2. Lambda Pi: A Tutorial Implementation of a Dependently Typed Lambda Calculus by Andres Löh, Conor McBride and Wouter Swierstra is a worked introduction to implementing a dependently-typed lambda calculus; the resulting Haskell implementation is available, which is useful for following along.

        I’d recommend reading those in order. The Henk paper in particular is great, and despite being from 1997 I haven’t personally found any better introduction to the topic.

        (That is, I recommend reading Henk before the actual link regarding LambdaPi or the reorganisation.)

      1. 10

        But in most large projects you will just write a script to generate these files.

        When I read statements like this, I think of the underpants gnomes business plan wherein one just glosses over the “magic” phase.

        1. 7

          It’s not unusual at all. The de facto architecture of build systems is now to separate them into high level and low level layers.

          This started with autotools and Makefiles. Autoconf and automake are separate high level languages (implemented as m4 macros, blech), which generate shell scripts and Makefiles. GNU Make does the actual execution.

          This architecture has become more codified with Ninja as the low level and GN (Chromium) or CMake (LLVM) as the high level, portable version.

          So he’s basically saying that Fac is a low-level build system: like Ninja, it doesn’t have Turing-complete features, and low-level build languages can get away without them, but high-level build languages need them. Android is moving away from 200K lines of Makefiles that included the GNU Make Standard Library, a build system bigger than most programs people write. The high-level language needs to be Turing complete, which isn’t a small undertaking; even the CMake developers have admitted that their language is kinda horrible.

          1. 3

            I don’t dispute the separation. But the wording (“just write a script”) belies the amount of work needed to make it into a full build system that you can actually use.

            1. 3

              I realise I’m probably being overly pedantic, but every time I see arguments about the need (or not) for a language to be Turing complete, I wonder if people would be better served by offering a more precise specification of the feature(s) they believe this will enable. A blanket statement that something is Turing complete only communicates the ability to simulate a Turing machine; there are very many valid programs that can be written without that particular capability.

              Agda, for example, is not Turing complete per se (partiality monad aside), yet it would not be mistaken for being in the same power-to-weight category as a Makefile.
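              To put a finer point on it, even in Haskell you can stay inside a fragment that is total by construction (folds over finite data, no general recursion); that fragment can’t simulate a Turing machine, yet it expresses plenty of real computation. A trivial sketch:

              ```haskell
              -- Total by construction: no general recursion, just a fold over a
              -- finite list, so this terminates for every non-negative input.
              factorial :: Integer -> Integer
              factorial n = foldr (*) 1 [1 .. n]

              main :: IO ()
              main = print (factorial 10)  -- 3628800
              ```

              That’s the kind of precision I’d like: name the computational features you need (bounded iteration, abstraction, etc.) rather than reaching for “Turing complete”.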

              1. 2

                Yeah, in this context “Turing complete” is perhaps not very precise, or doesn’t capture the whole issue. In particular, people often conflate two things: whether you can do arbitrary computation, and whether you have side effects.

                Both are design issues in a build system. A side effect would be reading the source files to discover dependencies from .h files, rather than the dependency graph being purely a function of what’s in the Makefile.

                For the case of builds, my assertion is that you need arbitrary computation: memory, loops, and conditionals. And although it doesn’t affect whether it’s Turing complete or not, you need abstraction power like functions and so forth.

                If there are better terms to have this discussion with, I would be interested in hearing them. But my overall point stands: low level build systems deal with fine-grained dependency graphs and parallel execution. High level build systems have “logic” for portability and code reuse. Build systems are real code now.
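                To make that computation/side-effect distinction concrete, here’s a sketch (the types and names are made up for illustration): a static dependency graph is a pure value, a function of nothing but the build description, while discovering dependencies by scanning sources for #include lines is a side effect:

                ```haskell
                import qualified Data.Map.Strict as Map

                type Target = String

                -- Pure: the graph is entirely a function of the build description,
                -- like a Makefile with explicit prerequisites.
                staticDeps :: Map.Map Target [Target]
                staticDeps = Map.fromList [("main.o", ["main.c", "util.h"])]

                -- Parsing #include "..." lines out of source text is pure...
                includes :: String -> [FilePath]
                includes src =
                  [ takeWhile (/= '"') (drop (length pre) l)
                  | l <- lines src, take (length pre) l == pre ]
                  where pre = "#include \""

                -- ...but actually discovering the dependencies requires the side
                -- effect of reading the file.
                discoverDeps :: FilePath -> IO [FilePath]
                discoverDeps path = includes <$> readFile path

                main :: IO ()
                main = print (includes "#include \"util.h\"\nint main(){}")  -- ["util.h"]
                ```

                The low-level layer can stay pure over a fixed graph; it’s the high-level layer that ends up needing both the effects and the abstraction power.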

          1. 42

            My initial, unfounded reaction is that I’d rather not see such threads here, if only because they have the potential to attract attention from tech recruitment circles.

            1. 8

              Ditto. Rather not.

              Edit: Having a #jobs channel has created some problems for a Slack community I’m in. Really attracts people that ignore or don’t care about group norms. Will play dumb when you call them on it.

              1. 1

                I kind of like watching them slink away with increasingly frail excuses for being a recruiter with no interest in the community.

                1. 2

                  I don’t enjoy humiliating people.

                  1. 2

                    Hm, I haven’t seen any humiliation. Usually just a question about how their offering is relevant to the community and an unrelated, dodging answer offered. Maybe I’ve just missed the ones where things got more aggressive.