1. 8
  1.  

  2. 7

    I did not watch the talk, but I support Makefiles. Makefiles have plenty of problems, but they are agnostic about what they act on, they support parallelism if you write them correctly, and they interact well with other tools. Many languages have language-specific build systems, and those almost always make the basic case easy but fall apart under any serious usage. In the end, I usually just want a build system to produce a Makefile for me so I can run it. That is what a build system I’ve built for Ocaml does. While it’s Ocaml-specific, it just produces a Makefile, and you can connect it to other Makefiles easily to build non-Ocaml tooling as well. The Ocaml specifics are only in how the Makefile is generated, not in how the entire project is built.
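    As a sketch of the parallelism point: when every rule declares its real prerequisites, `make -j` can schedule independent targets concurrently. A minimal, hypothetical example (file names made up; recipe lines must start with a tab):

```make
CC ?= cc

# prog depends on both objects; a.o and b.o do not depend on each
# other, so `make -j2` may compile them in parallel.
prog: a.o b.o
	$(CC) -o $@ a.o b.o

%.o: %.c
	$(CC) -c -o $@ $<

.PHONY: clean
clean:
	rm -f prog a.o b.o
```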

    1. 5

      Makefiles are boring the way goto is boring. They’re simple in some sense, but they allow all kinds of unstructured crap in the build definition. They can call any random program, so good luck running them in a secured environment or getting reproducible builds out of them, let alone having your IDE or CI system understand them. The only build system I’ve seen that gets it right is Maven.

      1. 7

        Can you expand on Maven? My only experiences with it have been abysmal. Specifically, it conflates a package manager and a build system. It also quickly falls apart if you have two dependencies which depend on different versions of the same third dependency. Most places I’ve seen simply turn that check off and cross their fingers. To me, this is a pretty obvious consequence of mixing building with package management and keeping dependency information in the source.

        1. 5

          I’m not sure what distinction you’re drawing between package management and build system - building is inherently coupled to dependency management.

          The Java classpath (pre-9) only allows a single instance of each class, unless you want to futz with classloaders (i.e. OSGi). Given this constraint, Maven does the best thing possible: it refuses to add more than one version of the same library to the classpath (a recipe for hard-to-diagnose failures, as you end up mixing code from multiple versions); it allows packages to specify a range for their dependencies, and errors out if the ranges don’t overlap; and if your dependencies only gave point versions, it chooses one in a best-effort, deterministic way.
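          A toy sketch (in Python, not Maven’s actual code) of the range behavior described above: intersect the requested ranges, fail loudly if they don’t overlap, and otherwise pick deterministically:

```python
# Each consumer requests an inclusive (lo, hi) version range for the
# same library; versions here are plain integers for simplicity.
def resolve(requests):
    lo = max(r[0] for r in requests)
    hi = min(r[1] for r in requests)
    if lo > hi:
        raise ValueError("conflicting version ranges: no overlap")
    return hi  # deterministic best effort: newest version every range allows

print(resolve([(1, 5), (3, 9)]))  # → 5
```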

          1. 5

            building is inherently coupled to dependency management.

            I disagree. How to get, unpack, install, and find dependencies is an orthogonal problem to how to build an artifact. Language-specific dependency managers are very popular, and I have never had a successful experience with one. Package management is something that should be elevated to the language-agnostic layer.

            Can you explain how package management is coupled with building?

            1. 3

              Package management is something that should be elevated to the language-agnostic layer.

              That’s the thing I have never had a successful experience with!

              Without integration, we would basically end up in C/C++ times again.

              In many cases the way I want to build has a direct impact on dependencies.

              For instance, imagine I want to use the same libraries, but want to cross-build my application for a different target. It’s really really necessary to have an automated, standard way to do that, because my dependencies might have dependencies too, and I don’t want to spend a day chasing random differences to compile my JVM application to JS.

              Another example: In my build tool, I can change language versions on the fly. It’s necessary that my build tool knows how to resolve dependencies, because otherwise the whole workflow will be broken.

              1. 2

                If people had spent the equivalent time working on a language-agnostic package manager that they have spent on ones for particular languages, I don’t think we would have this problem. All of the things you have mentioned can be solved in a package manager. Nix makes most of this quite easy, actually. Switching compiler versions is trivial: just run a new nix-shell to set up your environment.
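                As an illustration of that workflow, a minimal shell.nix (package names here are just examples); swapping the compiler means editing one line and re-entering `nix-shell`:

```nix
# Entering `nix-shell` in this directory puts exactly this toolchain
# on PATH, independent of what the rest of the system has installed.
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  buildInputs = [ pkgs.ocaml pkgs.gnumake ];
}
```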

                So yes, you’re right in a sense: doing interesting things at the package-manager level has historically been crap. But you’re also wrong: if we actually fixed it, it would be good. Instead I have to fight a package manager + build system for every language I work with. None of them solve the problem of dependency management or building particularly well, and all of them have severe problems because they repeat previous mistakes rather than learn from the past.

                1. 2

                  Nix is actually a good example of how most build systems have to worry about package management, but package managers aren’t necessarily build systems. Nix does invoke builds, but it does so by setting up the environment and shelling out to a makefile or other build system.

                  Not all build systems are package managers, either. Cabal, npm, and whatever Ruby’s thing is called all do both, but makefiles don’t do any form of package management; whoever is compiling the package is responsible for having installed the dependencies first. And makefiles are still very widely used for languages that don’t have their own alternatives.

                  Bazel is also an outlier in that it doesn’t do package management, but that’s because its preferred modality is really that every conceivable dependency is already part of your repository. :)

                  The Nix tools could certainly be extended to have a feature that lets package authors specify source files within a package, topologically sort them, and check timestamps or hashes to rebuild only the necessary ones. But they don’t do that at present. So they’re a package manager but not a build system.
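                  That hypothetical feature is small enough to sketch. Assuming dependencies are declared explicitly (all names below are made up), rebuilding only what is necessary is a topological sort plus a content-hash comparison:

```python
import hashlib
from graphlib import TopologicalSorter

def plan_rebuild(deps, contents, old_hashes):
    """deps maps each file to the files it depends on; contents maps
    files to their current text; old_hashes records the hashes seen on
    the previous build. Returns the files to rebuild, in build order."""
    order = list(TopologicalSorter(deps).static_order())  # dependencies first
    dirty = set()
    for f in order:
        h = hashlib.sha256(contents[f].encode()).hexdigest()
        if old_hashes.get(f) != h or any(d in dirty for d in deps.get(f, [])):
            dirty.add(f)  # the file changed, or one of its dependencies did
    return [f for f in order if f in dirty]

# First build: no recorded hashes, so everything is rebuilt.
print(plan_rebuild({"util.h": [], "main.c": ["util.h"]},
                   {"util.h": "v1", "main.c": "v1"}, {}))
# → ['util.h', 'main.c']
```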

                  I’m an enormous fan of Nix as the future of package management. I’ve been low-key asking about whether anyone else also feels that it would be a good idea for it to become a build system, but I’ve never really been met with any understanding of the difference.

                  1. 3

                    I think, unfortunately, these two things have been intertwined for no particular reason, and most people are so used to seeing them together that they don’t even consider that maybe they don’t need to be braided together. I think it also happens out of laziness: once you decide you want a build system for your favorite language, you quickly realize you also need a way to get the deps there, and you just try to solve both at once. I think Ocaml sort of got this right by accident. They had the build system there already and finally decided to add a package manager. Mixing the two wasn’t a viable option, so the package manager just does what Nix does and shells out. I would have preferred it if they had put their effort into making Nix more portable rather than making opam, but such is life.

                    I think Go has set the whole thing back as well. Not only does Go do both building and dependency management, but it tosses out the last several decades of dependency-management knowledge (following moving versions, for example). That is worse than pretty much all of the other language-specific package managers I’ve used, which at least allow pinning.

                    1. 1

                      Yeah, I agree with your explanation.

                      Honestly, what makes me want to reuse Nix is simply that the configuration language has surprisingly decent language-design praxis. :) I don’t think I’d even make it part of the same command-line tool, necessarily. Just another tool that happens to use the same concepts.

                      That’s an interesting perspective on Go, which I’m not qualified to comment on; I’ll try to notice in future. Thank you.

                  2. 1

                    If we had one tool for all languages, everyone would complain that it a) couldn’t capture all the details of their language, b) would be too complicated because of all the stuff that’s in there due to some random other language’s requirements, and c) wouldn’t be “opinionated” enough to guarantee a similar setup for similar projects and use cases.

                    Additionally, how would incremental compilation work? Would the tool ship with the binary compatibility rules of every language built-in?

                    Switching compiler versions is trivial, just run a new nix-shell to setup your environment.

                    But then I have to restart my build tool again (or it needs to constantly check for updates and reparse configuration in the background), thus breaking the workflow.

                    I’m quite happy with SBT. It does everything I need, and it helps me solve special one-off requirements of projects as well. Life would be a bit easier if the Java ecosystem didn’t have two dependency formats, Ivy-style and Maven-style, which are just different enough to be incompatible (despite both being Apache projects), but SBT shields you from that insanity pretty well. The support for JavaScript libraries is pretty nice, too, meaning you don’t have to deal with JS’s build-tool-of-the-week anymore.

                    None of them solve the problem of dependency management or building particularly well and all of them have severe problems because they just repeat previous mistakes rather than learn from the past.

                    Well, that’s maybe the case for things like Haskell, Go, Ruby, Python, JavaScript, etc. But those are just crap. It’s not necessary to be that bad, as other tools show.

                    1. 1

                      Additionally, how would incremental compilation work? Would the tool ship with the binary compatibility rules of every language built-in?

                      I’m not sure what you mean by this. What I’m suggesting is elevating dependencies to a system-wide package manager, something like Nix. Then you just work on your project with the deps installed. If the situation you’re describing is modifying a dependency and the thing that uses it at the same time, I don’t have a direct solution to that right now (although I don’t think something like Maven has one either). However, I think that putting effort into solving that, rather than every programming language growing its own package manager (or several!), is time well spent.

                      So, I don’t have a solution to every package-management situation you can think of, but I also haven’t thought very long about it, and I don’t see anything that is not solvable with some thought.

                      If we had one tool for all languages, everyone would complain that it …

                      Such is life, indeed. An unfortunate result of software being so soft.

                      1. 1

                        Making a change shouldn’t involve two separate tools, just because one was in the build definition and one was in a source file.

                        Otherwise this gets messy very fast when you have multiple clients building things (IDE, tests, linter).

                        1. 1

                          But it shouldn’t require two separate tools. Things that are decoupled can always be braided together in higher-level tools. The problem is that things are braided together at the bottom, which means you’re stuck with that braiding no matter how good or bad it is. I’m proposing we separate them into their orthogonal parts and let people build on top of that. As long as I know the bottom pieces I can use those, but you can use the top pieces if they make your life better.

                          1. 1

                            Things that are decoupled can always be braided together in higher up tools.

                            I think the amount of communication necessary between these tools causes more friction and work than it saves in solving the problem.

                            For instance, if I change a version or a compilation target somewhere, I want the tool to make the changes necessary to reflect that, but not until the compilation that’s currently running has finished. In the end, you need one tool which can serialize these change requests. There needs to be one tool as a single point of truth in your system.

                            The problem I see is not the lack of indirection, it’s the lack of people writing good tooling. Adding more indirection won’t solve that.

                            It’s the same issue when you are working with a VCS: You want to let the compiler run on the changes in branch A you just finished, but you also want to start working on branch B. If the build tool knew about the VCS, it could deal with all of that automatically, and even manage compilation in a way so that when you start working on branch C, branch B is only compiled after branch A finished compiling (for performance).

                            1. 1

                              I think the amount of communication necessary between these tools causes more friction and work than it saves in solving the problem.

                              I don’t see any reason for that to be true. In particular, the current setup is a burden on tool authors: if you want to support multiple languages, you have to support multiple package-management and build tools. Moving one or both of those to a higher level means the IDE needs to understand fewer things.

                              but not until the compilation that’s currently running has finished. In the end, you need one tool which can serialize these change requests. There needs to be one tool as a single point of truth in your system.

                              Why don’t you just do these things in parallel? Have an environment for each way you want to build stuff. Nix can have multiple environments existing side by side; it’s the default way to use the system.

                              The problem I see is not the lack of indirection, it’s the lack of people writing good tooling. Adding more indirection won’t solve that.

                              I’m not requiring more indirection; I’m suggesting building tooling around the actual problem it needs to solve, then optionally combining tools later. I have never seen a problem that did not benefit from separation of concerns.

                              1. 1

                                Why don’t you do just do these things in parallel? Have an environment for each way you want to build stuff.

                                Sure, that’s the default. I was more thinking about interactive changes, while other work is happening. Substitute anything else if you want.

                                1. 1

                                  Either I’m not understanding you or you aren’t explaining yourself well, but I just don’t understand what issue you’re referring to. How is the language-specific package-and-build-system solution any better than elevating it all to a system-wide one that does the same thing, but for everything?

                                  1. 1

                                    Because it doesn’t work in practice.

                                    I wish that weren’t the case, but maybe things will look different 20 years from now.

                    2. 1

                      I agree with much of this. At one point I even wrote a maven plugin to build Python with. I think one issue is that it’s hard to persuade a language ecosystem to use a build tool written in a different language (for understandable reasons).

                      Another is that a good build system necessarily offers a VM-like level of isolation for builds, and it’s only in the last few years that things like containerisation have taken off. Even then it’s not clear how to handle this in a language-agnostic manner. E.g. it would be madness to build Java inside a container because the JVM itself already provides a better level of isolation. Whereas Python is nominally a VM but it probably makes sense to build in a container given the idiomatic use of C extensions. But then how do you do that in a cross-platform way?

                      1. 1

                        Another is that a good build system necessarily offers a VM-like level of isolation for builds

                        I disagree; I think a good package manager offers a VM-like level of isolation of packages. This is exactly how Nix and nix-shell work. IDEs can be integrated into that fairly easily if the IDE authors want: instead of calling the compiler directly, call it with nix-shell in front, and you get your different compiler, different deps, cross-compiling, yadda yadda. The build process is the same in that case (hopefully); it’s just a matter of which deps are available and which compiler you are using. There are cases that break this, but for the most part I think those should probably be elevated to the package-manager level anyway.

                        1. 1

                          IDE integration isn’t just a matter of calling some shell commands. Your IDE knows what edits you’ve made, so it should know what’s upstream of them - shelling out and having the build system scan the mtimes to figure out the information you’ve just thrown away is wasteful. Any non-compilation steps also need to be understood by the IDE, e.g. if you’re using thrift to generate code from IDL then the IDE needs to understand the relationship between IDL files and code; it would be silly to specify this twice.

                          More importantly, you need to be able to run your test suite in the built environment. In my experience any nontrivial project ends up with some runtime code behaviour (services in the java world, I’d assume dlopen in C-land, in python obviously every dependency is resolved at runtime), so the test suite needs the declared dependencies. Which means whatever it is that knows about dependencies also needs to know how to run your test suite, or at least be deeply integrated with the thing that does. (Again, you could have your build system shell out to something that provides the environment in which to run the tests, but given that the calculation of “change to class X -> run tests Y and Z” is the same whether X is in an upstream library or in local code, it makes no sense to duplicate that logic).

                          At that point I think calling it a package manager rather than a build manager is just semantics.

                          1. 1

                            IDEs can integrate more tightly if they want; I was giving an example of how things can work. But instead of integrating against all of the different package managers for all of the different languages the IDE wants to support, it can integrate against one package manager that understands the whole system. For example, if a pip package depends on a C package, well, good luck. By elevating this problem higher, you get your C package as well.

                            What I don’t understand is why you and others think it is any less work for IDE authors, language authors, and users if every language has its own package manager. Even if each language has its own build system, it can still offload package management to the higher-level system. Even if the build system has to do run-time things, it can still ensure the system is set up correctly by calling out to the package manager. How is this worse? I’m not sure what duplicate logic you are referring to either, but I suspect we disagree enough that it isn’t really worth discussing.

                            At that point I think calling it a package manager rather than a build manager is just semantics.

                            Yes, I agree, considering semantics is the meaning of things and we’re talking about the meaning of things.

                            1. 1

                              What I don’t understand is why you and others think it is any less work for IDE authors, language authors, and users if every language has its own package manager. Even if each language has its own build system, it can still offload package management to the higher-level system. Even if the build system has to do run-time things, it can still ensure the system is set up correctly by calling out to the package manager. How is this worse?

                              A package manager and a build manager need to be so intimately integrated that they can only be written in the same language, or at least in languages with a highly structured common ABI (which in practice means the same VM). Perhaps that’s more an indictment of the current state of cross-language library support than anything else; nevertheless, here we are. If your build system shelled out to an external package manager, you’d spend more time marshaling and unmarshaling command lines and reconstructing information you’d thrown away than actually doing dependency management.

                              You seem to be advocating a framework+plugin style, which I think would be subject to all the problems of application frameworks (even if you could do it cross-language, which doesn’t sound fun) - it’s much better to provide compositional libraries. In theory I could see value in providing some elements of package management as a library that per-language build systems could use. But in practice I’ve found it’s very rarely worth wrapping a library from a different language rather than reimplementing it (or more likely finding an existing implementation) in the language I want to work in.

                              You could equally well ask why languages have their own regex implementations, or their own garbage collectors. Fundamentally writing a new package manager may well be easier than getting an FFI to work nicely.

                  3. 2

                    Package management is a term I’d use to mean something different - maintaining a native OS install.

                    Building a Java project is building a set of classes and packaging them up, fundamentally. If it’s a multi-module project, it might consist of doing this several times, and it means resolving a dependency graph to do so correctly. At that point including some remote, prebuilt modules in the build is no different. It exercises much the same requirements (in terms of resolving dependencies) and has much the same results (in that the end result of the build is dependent on the other classes it was built against, whether those are classes in the same module, a different module that was built as part of the same execution, or a module that was built elsewhere). When I’m making a change to my project, adding or updating a dependency is the same kind of action as adding or editing a source file, and should be handled the same way.

                    1. 1

                      I think you’re making an unnecessary distinction between your OS and a project you’re working on. A package manager can solve both cases and Nix is a great example of something in the right direction. But if you’re happy with the current situation then enjoy :)

                      1. 2

                        Maybe something like maven could be used to manage OS-level packages; I don’t really care about that use case one way or another. I do know that I’d be very unhappy working on an application with a build system and IDE that didn’t intimately understand the project’s library dependencies.

              2. 3

                They can call any random program, so good luck running them in a secured environment or getting reproducible builds out of them, let alone having your IDE or CI system understand them

                If only there were some sort of common set of programs that could be depended upon to be available in a *nix-like operating environment.

                Some sort of portable operating system interface, if you will. :|

                Upon learning that set of tools, it would seem that developers could write build scripts that worked reliably and, moreover, wouldn’t have to keep learning new tools as the fashions changed.

                1. 1

                  That’d be good, yeah. The only serious effort I’m aware of is the LSB, which distributions are abandoning; I’m certainly not aware of any projects that test their builds against the LSB test suite as part of the normal build process, and I’ve heard the test suites are bad in any case. There’s POSIX, but AFAIK its test suites only go in the other direction: there’s no easy way to run your build and certify that it will work on any POSIX-compliant system (or not).

                  Fundamentally I think the unix shell environment is too broad and underspecified an interface to properly test though. You need something small and well-defined (hence the popularity of containers).