1. 30

  2. 14

    I’m not sure about this. In the category of “yet another build system nobody has heard of” there are plenty of strong contenders. I use make only because it’s ubiquitous, not because it’s good.

    1. 5

      I hattteeee make. The idea is fine, but a Makefile tends towards disaster. I rarely see one that is properly maintained and commented. There is something about Makefiles that causes them to grow organically, way out of control, very quickly. And since the recipes are just shell scripts, they’re Turing complete and can do anything.

      You then don’t know which build rule to use, because there are three that look similar and no one remembers which does what. Worse, they drift from the actual implementation and just stop working after a refactor, and no one notices.

      I strongly believe that this is due to the Makefile format and not developers, because I’ve seen it so many times that it just isn’t isolated to one set of devs.

      I feel like build files that are built on the language you are primarily building for (e.g. Mage for Go) are the right choice. Unit testing is there. Type safety if you can get it. Obvious error checking requirements. Readable function composition. Heck go nuts and write an integration test that uses reflection to run every single build rule in your CI chain to ensure no breakage.

      I’m pretty tired of *nix usability being held back by the requirement of ubiquity. Something like the fish shell is obviously a better choice for most devs, but so many people still just use Bash and POSIX because they’re ubiquitous. I want to see a distribution that notices you’re trying to execute a program that isn’t installed, checks it against a whitelist of known-good packages, and just pulls it in before execution.
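
      That last wish is not far from what bash’s command_not_found_handle hook already makes possible. A rough sketch, assuming a hypothetical whitelist file and apt as the package manager (this is not an existing distro feature):

          # Bash calls this function when a command lookup fails, passing the
          # command name and its arguments. The whitelist file and the
          # "package name equals command name" shortcut are both assumptions.
          command_not_found_handle() {
              local cmd=$1
              if grep -qxF "$cmd" /etc/known-good-packages; then
                  sudo apt-get install -y "$cmd" && "$@"
              else
                  printf '%s: command not found\n' "$cmd" >&2
                  return 127
              fi
          }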

      1. 2

        I’m not sure that learning N good build systems is better than learning 1 not-great build system for life.

        Just like with Bash, it’s really easy to jump right into writing fresh Makefiles. This is a trap for both Bash and Make.

        Anecdotally, I’ve never seen someone study Bash or Make like they would Go or Python. I don’t think Make is mostly to blame. Just like you wouldn’t blame Go if your new coworker writes shitty Go code. Additionally, people don’t spend as much time writing Bash or Make as they do Go or Python, so they also practice less.

        Not taking the time to learn something and hardly ever practicing seems like a bad idea with any build tool. I’m willing to bet someone who doesn’t know Go can write a pretty shitty Mage file.

        Yes, the Make syntax isn’t the sexiest thing out there, but by putting in my time now, I’m betting that I’ll know something relevant 5, 10, 50 years from now. I don’t want to learn a bunch of different build systems for different projects every few years.

        1. 2

          This! Most of the time we deal with machines that are under our control. If we can’t have tools that make us productive there, we’re losing so much. For example, if we need to sporadically inspect or debug our fleet of machines, wouldn’t it be better to already have all the frequently used tools there? Not just for debugging, but for all the things that make us effective. Things that are installed by default are there either to satisfy POSIX or because distro maintainers thought they were needed. However, much of that only serves the average case, and many uses fall outside the comfort of two sigma. One of the reasons I started liking Nix is precisely because it is so simple to describe the composition of tools.
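
          For instance (the package names and the command are only illustrative), a throwaway toolbox on any machine that has Nix boils down to:

              # Drop into a shell where these tools exist, without installing
              # anything system-wide; the environment disappears on exit.
              nix-shell -p ripgrep jq htop strace

              # Or run a single command inside such an environment.
              nix-shell -p jq --run 'jq . status.json'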

          1. 1

            The power of redo is the fact that the build script can be in any language you want. Anything that supports #! can be a redo build script.
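
            For example, a default.o.do rule that builds any .o from the matching .c can be a plain shell script (the $1/$2/$3 convention sketched here follows the redo documentation; the compiler invocation is illustrative):

                #!/bin/sh
                # default.o.do: redo passes the target name in $1, the target
                # minus its extension in $2, and a temporary output file in $3.
                redo-ifchange "$2.c"    # declare a dependency on the source
                cc -c -o "$3" "$2.c"    # build into the temp file; redo renames it

            The same rule could start with #!/usr/bin/env python3 (or anything else with a #! line) and redo would run it just the same.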

          2. 3

            I use make solely for its ubiquity and familiarity, because those are the good parts of it. My makefiles are typically just calls to the build system I’m actually using: it might be easier and better to configure my project’s build with some other build tool, but it’s nice to be able to just type make to kick things off.
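
            Something like this hypothetical wrapper, where the real work happens in whatever build tool the project actually uses (CMake here is only an example; recipe lines are tab-indented):

                .PHONY: all test clean
                all:
                	cmake -S . -B build && cmake --build build
                test: all
                	ctest --test-dir build
                clean:
                	rm -rf build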

            1. 3

              Well, this version of redo has a pure shell fallback that you can just include in your project.

              1. 1

                That’s similar to why I use autotools. They’re the devil everybody knows how to deal with, and they carry no dependencies other than a POSIX environment at a user’s build time (unlike, say, CMake, which also wants itself to be present at build time; at least it’s slowly becoming more ubiquitous, but I’m not changing my stance until it’s in Debian’s build-essential).

              2. 4

                Back in 2010, shortly after this post was published, I worked with a friend to convert several non-trivial open source projects to using redo. Our biggest frustration was the 1:1 relationship between source and generated files. This doesn’t play well with Java.

                A great follow-up to this post is Build Systems à la Carte, previously discussed here. In it, the last several decades of build systems are summarised and their function formalised.

                1. 2

                  This is kind of interesting, and I see how something better than make but equally simple (so not something that generates makefiles, like cmake/autotools/meson) would be useful. However, I really don’t see myself using something where each target has to be its own file at the top level. It looks like it would clutter up a repo and would be kind of annoying to write, but more importantly, getting a holistic view of all the different targets, how they are implemented, and how they interact seems like it would be harder.

                  This seems like it might be sacrificing a bit too much usability in the name of a simple and clean implementation.

                  Also, a tool should probably have some knowledge of how C (or other target languages) works; otherwise, we end up with the current make situation, where there’s a bunch of people who write bad hand-rolled makefiles, a few people who write hand-rolled makefiles which actually do all the things a makefile needs to do, and a bunch of people who introduce a ton of complexity by using makefile generators. I don’t want to manually implement all the intricacies involved in dependency resolution, installation and uninstallation, linking, making my library usable from other code bases, etc. every time.

                  Another thing: the documentation’s example .do files contain lines like redo-ifchange $2.c, which would break if paths ever have spaces in them. Not that you would generally want spaces in your paths, but fixing that limitation seems like it would be one of the easy wins when trying to replace make.
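
                  Concretely, the fix in the .do scripts themselves is just ordinary shell quoting:

                      redo-ifchange $2.c      # as in the docs: word-splits on spaces
                      redo-ifchange "$2.c"    # quoted: survives paths with spaces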

                  1. 1

                    CMake/Autotools/Meson are for build configuration, not the actual build. Make is only for the build itself. With pure Make, you do the configuration part manually.

                    1. 1

                      I don’t really understand your point. With a well-written pure Makefile, files are compiled by $(CC) and/or $(CXX), .pc files are found with $(PKG_CONFIG), CFLAGS, CXXFLAGS and LDFLAGS are respected, make install puts stuff in $(DESTDIR)$(PREFIX), and so on, so “configuring” such a project generally amounts to setting the appropriate environment variables (or adding the appropriate variable definitions to make’s argv).
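
                      For a Makefile written that way, the whole “configure step” is just variable assignments on make’s command line (the paths and flags below are only illustrative):

                          # Pick the toolchain and flags at build time...
                          make CC=clang CFLAGS='-O2 -g' PKG_CONFIG=pkg-config
                          # ...and choose where `make install` stages files.
                          make install PREFIX=/usr/local DESTDIR=/tmp/stage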

                      Redo is obviously trying to appeal to the use case where you write your Makefile or redo files by hand; otherwise it’d be describing why it’s nicer to auto-generate redo files than it is to generate Makefiles and ninja files.

                      1. 1

                        You are looking for something simple, and you kind of define “simple” as leaving the configuration part to the user.

                        It is not fair to denounce CMake/Autotools/Meson and then ignore the problem they solve. Instead, you focus on the problem they explicitly do not solve but delegate to another tool like Make or Ninja.

                        Lucky you, if your project only relies on a few environment variables for configuration. Other projects need to link different libraries or compile different files, for example.

                        1. 2

                          I’m not really denouncing CMake/Autotools/Meson; what they do is good, if somewhat unnecessary for small projects where a simple Makefile is enough. They’re just not really relevant when discussing redo, because redo obviously isn’t competing against cmake, nor is it competing against auto-generated ninja files or Makefiles. It’s competing against hand-written Makefiles for use cases where hand-written Makefiles are suitable.

                  2. 2

                    This is a truly great tool.

                    1. 1

                      For reference, that implementation of redo is at least vaguely maintained, with the latest commits to master on GitHub 4 months ago.

                      1. 1

                        Everybody discussing build systems should have read Build Systems à la Carte first. The paper comes with its own build system, Shake, and the authors write:

                        Redo [Bernstein 2003][Grosskurth 2007][Pennarun 2012] almost exactly matches Shake at the level of detail given here, differing only on aspects like polymorphic dependencies §6.7.

                        1. 0

                          This is much simpler than, but has heavy conceptual overlap with, Google Blaze/Bazel. I’m wondering how much the latter was influenced by this.

                          1. 1

                            The two features that set Bazel apart are hermetic and distributed builds. Redo provides neither.

                            Redo has some overlap in that it does not need to parse the full build configuration up front, but only does so on demand.