1. 35

I just started a public document for this question. Nevertheless, contributing to a repo is a hassle compared to dropping a comment here, so here is your chance to rant about your most hated build system, and I will do the work to fit it into my document.


  2. 22

    Virtually all of them

    Dumping stuff in $HOME. Really?

    There is a dedicated folder, .cache (or $XDG_CACHE_HOME), and pretty much all tools choose to ignore it:

    Maven, Ivy, Cabal, SBT, Cargo, Gem, Bundler (congrats on adding two dirs to my $HOME), …

    It’s come to the point where half of my dot folders are things dumped there by various build tools.

    As a consequence, I have made my $HOME dir read-only, and I aggressively file bugs against any applications that fail to run as a result.

    1. 3

      Cargo is working on this! I agree that it’s annoying that they don’t follow XDG. However, I don’t think that XDG is really much of a thing outside of freedesktop-esque operating systems, i.e. it isn’t really followed on pretty much anything that isn’t Linux or Mach/Hurd. Obviously Windows doesn’t fall into this, but I can’t comment on macOS or any of the BSDs because I don’t use them.

      1. 3

        The point is to follow the applicable standards of the respective operating system, not roll your own thing that is wrong on all of them.

        There are well-published standards for Windows, macOS and Linux that need to be followed by library and application authors.

        Heck, I wrote a Rust library that provides the right paths to libraries/applications in a day or two. It’s not hard, developers just have to care about it and respect their users. Sadly, more often than not, their program is the most important thing, special, and doesn’t need to follow any platform rules.
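
        For illustration, a rough sketch of what following each platform’s rules can look like (this is not that library, just a minimal Rust example; the fallbacks mirror the XDG spec, Windows’ local application data, and macOS’s ~/Library/Caches):

        use std::env;
        use std::path::PathBuf;

        /// Per-user cache directory for `app`, following each platform's published
        /// convention instead of dumping another dot folder into $HOME.
        fn cache_dir(app: &str) -> Option<PathBuf> {
            let base = if cfg!(target_os = "windows") {
                // Windows: local (non-roaming) application data.
                env::var_os("LOCALAPPDATA").map(PathBuf::from)
            } else if cfg!(target_os = "macos") {
                // macOS: ~/Library/Caches
                env::var_os("HOME").map(|h| PathBuf::from(h).join("Library/Caches"))
            } else {
                // Linux/BSD: $XDG_CACHE_HOME, falling back to ~/.cache per the XDG spec.
                env::var_os("XDG_CACHE_HOME")
                    .map(PathBuf::from)
                    .or_else(|| env::var_os("HOME").map(|h| PathBuf::from(h).join(".cache")))
            };
            base.map(|b| b.join(app))
        }

        fn main() {
            // e.g. Some("/home/user/.cache/mybuildtool") on a typical Linux setup
            println!("{:?}", cache_dir("mybuildtool"));
        }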

      2. 2

        I think the Java tools are older than the XDG spec, so they cannot be blamed for not respecting something that was invented later. Now they have to be backwards compatible, of course.

        1. 3

          They can be blamed at the point when XDG came into existence, and much earlier for their lack of respect of Windows and macOS rules.

          It’s not hard to implement a solution that seamlessly migrates to the standard without disturbing existing usages.

        2. 1

          ls -a ~/ on my work machine = instant anxiety pangs. A terrible practice that unfortunately seems to be sustained out of pure habit.

          1. 2

            That’s why I made it read-only. There will never be more dot files in my $HOME than I have now, only fewer.

            This month I deleted .nano and .coursier. Another two gone. :-)

          1. 7

            This made me full of blood rage when I happened upon it a few months ago. I just want some well-named Markdown files to be automatically converted to HTML, and also concatenated, converted to HTML, and that HTML converted to PDF. All with good-looking file names with spaces in them, like how normal people name files.

          2. 30

            All of them:

            The fact that they exist at all. The build spec should be part of the language, so you get a real programming language and anyone with a compiler can build any library.

            All of them:

            The fact that they waste so much effort on incremental builds when the compilers should really be so fast that you don’t need them. You should never have to make clean because it miscompiled, and the easiest way to achieve that is to build everything every time. But our compilers are way too slow for that.

            Virtually all of them:

            The build systems that do incremental builds almost universally get them wrong.

            If I start on branch A, check out branch B, then switch back to branch A, none of my files have changed, so none of them should be rebuilt. Most build systems look at file modified times and rebuild half the codebase at this point.

            Codebases easily fit in RAM and we have hash functions that can saturate memory bandwidth, so just hash everything and use that to figure out what needs rebuilding. Hash all the headers and source files, all the command line arguments, compiler binaries, everything. It takes less than 1 second.
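
            As a rough sketch of that approach (illustrative only; DefaultHasher stands in for a real content hash like SHA-256 or BLAKE3, and the input list is hard-coded):

            use std::collections::hash_map::DefaultHasher;
            use std::fs;
            use std::hash::Hasher;
            use std::path::Path;

            /// Digest of everything that can affect a compile: source and header
            /// contents, the compiler binary, and the command-line flags.
            /// (DefaultHasher is a stand-in for a real content hash.)
            fn build_fingerprint(inputs: &[&Path], args: &[&str]) -> std::io::Result<u64> {
                let mut h = DefaultHasher::new();
                for path in inputs {
                    h.write(path.to_string_lossy().as_bytes()); // which file
                    h.write(&fs::read(path)?);                  // its exact contents
                }
                for arg in args {
                    h.write(arg.as_bytes()); // flags matter too
                }
                Ok(h.finish())
            }

            fn main() -> std::io::Result<()> {
                let inputs = [Path::new("main.c"), Path::new("util.h"), Path::new("/usr/bin/cc")];
                let fp = build_fingerprint(&inputs, &["-O2", "-Wall"])?;
                println!("fingerprint: {:016x}", fp);
                Ok(())
            }

            Record the fingerprint alongside the outputs; if the next run computes the same value, nothing needs to be rebuilt, regardless of what the mtimes say.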

            Virtually all of them:

            Making me write a build spec in something that isn’t a normal good programming language. The build logic for my game looks like this:

            if we're on Windows, build the server and all the libraries it needs
            if we're on OpenBSD, don't build anything else
            build the game and all the libraries it needs
            if this is a release build, exit
            build experimental binaries and the asset compiler
            if this PC has the release signing key, build the sign tool
            

            with debug/asan/optdebug/release builds all going in separate folders. Most build systems need insane contortions to express something like that, if they can do it at all.

            My build system is a Lua script that outputs a Makefile (and could easily output a ninja/vcxproj/etc). The control flow looks exactly like what I just described.

            1. 15

              The fact that they exist at all. The build spec should be part of the language, so you get a real programming language and anyone with a compiler can build any library.

              I disagree. Making the build system part of the language takes away too much flexibility. Consider the build systems in Xcode, plain Makefiles, CMake, MSVC++, etc. Which one is the correct one to standardize on? None of them, because they’re all targeting different use cases.

              Keeping the build system separate also decouples it from the language, and allows projects using multiple languages to be built with a single build system. It also allows the build system to be swapped out for a better one.

              Codebases easily fit in RAM …

              Yours might, but many don’t and even if most do now, there’s a very good chance they didn’t when the projects started years and years ago.

              Making me write a build spec in something that isn’t a normal good programming language.

              It depends on what you mean by “normal good programming language”. SCons uses Python, and there’s nothing stopping you from using it. I personally don’t mind the syntax of Makefiles, but it really boils down to personal preference.

              1. 2

                A minor comment: the codebase doesn’t need to fit into RAM for you to hash it. You only need to store the current state of the hash function and can handle files X bytes at a time.
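
                A small sketch of that, using only the standard library (DefaultHasher again standing in for a proper content hash): stream each file through the hasher a fixed-size buffer at a time, so memory use stays constant no matter how big the codebase is.

                use std::collections::hash_map::DefaultHasher;
                use std::fs::File;
                use std::hash::Hasher;
                use std::io::Read;

                /// Hash a file without ever holding more than 64 KiB of it in memory.
                fn hash_file(path: &str) -> std::io::Result<u64> {
                    let mut file = File::open(path)?;
                    let mut hasher = DefaultHasher::new();
                    let mut buf = [0u8; 64 * 1024];
                    loop {
                        let n = file.read(&mut buf)?;
                        if n == 0 {
                            break; // end of file
                        }
                        hasher.write(&buf[..n]); // feed only the bytes actually read
                    }
                    Ok(hasher.finish())
                }

                fn main() -> std::io::Result<()> {
                    println!("{:x}", hash_file("src/main.rs")?);
                    Ok(())
                }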

              2. 14

                When I looked at this thread, I promised myself “don’t talk about Nix” but here I am, talking about Nix.

                Nix puts no effort into incremental builds. In fact, it doesn’t support them at all! Nix uses the hashing mechanism you described and a not-terrible language to describe build steps.

                1. 11

                  The build spec should be part of the language, so you get a real programming language and anyone with a compiler can build any library.

                  I’m not sure if I would agree with this. Wouldn’t it just make compilers more complex, bigger and error prone (“anti-unix”, if one may)? I mean, in some cases I do appreciate it, like with go’s model of go build, go get, go fmt, … but I wouldn’t mind if I had to use a build system either. My main issue is the apparent nonstandard-ness between, for example, go’s build system and rust’s via cargo (it might be similar, I haven’t really ever used rust). I would want to be able to expect a similar, if not the same, structure for the same commands, but this isn’t necessarily a given if every compiler reimplements the same stuff all over again.

                  Who knows, maybe you’re right and the actual goal should be to create a common compiler system that interfaces to particular language definitions (isn’t LLVM something like this?), so that one can type compile prog.go, compile prog.c and compile prog.rs and know to expect the same structure. Would certainly make it easier to create new languages…

                  1. 2

                    I can’t say what the parent meant, but my thought is that a blessed way to lay things out and build should ship with the primary tooling for the language, but should be implemented and designed with extensibility/reusability in mind, so that you can build new tools on top of it.

                    The idea that compilation shouldn’t be a special snowflake process for each language is also good. It’s a big problem space, and there may well not be one solution that works for every language (compare javascript to just about anything else out there), but the amount of duplication is staggering.

                    1. 1

                      Considering how big compilers/stdlibs are already, adding a build system on top would not make that much of a difference.

                      The big win is that you can download any piece of software and build it, or download a library and just add it to your codebase. Compare with C/C++, where adding a library is often more difficult than writing the code yourself, because you have to figure out their (often insane) build system and integrate it with your own, or figure it out, then ditch it and replace it with yours.

                    2. 8

                      +1 to all of these, but especially the point about the annoyance of having to learn and use another, usually ad-hoc programming language, to define the build system. That’s the thing I dislike the most about things like CMake: anything even mildly complex ends up becoming a disaster of having to deal with the messy, poorly-documented CMake language.

                      1. 3

                        Incremental build support goes hand in hand with things like caching type information, which is extremely useful for IDE support.

                        I still think we can get way better at speeding up compilation times (even if there are always edge cases), but incremental builds are a decent target for making compilation a bit more bearable in my opinion.

                        Function hashing is also just part of the story, since you have things like inlining in C, and languages like Python allow for order-dependent behavior that goes beyond code equality. Though I really think we can do way better on this point.

                        A bit ironically, a sort of unified incremental build protocol would let compilers skip implementing incremental compilation themselves and allow build systems to handle it instead.

                        1. 2

                          I have been compiling Chromium a lot lately. That’s 77,000 mostly C++ (and a few C) files. I can’t imagine going through all those files and hashing them would be fast. Recompiling everything any time anything changes would probably also be way too slow, even if Clang were fast instead of averaging three files per second.

                          1. 4

                            Hashing file contents should be disk-io-bound; a couple of seconds, at most.

                            1. 3

                              You could always do a hybrid approach: do the hash check only for files that have a more-recent modified timestamp.
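
                              Something like the following, assuming the previous build’s timestamp and per-file digests were recorded somewhere; needs_rebuild and fake_hash are made-up names for illustration.

                              use std::fs;
                              use std::path::Path;
                              use std::time::SystemTime;

                              /// Files untouched since the last successful build are trusted on mtime
                              /// alone; newer-looking files get their contents re-hashed and compared
                              /// against the stored digest (a branch switch may have left them identical).
                              fn needs_rebuild(
                                  path: &Path,
                                  last_build: SystemTime,
                                  stored_hash: u64,
                                  hash_file: impl Fn(&Path) -> std::io::Result<u64>,
                              ) -> std::io::Result<bool> {
                                  let mtime = fs::metadata(path)?.modified()?;
                                  if mtime <= last_build {
                                      return Ok(false); // untouched since the last build: skip the hash entirely
                                  }
                                  Ok(hash_file(path)? != stored_hash)
                              }

                              fn main() -> std::io::Result<()> {
                                  fn fake_hash(_: &Path) -> std::io::Result<u64> { Ok(0) } // stand-in content hash
                                  let stale = needs_rebuild(Path::new("main.c"), SystemTime::now(), 0, fake_hash)?;
                                  println!("rebuild needed: {}", stale);
                                  Ok(())
                              }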

                            2. 1

                              Do you use xmake or something else? It definitely has a lot of these if cascades.

                              1. 1

                                It’s a plain Lua script that does host detection and converts lines like bin( "asdf", { "obj1", "obj2", ... }, { "lib1", "lib2", ... } ) into make rules.

                              2. 1

                                Codebases easily fit in RAM and we have hash functions that can saturate memory bandwidth, so just hash everything and use that to figure out what needs rebuilding. Hash all the headers and source files, all the command line arguments, compiler binaries, everything. It takes less than 1 second.

                                Unless your build system is a daemon, it’d have to traverse the entire tree and hash every relevant file on every build. Coming back to a non-trivial codebase after the kernel has stopped caching its files means a lot of file reads, which are typically slow on an HDD. Assuming everything is on an SSD is questionable.

                              3. 9

                                Things I don’t like about Nix:

                                • People find it scary, and I would like to fix that.
                                • It has no per-project incremental build support. Chromium fails to build in the last step? Sorry, you have to start over.
                                • The command line interface is obtuse and hard to understand. Some commands work very differently from others, which leads to extremely confusing behavior. Hopefully this will be fixed in 1.12.
                                • The evaluator is a bit slow and memory-inefficient, so corner cases like checking every package description across every architecture require too much RAM and CPU time.

                                Almost every other build tool:

                                • Undeclared. Dependencies.
                                • Improperly pinned dependencies without hashes, making it hard to know if the 1.0.0 you got today is the same 1.0.0 you got yesterday (hint: it isn’t always the same!)
                                1. 4

                                  The language badly needs a type system, and the cli tools are terrible. But it’s by far the best build/configuration management system I’ve ever seen.

                                  1. 2

                                    Undeclared. Dependencies.

                                    Could you elaborate? The problem that you easily forget dependencies in e.g. Makefiles? The problem that transitive dependencies are not specified properly in e.g. npm?

                                    1. 9

                                      Given he’s contrasting to Nix, I assume he’s talking about (eg) ‘you need libxml2 installed systemwide for this to build’ not being specified in a machine-readable way.

                                    2. 2

                                      I found it really hard to understand even though I spent many, many days reading docs/community discussions and contributed many PRs and fixes. I still don’t really understand how nix works very well lol.

                                    3. 6

                                      Here is a story about time spent in the compiler versus in build systems: the OCaml community (a functional language with a native-code compiler) struggled a lot with build systems and moved from Makefiles, OCamlbuild, and Oasis to now jbuilder. The compile time in big projects often dropped by a factor of 10 or more by moving to jbuilder. This shows how much time was spent in the build system versus the compiler, which is very fast. Build systems tend to put layers upon layers which often can’t parallelise tasks, or repeat steps (like dependency analysis) that have been done before.

                                      1. 2

                                        I am also mostly migrating to jbuilder, which I enjoy quite a bit if only for how easy it is to make libraries, but I think one of the reasons why it is fast is that it is deliberately rather inflexible. You can extend ocamlbuild to do a lot of custom things in your project, whereas doing this with jbuilder is difficult. Including oasis in this list strikes me as a bit odd, since while oasis in theory supports multiple build systems, the only one implemented is ocamlbuild.

                                        Of course there is the question of how complex a build a build system should support. Nowadays I am thinking that an opinionated build system is actually quite useful. In Clojure-land I quite liked that leiningen just worked, because there were only so many things it supported, whereas the complete freedom of calling ocamlc manually gave rise to at least half a dozen build systems which all do everything differently because nobody really agrees on how to structure projects.

                                        1. 1

                                          Initially I also did not see the point of using Oasis and only used ocamlbuild directly. This changed when I noticed how difficult it is to compile C bindings with ocamlbuild. Oasis takes care of creating the dreaded myocamlbuild.ml, which is required in more complicated projects.

                                      2. 5

                                        CMake has a hidden command line argument cache that breaks user expectations like this:

                                        $ cmake -DFOO=ON
                                        $ cmake -DBAR=ON # Surprise! It actually works like "cmake -DFOO=ON -DBAR=ON".
                                        # what you should be doing instead:
                                        $ rm -f CMakeCache.txt; cmake -DBAR=ON
                                        
                                        1. 5

                                          Li Haoyi has a fantastic piece that’s simultaneously the best explanation of how Scala’s de facto official build tool SBT works and also a great description of its fundamental problems: http://www.lihaoyi.com/post/SowhatswrongwithSBT.html

                                          I share most of his frustrations, both about the fundamental design issues and the incidental issues that plague users. However, I think most build systems are pretty dreadful. The only one I’ve used that I’ve been really enthusiastic about is Nix, but it still has many incidental issues that should be addressed.

                                          In general, I think any build tool that thinks of things in terms of a mutable directory full of files that need to be poked and prodded with tools in the correct order is not the way, fundamentally, we should be thinking about things. Thinking of things in terms of a chain of pure functions could make these things conceptually simpler, faster, and more reliable.

                                          1. 4

                                            node.js/npm

                                            • Dependency categorization is a bad abstraction. Just prod and dev, but no build-only, test-only, debug-only, or CI-only dependencies.
                                            • Artifacts are not portable because native add-ons are so common. Arguably this is a gripe about node.js the platform itself but one of the reasons a team may choose a scripting language is so they DON’T have to deal with C-level library linking errors and incompatibilities.
                                            • npm packages are bloated by default with docs, tests, random dev-only files when uploaded to the registry

                                            All of them

                                            • Even after how many decades, AFAIK there’s not really any cross-platform tool that can be relied upon to be pre-installed on any developer machine, even if its purpose is solely to facilitate installation of a language-specific and up-to-date build tool. On Linux and Mac it’s pretty safe to assume usable (perhaps old but not ancient) versions of python and bash are pre-installed, but that’s still not reliably the case on Windows AFAIK.
                                            • I don’t believe there’s much sharing of architecture or implementation code for common cross-language needs in a modern environment, like secure and efficient downloads, caching, integrity checking, archiving, compressing, etc.
                                            1. 4

                                              Aren’t build systems supposed to be pure functions? Given a set of inputs, do a thing that provides outputs. I want to explicitly define the inputs. I want to explicitly define the outputs. The build system should hold me to that. Let me define the concept of a filename. Let me define the programs that need to be used within it. Let me script the goram thing when I basically just want a step that is a shell script!

                                              I spent much of the last four years in Gradle and Maven. I spent much of the last year in make and SBT. I recently discovered tup and I’m trying to clean up a nasty makefile enough that I could move over to it.

                                              Unfortunately, I just keep on going back to make. It’s everywhere. It has reliable and readable syntax, for the most part. The differences between GNU make and BSD make are subtle and rarely encountered, at least for my not-actually-compiling-c uses. I just can’t quit you, baby.

                                              1. 2

                                                “Unfortunately, I just keep on going back to make.”

                                                One could always create a language that is functional like you describe but compiles to make. Likewise, make itself could be redone in a better language where the implementation API stays close to the make language, does lots of checks, runs in its native form for testing/debugging, and can similarly emit make output when done. One person even did a make in Prolog.

                                                1. 2

                                                  Interestingly enough, I’ve been planning out a tool for my specific purpose that builds a Makefile based on some inputs in a YAML or JSON file. Kinda like a ./configure script that only has to be run when new major components are added.

                                                  I want to do this to clean up financial reports stored in ledger format that I generate using make and pandoc.

                                                  The config would be something like this:

                                                  ledger:
                                                    prices: prices.db
                                                    reporting_dir: reports
                                                    transactions: transactions.ledger
                                                  reports:
                                                    - id: net_worth
                                                      query: balance Assets Liabilities --market
                                                      header: Net Worth
                                                      desc: "This shows net worth"
                                                    - id: cashflow
                                                      query: balance Income Expenses --period ${YEAR} --invert
                                                      header: Cashflow
                                                      desc: "This shows total cashflow. A positive amount indicates a surplus."
                                                  

                                                  and it would produce a Makefile that looks something like this:

                                                  REPORTING_DIR = reports
                                                  TRANS_FILE = $(shell yaml_lookup -f config.cfg ledger.transactions)
                                                  LEDGER = ledger
                                                  LEDGER_WITH_FILE = ledger -f $(TRANS_FILE)
                                                  
                                                  all: $(REPORTING_DIR)/statement.html
                                                  
                                                  $(REPORTING_DIR)/statement.html: $(REPORTING_DIR)/net_worth.md $(REPORTING_DIR)/cashflow.md
                                                      cat $^ | pandoc -t html > $@
                                                  
                                                  $(REPORTING_DIR)/net_worth.md:
                                                      echo -n "# " > $@
                                                      yaml_lookup -f config.cfg "reports[id = 'net_worth'].header" >> $@
                                                      echo >> $@
                                                      yaml_lookup -f config.cfg "reports[id = 'net_worth'].desc" >> $@
                                                      echo -e "\n\`\`\`" >> $@
                                                      $(LEDGER_WITH_FILE) $(shell yaml_lookup -f config.cfg "reports[id = 'net_worth'].query") >> $@
                                                      echo -e "\n\`\`\`" >> $@
                                                  
                                                  $(REPORTING_DIR)/cashflow.md:
                                                      echo -n "# " > $@
                                                      yaml_lookup -f config.cfg "reports[id = 'cashflow'].header" >> $@
                                                      echo >> $@
                                                      yaml_lookup -f config.cfg "reports[id = 'cashflow'].desc" >> $@
                                                      echo -e "\n\`\`\`" >> $@
                                                      $(LEDGER_WITH_FILE) $(shell yaml_lookup -f config.cfg "reports[id = 'cashflow'].query") >> $@
                                                      echo -e "\n\`\`\`" >> $@
                                                  

                                                  Hell, could even do the yaml lookups in the Makefile build step.

                                                  1. 1

                                                    The first one looks so much clearer than the latter. That’s the kind of benefit I’m talking about. Same thing we do with SQL-to-internal-format and compiling high-level languages to C.

                                              2. 6

                                                GNU Autotools: just kill this horrific pile of garbage with fire. Especially terrible when libtool is used. Related: classic PHK rant.

                                                CMake: slightly weird language (at least a real language which is miles ahead of autocraptools), bad documentation.

                                                Meson: somewhat inflexible (you can’t even set global options like b_lundef conditionally in the script!) but mostly great.

                                                GYP: JSON files with conditions as strings?! Are you serious?

                                                Gradle: rather slow and heavy, and the structure/API seems pretty complex.

                                                 Bazel/Buck/Pants (nearly the same thing): huge mega build systems for multiple languages that take over everything, often with little respect for those languages’ build/package ecosystems. Does anyone outside Googlefacetwitter care about this?

                                                Grunt, Rake, many others: good task runners, but they’re not build systems. Do not use them to build.

                                                1. 6

                                                  Related: classic PHK rant.

                                                   This one is even better since its observations apply to even more FOSS than libtool. It also has some laughable details on that, along with the person who wrote libtool apologizing in the comments, IIRC.

                                                  1. 3

                                                     I recalled that too, but it was David MacKenzie of Autoconf who popped up to apologize.

                                                    1. 1

                                                      Oh OK. Thanks for the correction. At least one owned up to their mess. :)

                                                  2. 3

                                                    FWIW: bazelbuckpants seem to be written for the proprietary software world: a place where people are hesitant to depend on open-source dependencies in general, and people have a real fear (maybe fear is strong, but still) of their dependencies and environment breaking their build. I use them when I’m consulting, because I can be relatively certain that the build will be exactly the same in a year or so and I don’t like having to fix compilation errors in software I wrote a year ago.

                                                    1. 2

                                                      I’m with you on Grunt, but Rake is actually a build tool with Make-style rules and recipes for building and rebuilding files when their dependencies change. There’s a case that Rake is just Make ported to Ruby syntax. It’s just more commonly used as a basic task runner.

                                                      https://ruby.github.io/rake/doc/rakefile_rdoc.html

                                                      1. 1

                                                        I think Make is also somewhat close to a task runner. It has dependencies, but not much else. You write compiler invocations manually…

                                                        1. 1

                                                          It sort of has default rules for building a number of languages, though these aren’t terribly helpful anymore.

                                                          I also use Make as task runner. Mostly to execute the actual build system, because everybody knows how to run make and most relevant systems probably have Make installed in one form or another.

                                                      2. 1

                                                        We use Pants here at Square, in our Java monorepo. It works quite nicely, actually. For our Go monorepo, we just use standard Go tooling, but I’ve volunteered to convert to Pants if anyone can get everyone to move to a single monorepo. They won’t, because every Rails project has its own repo, and the Rails folks like it that way.

                                                      3. 3

                                                        A few off the top of my head (I mostly have experience with Make, Nix and Cabal):

                                                        • Tools which assume they’re interacting with a user, making scripting harder. This is actually an umbrella problem for some of the others below.
                                                        • Special snowflake languages (config or script). The problem isn’t whether or not it’s hard for a person to learn the syntax; it’s that any new language is automatically incompatible with all existing tooling (parsers, pretty-printers, formatters, editor shortcuts, etc.). Just use JSON/XML/s-expressions unless there’s a very compelling reason not to (and “too many parens” isn’t a compelling reason; see e.g. sweet expressions, etc.). Guix seems to be the least offensive in this regard ;)
                                                        • Closed ecosystems. For example, Cabal (by default) looks up package names from Hackage. If I want to use e.g. a git repo, I need to add it using a separate command invocation.
                                                        • Language-specific assumptions. If a Haskell package depends on other Haskell packages, Cabal can build it. If it depends on some non-Haskell package, then it can’t. Except for special snowflakes like zlib.
                                                        • Hard-coded special cases. For example, looking up some particular tools (e.g. happy) but not allowing users to specify their own tools to be looked up.
                                                        • Implicit changes in behaviour due to the environment, e.g. whether or not certain directories exist, or their contents, when those directories aren’t explicitly defined on the commandline or some env var. Cabal was notorious for this, prior to the “sandbox” feature.
                                                        1. 1

                                                          Closed ecosystems. For example, Cabal (by default) looks up package names from Hackage. If I want to use e.g. a git repo, I need to add it using command invocation.

                                                          Cabal is fixing this

                                                          Hard-coded special cases. For example, looking up some particular tools (e.g. happy) but not allowing users to specify their own tools to be looked up.

                                                          This too I think, with build-tool-depends

                                                        2. 3

                                                          The number one thing I hate about most build systems is that they aren’t hermetic. Some of them get closer than others, but aside from Bazel and Nix almost none of them are truly hermetic. (edit: Buck and Pants are similar to Bazel in their goals, but I’ve not evaluated them)

                                                          Hermeticity

                                                          A build system should enforce hermeticity for any given build to the extent that the operating system and language support it. If it doesn’t, then a whole host of problems will occur, including but not limited to:

                                                          • The It compiles on my machine problem.
                                                          • The It doesn’t compile on my machine problem.
                                                          • The I forgot to specify this build dependency problem.
                                                          • The I have to “make clean && make all” again problem.
                                                          Language agnostic hermeticity.

                                                          I almost never work on projects for work that are a single compiled language. At a minimum I’ll have to compile some variant of JavaScript. You can’t have a hermetic build for a project if you need two separate build systems to compile it. It’s nice when a language includes build tooling out of the box, but as soon as you need to support multiple languages with dependencies between them, you need a build system that supports them both and also enforces hermetic builds for them both.

                                                          Specify all the inputs.

                                                          You can’t have hermetic builds if you can’t specify all the inputs for a build and the flow of those inputs through the various task dependencies. The nice thing about this is that you get some amount of incremental builds for free. The build system can determine whether any given build task has had its inputs change and just reuse the outputs if they haven’t. You also have a foundation for distributed builds and distributed build caching to help speed up builds even further for really large organizations and codebases.
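
                                                          For illustration, a rough sketch of that idea (all names are made up, and DefaultHasher stands in for a real content hash): the cache key for a task is derived purely from its declared inputs and its command line, so anything undeclared simply cannot affect it, and unchanged inputs mean the cached outputs can be reused, locally or from a remote cache.

                                                          use std::collections::hash_map::DefaultHasher;
                                                          use std::fs;
                                                          use std::hash::Hasher;
                                                          use std::path::PathBuf;

                                                          /// A build task with every input declared up front.
                                                          struct Task<'a> {
                                                              command: &'a str,
                                                              inputs: &'a [&'a str], // sources, headers, tools, configs: all of it
                                                          }

                                                          /// Cache key derived purely from the declared inputs and the command line.
                                                          /// Anything not declared simply cannot influence the key.
                                                          fn cache_key(task: &Task) -> std::io::Result<u64> {
                                                              let mut h = DefaultHasher::new();
                                                              h.write(task.command.as_bytes());
                                                              for input in task.inputs {
                                                                  h.write(input.as_bytes());
                                                                  h.write(&fs::read(input)?);
                                                              }
                                                              Ok(h.finish())
                                                          }

                                                          fn main() -> std::io::Result<()> {
                                                              let task = Task { command: "cc -O2 -c main.c", inputs: &["main.c", "util.h"] };
                                                              let dir = PathBuf::from(format!("cache/{:016x}", cache_key(&task)?));
                                                              if dir.exists() {
                                                                  println!("inputs unchanged, reusing {}", dir.display()); // incremental build for free
                                                              } else {
                                                                  println!("would run `{}` and store outputs in {}", task.command, dir.display());
                                                              }
                                                              Ok(())
                                                          }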

                                                          1. 2

                                                            Someone needs to trash cargo, because it seems like a lot of people really like it. However, it’s a build system, so it must be bad.

                                                            1. 1

                                                              I don’t think they’ve solved the Debian integration yet? Building without network access, using shared libraries, etc.

                                                            2. 2

                                                              Has anyone here tried xmake? I’m considering it for my personal projects. The configuration language is a Lua dialect and it looks about sane, at least for a build system (I have low expectations…)

                                                              1. 2

                                                                Linkers are slow.

                                                                1. 3

                                                                  bfd is slow. lld is pretty damn fast!

                                                                2. 1

                                                                Could you please clarify whether you are considering just plain Make or also GNU Make extensions in your document? If GNU Make is considered, depending on a directory is certainly possible with the | (order-only prerequisite) syntax.

                                                                Also, could you please add Mk (from Plan 9) to the list? It is supposed to be a better Make.

                                                                  1. 1

                                                                    Mk is Make reduced to its essentials. That removes a lot of cruft but also some useful features. People probably debate which things are cruft or useful, though.

                                                                  2. 1

                                                                  cmake: you need to create a build dir and compile in it (at least most projects using cmake expect that). That would be a great feature, but you are required to delete this directory and create it again if something goes wrong (the same make clean problem, but in berserk mode).

                                                                  I still can’t figure out how its language works and why everything is based on setting global variables instead of return values. Some of these global variables are set in the “cache”, so you need to delete your build directory and start again.

                                                                  Most people are copying snippets from one config to another, just like in autotools.

                                                                  ninja: build targets are treated as real files with fixed paths on the filesystem instead of some discardable data. But it’s still a great replacement for make when used with other build systems.

                                                                  bazel, buck: great concepts but not very usable for projects outside Google and Facebook. Very hard to include external dependencies: the officially documented way to do it is to put the source code of the external dependency into the repo holding all of your megacorp’s projects (a monorepo), then create build files for each dependency by hand. For comparison, cmake has the very handy ExternalProject_Add. And there is almost no cross-compilation: I tried to make Windows binaries with these tools with no success.

                                                                  Buckaroo is for adding external projects to Buck easily, but it doesn’t have a concept of build flags (which is crucial in the C world), and no cross-compilation support. At least it has a good collection of Buck build files for various popular libs.

                                                                  P.S. I almost can’t write code in C/C++, but I’m familiar with these tools as a user of software that is built with them.

                                                                    1. 2

                                                                      re: Bazel and external dependencies.

                                                                    The recommended way is actually to use a workspace rule in your WORKSPACE file, the most common one being new_http_archive.

                                                                    In practice these work absolutely fine. You can also, if you wish, check in the whole dependency and get the benefits of a monorepo, but it’s not necessary if a monorepo is not something you care about.

                                                                    2. 1

                                                                      I wrote 3 substantial makefiles from scratch and wrote a couple posts about GNU Make:

                                                                      http://www.oilshell.org/blog/2017/05/31.html – Build System Observations

                                                                    http://www.oilshell.org/blog/2017/10/25.html – Comments About Shell, Awk, and Make

                                                                      I believe that shell has a horrible syntax but good semantics. Make has both bad syntax and bad semantics. It is conceptually flawed – as I mention and someone else mentioned in this thread, it has a confused execution model that is not quite “functional”.

                                                                    I still want to build a tool for Oil / with Oil that takes advantage of having a complete shell parser, but for this reason, I don’t think it will follow the philosophy of being very compatible. It will necessarily have different semantics.

                                                                      http://www.oilshell.org/blog/2016/11/14.html

                                                                      1. 1

                                                                      Another interesting make flavor is BSD bmake, which allows monitoring and updating of dependencies. It would be nice to see that also in the list.

                                                                        1. 1

                                                                        My point about what is annoying about Make: by design, Make is stateless; it only looks at the file system to understand the state of a build. This is often not enough in a complex build. For example, just by looking at a file.o one does not know what flags were in effect when it was built. It is too tedious to encode this knowledge into the file name or directory structure (which Make is not good at handling anyway). However, a complex build will compile files with different flags for performance or debugging. I conclude that a build system should keep state between invocations to track this, and any build system that does not needs a clever way to lay out artefacts in the file system to simulate it.

                                                                          1. 1

                                                                            Today I found this blog post about Make, which I agree with: http://nibblestew.blogspot.co.uk/2017/12/a-simple-makefile-is-unicorn.html

                                                                          2. 1

                                                                            Everything that is not make: I’d like not having to install external software just to build yours, thank you.

                                                                            autotools: It’s 2017. If you check for rudimentary C89 compliance anyway, I think you can expect that stdlib.h does, in fact, exist. I’d rather have a solution for missing strlcpy, strlcat, or the arc4random family that also provides me with something I can actually use. If all I get is a define saying it’s broken, I’ll have to ship my own anyway, and at that point there’s no reason to actually use the stdlib version. And if you could stop checking for FreeBSD 2.x in a C11 codebase, that’d be great.

                                                                            make: Writing portable/POSIX-conformant Makefiles, i.e. ones that do not require installing GNU make on a system, is hell. A lot of the useful implicit variables are GNU extensions. Autotools forces GNU make anyway, too.

                                                                            Non-GNU make (OpenBSD, I’m looking at you): Seriously, can we have some of the GNU implicit variables yet? Thank you.

                                                                            CMake: Terrible documentation. I suspect that’s because they’re trying to sell their CMake book.