If you like Make (as opposed to shell bits embedded in JSON!), you’ll probably like Ninja even more. It looks like a Makefile, and for simple things you can write it by hand. But it’s also nice to generate it from Python, Ruby, or any language.
Unfortunately there don’t seem to be many good tutorials on the web, since it’s mainly used as a low-level component of CMake and Meson and so forth, but it’s very easy, and the docs are good: https://ninja-build.org/manual.html#_writing_your_own_ninja_files
cflags = -Wall
rule cc
  command = gcc $cflags -c $in -o $out
build foo.o: cc foo.c
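For example, generating that same file from a script is mostly string formatting. A minimal Python sketch (the source list here is made up, and this isn’t anyone’s actual wrapper):

# generate_ninja.py - emits a build.ninja like the snippet above
sources = ["foo.c", "bar.c"]  # hypothetical source list

with open("build.ninja", "w") as f:
    f.write("cflags = -Wall\n")
    f.write("rule cc\n")
    f.write("  command = gcc $cflags -c $in -o $out\n")
    for src in sources:
        # One build edge per source file
        f.write("build %s: cc %s\n" % (src.replace(".c", ".o"), src))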
I wrote 3 Makefiles from scratch for Oil a few years ago (100 - 500 lines each), and largely regret it. I recently ported a huge portion to Ninja, and then realized I should have just started with that.
The idea behind Make is sensible, but the tool itself (including GNU make and others) is fundamentally flawed and limited. The language doesn’t compose, which means you get more and more bugs as the Makefile grows.
On the other hand, Ninja is faster and scales well because it has so few features, and you can debug it by reading it.
It’s basically a faster, simpler, “cleaned up” Make.
Maybe another misconception about Ninja is that it is primarily for building C/C++. It does have a few features for that use case (so does Make), but otherwise it’s a generic build tool based on timestamps.
FWIW, the use cases I have should all be incremental and parallel, so all of them are a good fit for Ninja. I started with a trivial wrapper and then ended up with a more elaborate one: https://www.oilshell.org/blog/2022/10/garbage-collector.html#declarative-ninja-mini-bazel
For the extremely trivial (e.g., providing a standardized make build for all your org’s projects, which just directly calls cargo build / cabal build / whatever build), I think make is overall better because it’s almost definitely already installed.
But I think if one’s scripting needs seem to be growing at all beyond the extremely trivial, ninja is one of the first places to look.
I’d rather construct a build graph out of my own intestines than use make.

Even that would be an improvement over webpack and friends.

I’ll admit I don’t know webpack very well since I’m not primarily a front end developer. However, every single time I touch webpack (and sometimes even when I don’t), the whole thing explodes into an unintelligible spaghetti monster. I’d rather deal with C++ template errors than anything touching webpack.
To be fair, we’re all talking about C/C++ builds, and the OP isn’t. However I believe Ninja would also be better for web builds, as my other comment here explains. (You’ll be reinventing stuff that is “canned” in typical JS tools, but you’re also doing that with Make.)
Makefiles get a LOT of guilt-by-association from autotools. For years I avoided learning about them because I didn’t want to get sucked into a black hole of insanity, but it turns out if you use Make on its own (not for C or C++), it’s great!
The FreeBSD build system is pure bmake, no autotools or anything else. One time, I wanted to rewrite in C++ a file that the yacc output included, which required changing the rule that compiled the yacc output so that it compiled as C++. After half a day of trying, I gave up.
GNUstep Make uses some autotools stuff when you install it but is then a pure gmake environment. I suffered with it for years because it makes a bunch of assumptions that are never quite what I want and is a huge pain to use for anything that isn’t exactly what it was intended for.
I have used large build systems in pure make, in various dialects, and they have always been the ones where I have had to deal with the most fragility (impossible to change, parallel builds subtly broken, and so on). CMake and friends are bad, but I’ve never gone beyond the level of mild dislike with a CMake build system for a project with thousands of build steps. I’ve gone to utter frustration and despair with pure make build systems under a tenth that complexity.
I agree with guilt-by-association, but I disagree that Make is great on its own, even for the non C/C++ use cases.
All three Makefiles I wrote were without autotools or anything like m4/CMake/etc. It was all just plain Make, and it had big problems too.
One issue is that I don’t want to write a new build rule for every single blog post I write, and for every new log file I get, which looks like 2023-02-01.access.log. So I used pattern rules with % in Make.

Turns out this feature interacts very poorly with others. It’s a crappy programming language with one level of looping. I spent forever fixing bugs in stuff like this, and it’s probably still not right.
The alternative is I just write a simple for loop in Python that generates Ninja, and I’m done in 5 minutes.
I can even write NESTED loops! Which I do for (gcc, clang) x (dbg, release, asan). Build variants are trivial with this code gen pattern, but tortured in pure Make.
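A rough sketch of that code-gen pattern in Python; the per-variant flags below are illustrative, not Oil’s actual configuration:

# Nested loops over compilers and variants, emitting one rule and
# one set of build edges per combination.
variant_flags = {
    "dbg": "-O0 -g",
    "release": "-O2",
    "asan": "-O1 -g -fsanitize=address",
}
sources = ["main.c", "util.c"]  # hypothetical

with open("build.ninja", "w") as f:
    for cc in ["gcc", "clang"]:
        for variant, flags in variant_flags.items():
            rule = "cc_%s_%s" % (cc, variant)
            f.write("rule %s\n" % rule)
            f.write("  command = %s %s -c $in -o $out\n" % (cc, flags))
            for src in sources:
                obj = "_build/%s-%s/%s" % (cc, variant, src.replace(".c", ".o"))
                f.write("build %s: %s %s\n" % (obj, rule, src))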
You might ask: why not generate Make from Python? You could, and that’s essentially what CMake did, what the Android platform did, and many other build systems did too.
But CMake now generates Ninja, and so does the Android platform (and Chrome too).
Ninja basically has all the parts of Make that you would generate – that is its purpose.
It doesn’t have all the gunk of implicit rules.
One way to see this is that your 5-line Makefile is actually 1305 lines long, your performance suffers because of it (extra stat() calls), and you have to debug it occasionally:
$ make --print-data-base | wc -l
...
1300
Another way to see this is by strace:
$ strace -e stat ninja
ninja: no work to do.
+++ exited with 0 +++
$ strace -e stat make
stat("/dev/pts/11", {st_mode=S_IFCHR|0620, st_rdev=makedev(136, 11), ...}) = 0
stat("/dev/pts/11", {st_mode=S_IFCHR|0620, st_rdev=makedev(136, 11), ...}) = 0
stat("/usr/include", {st_mode=S_IFDIR|0755, st_size=20480, ...}) = 0
stat("/usr/gnu/include", 0x7ffd60f40e60) = -1 ENOENT (No such file or directory)
stat("/usr/local/include", {st_mode=S_IFDIR|0755, st_size=4096, ...}) = 0
stat("/usr/include", {st_mode=S_IFDIR|0755, st_size=20480, ...}) = 0
stat(".", {st_mode=S_IFDIR|0775, st_size=4096, ...}) = 0
stat("RCS", 0x7ffd60f40e00) = -1 ENOENT (No such file or directory)
stat("SCCS", 0x7ffd60f40e00) = -1 ENOENT (No such file or directory)
stat("GNUmakefile", 0x7ffd60f3ed40) = -1 ENOENT (No such file or directory)
stat("makefile", 0x7ffd60f3ed40) = -1 ENOENT (No such file or directory)
stat("Makefile", 0x7ffd60f3ed40) = -1 ENOENT (No such file or directory)
make: *** No targets specified and no makefile found. Stop.
+++ exited with 2 +++
OK but I don’t understand why you did it that way. I use it for my blog and I don’t write a new build rule for every blog post I make, but none of that stuff is necessary for me.
I mean, yeah, don’t use it for something it’s not suited for, but I can’t tell from your explanation why your blog is badly suited for it and mine isn’t. It seems like you have some unspoken requirement beyond “build a blog” that is forcing you to complicate things, but without knowing what it is, I can’t comment further.
Another way to see this is by strace
I author my blog on a thinkpad from 2008. Running make takes under a hundred milliseconds. Why would I care if ninja is faster?
How do you use Make for your blog without pattern rules? I’d be interested to see what it looks like
My blog still uses Make, so I’m not that surprised if other people use it successfully … certainly it seems better than non-incremental and non-parallel “static site generators”, which seem to be written in Go because Ruby is too slow (???)
I just really wish I had used Ninja from the beginning.
That isn’t the only problem I ran into. Looking at my Makefile, I also have .SECONDARY, which I think fixed a bug (wrong default). And of course many people forget .PHONY.

The tagging and TOC was a bit hard to get right IIRC.

Also I seem to be scared to actually build the blog in parallel, but I don’t know if that fear is real or not :) Make doesn’t help you get your dependencies right.
Ninja doesn’t either, but in practice, since it builds in parallel by default, the builds seem to be more correct. (I want to add some lightweight sandboxing to my Ninja wrapper to fix this)
I didn’t mean to say I didn’t use pattern rules; just that I didn’t need any weird rules. I’ve been using the same 15-line Makefile for the last 5 years.

My table of contents generation is also terrible, but that’s because m4 is bad, not because Make is bad.
On the other hand, having admitted to willingly using m4 I guess I’ve basically lost whatever credibility I had, so let me clarify that I don’t actually endorse m4, but I do use it, and it’s bad in ways that don’t actively cause problems for me.
Every build system I’ve encountered eventually feels like that kind of intestinal distress. Make is just the one where I know I need to take a little antacid beforehand and I’ll be OK. Rake is a really good experience, too.
Very reasonable observations. I find more and more developers, even the “younger generation”, get burned on these hyper-specialized build systems and fall back to Make more and more often. I think it’s a good thing. Make is clunky but, as the poster notes, it does the job you ask it to do and you know it will continue to do it ten years from now.
Make also has a bunch of problematic things. The biggest one is that it has no way of expressing a rule that produces more than one output, and it has no way of tracking staleness other than modification times. It also can’t express things that may need more than a single run. You can’t build LaTeX reliably with Make, for example, because it does a single pass and must be rerun until it reaches a fixed point. You often end up with fake files to express things that will run once, such as patching files.
The annoying thing is that many of the complex replacements don’t solve these problems.
GNU make supports rules that produce more than one output. See “Rules with Grouped Targets” in the GNU make manual.

I’ve recently started using just, which, as per their docs, “avoids much of make’s complexity and idiosyncrasies”. Based on my limited use it looks like a promising alternative.
It’s a handy tool but it has a major omission in my opinion: no freshness tracking. It always runs all the commands; it doesn’t track whether a task’s dependencies are up to date and running the command can be skipped.
That’s why there’s latex-mk - it is a program that simply runs LaTeX the necessary number of times. It is also a large set of helpful Make definitions for TeX files so you don’t even need to teach it how to build. It knows about all the LaTeX flavours and related tools like pdflatex, tex2page, bibtex etc. The simplest possible latex-mk file is simply
NAME = foo
include /usr/local/share/latex-mk/latex.gmk
Then running make pdf, make ps etc would build foo.pdf, foo.ps etc from foo.tex, but it can be as complex as you want it to be.
I use latex-mk, but it also has problems. For example, I was never able to work out how to hook it so that it can run steps to produce files that a TeX file includes if they don’t exist.
That’s a bit of an odd requirement. What kind of situation requires that? I guess you could run some nasty eval to expand to make targets based on awk or grep output from your LaTeX sources, in GNU Make at least.
Basically every LaTeX document I write pulls in things from elsewhere. For example, most of the figures in my books were drawn with OmniGraffle and converted to pdf with a makefile. I want that to be driven from latex-mk so that it can run that rule if I actually include the PDF (and so I don’t have to maintain a build system that drives a build system with limited visibility). For papers, there’s usually a build step that consumes raw data and runs some statistics to produce something that can be included. Again, that ends up being driven from a wrapper build.
It’s been a long time since I worked on a LaTeX-only codebase requiring multiple compilation passes. I’m spoiled by pandoc + markdown for most of the documents I must write. I’ve heard that pandoc is a competent layer over a (la)tex -> pdf compiler instead of using pdflatex or xelatex or whatever directly. Have you seen pandoc being used in that way, primarily to hide the multiple-compilation-pass madness behind pandoc’s abstraction? I’ve also used tectonic a bit for debugging more complex pandoc markdown->tex->pdf builds, and it abstracts away the need for multiple passes.
I’ve been able to use pandoc to compile markdown books, but I struggled to use it well with TikZ or Beamer. LaTeX just has too many dark corners.

I use TeX primarily because most academic venues offer only LaTeX or Word templates and it’s the lesser of two evils. If I didn’t have to match an existing style, I’d use SILE.
The annoying thing is that many of the complex replacements don’t solve these problems.
I guess build2 would qualify as one of those complex replacements. Let’s see:
The biggest one is that it has no way of expressing a rule that produces more than one output
Check: we have a notion of target groups. You can even have groups where the set of members is discovered dynamically.
also has no way of tracking staleness other than modification times
Check: a rule is free to use whatever method it sees fit. We also keep track of changes to options, set of inputs, environment variables that affect a tool, etc.
For example, we have the venerable in rule which keeps track of changes to the variable values that it substitutes in the .in file.
It also can’t express things that may need more than a single run.
Hm, I don’t know, this feels like a questionable design choice in a tool, not in a build system. And IMO the sane way to deal with this is to just run the tool a sufficient number of times from a recipe, say, in a loop.
Let me also throw some more problematic things in make off the top of my head:
Everything is a string, no types in the language.
No support for scoping/namespaces, everything is global (hurts especially badly in non-recursive setups).
Recipe language (shell) is not portable. In particular, it’s unusable on Windows without another “system” (MSYS2, Cygwin, etc).
Support for separate source/output directories is a hack (VPATH).
No abstraction over file extensions (so you end up with hello$(EXE) all over the place).

Pattern rules do not support multiple stems (in build2 we have regex-based pattern rules which are a lot more flexible: https://build2.org/release/0.14.0.xhtml#adhoc-rules).
Agreed on all of the other criticisms of Make. I’m a bit surprised that build2 can’t handle the dynamic dependency case, since I thought you needed that for your approach to handling C++ modules.
I’d be interested in whether build2 can reproduce latex-mk’s behaviour. A few interesting things:
latex needs rerunning if it still has unresolved cross references, but not if the number doesn’t go down.
bibtex needs running if latex complained about a missing bbl file or before running latex if a bib file used by a prior run has changed.
There are a few more subtleties. Doing the naive thing of always running latex bibtex latex latex takes build times from mildly annoying to an impediment to productive work, so is not an acceptable option. Latex-mk exists, so there’s no real need for build2 to be able to do this (though being able to build my experiments, the thing that analyses the result, the graphs, and the final paper from a single build system would be nice), but there are a lot of things such as caching and generated dependencies that can introduce this kind of pattern and I thought it was something build2 was designed to support.
I’m a bit surprised that build2 can’t handle the dynamic dependency case, since I thought you needed that for your approach to handling C++ modules.
It can’t? I thought I’d implemented that. And we do support C++20 modules somehow (at least with GCC). Unless we are talking about different “dynamic dependencies”. Here is what I am referring to: https://build2.org/release/0.15.0.xhtml#dyndep
Doing the naive thing of always running latex bibtex latex latex […]
I am probably missing something here, but why can’t all this be handled within a single recipe or a few cooperating recipes, something along these lines:
latex
if (latex_complained_about_missing_bbl)
  bibtex
  latex
end
while (number_of_unresolved_cross_references_is_not_zero_and_keeps_decreasing)
  latex
end
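For what it’s worth, here is a rough Python sketch of that rerun-until-fixed-point logic, in the spirit of latexmk-style tools rather than anything build2 or latex-mk actually ships; the log messages checked and the pass cap are assumptions:

import subprocess

def run_latex(tex):
    # One pdflatex pass; return the contents of the .log file it wrote.
    subprocess.run(["pdflatex", "-interaction=nonstopmode", tex], check=True)
    with open(tex.replace(".tex", ".log"), errors="replace") as f:
        return f.read()

def build(tex="paper.tex", max_passes=5):
    log = run_latex(tex)
    # Run bibtex once if LaTeX complained about a missing .bbl file.
    if "No file" in log and ".bbl" in log:
        subprocess.run(["bibtex", tex.replace(".tex", "")], check=True)
        log = run_latex(tex)
    # Keep rerunning while cross-references are unresolved, up to a cap.
    for _ in range(max_passes):
        if "Rerun to get cross-references right" not in log:
            break
        log = run_latex(tex)

build()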
Am a big fan of Make. It is clunky, hard to debug, but it sits just at the right level of abstraction. I’ve seen more and more posts of people realizing it’s useful beyond the original use case of compiling C. There is room, I think, for a successor that addresses its flaws (see David’s comment) and expands to cover modern use cases (distribution, reproducibility, scheduling, orchestration). The challenge is in finding a compact set of primitives to support that and keep it simple, i.e., not Bazel.
Ninja, maybe? That’s my hope at least. I like its approach of “do exactly what it’s told, and use a higher level tool to tell it what to do”.

You may want to read Build Systems à la Carte or watch the talk about it by Simon Peyton Jones (audio warning: it’s quite bad). Shake seems to be what you’re looking for; unfortunately you have to write the Shakefile in Haskell and have GHC installed, which can be a bit steep as a requirement.
Circling back to JS, I had a half idea to use the Shake model described in that paper to implement it in JS so I could replace Jake, which is a good tool but shares many of the problems that Make has.
remake has made a huge difference for me, in terms of making Makefiles far more debuggable.

Oh, remake sounds amazing, it was not on my radar, thanks!

I’ve said it before and I’ll say it again: every codebase benefits from a Makefile that enables, minimally: make deps check test build clean all.
deps installs dependencies needed for development and building. check runs static analysis tools like linters, etc. without compiling. test runs tests. build packages a distributable. clean deletes any side effects of check, test, or build, and all does everything all at once for a minimal “sanitary build environment -> ship it” process.
Make, at least in the UNIX world, is the thing that everyone has installed. It is the greatest common divisor, perhaps. I’ve even got it working for Windows in a few projects, where the barrier to entry is “install Make somehow; we’ll take care of the rest.”
I’ve used Make in Ruby, Scala, Java, Groovy, Rust, Python, JavaScript, Pandoc+Markdown, PHP, and several other stacks. It always beats a build script and always solves the problem of onboarding faster than a README.
Look, I like make, I’ve used make a bunch, I’ve even used it for web projects!
But I’m not sure the problem here was webpack. OP made the choice to use a symlink to share code, and that’s not an approach that this build system supports. (I think the supported alternative would be a third package; though I could also imagine this done with git submodules or something.)
To me this is a little like someone complaining #include "https://example.com/some/code.h" doesn’t work with gcc. Like, you’re right, and maybe it’d be nice if it did, but you’re not really using the system in the way that’s expected.
The supported way is to create a package (which I did) and host your own local package repository. It’s absolutely mad. Git submodules don’t fix anything here; the shared code has to live at the same level as the other 2 repositories. And yes, it’s not just webpack, it’s everything.
Could you expand on this? As I understand it, npm supports local paths and any git repo, including local repos. I don’t think you need to “host” anything to use these.

It’s true that local paths are not supported for published packages (how could they be?) but I think if you’re going to publish, presumably you’d publish the dependency, which would let you use the published name as the dependency.
it’s not just webpack, it’s everything
I’m not sure what this means. I understand you encountered some frustration when attempting to solve a problem, and you found a solution that works for you. I’m glad! What’s not clear to me is that this is any kind of argument against webpack – or indeed for make. It seems to be an argument for “use tools that you like,” which could perhaps have been framed in a less inflammatory way.
All right, I guess you want to get into this, as if my experience is not evidence of the deficiency here, which is dismissive. :)
Thanks for assuming I hadn’t already looked at local paths or git URLs. The problem with these is that the code gets pulled into your node_modules, and you must re-install each time the shared code changes. It makes for a terribly slow experience and an obscene and unnecessary number of package version releases. I value my time, thank you.

Did you read the short text or not? It’s clear as day that the tooling around the web technologies did not work very well at all for the shared code problem and the docker problem I was having. It’s not just an argument about webpack when clearly other tools were mentioned.
It’s a data point that web tooling still sucks ass, not an argument.
Yeah it’s inflammatory. I’m a human being with emotions. No it doesn’t need to be framed in any other way. It’s one tiny signal to the world that this stuff still sucks. This also isn’t world or business politics.
Yeah I’m not interested in getting into a flamewar. I’ve tried to understand your perspective, so I can come to a conclusion: are these tools fundamentally broken? Were they being misused? Is this an edge case that is deliberately unsupported? What are the tradeoffs involved in your decision? When would it be appropriate for me to make the same decision, and when should I avoid it?
You don’t seem to be interested in that conversation. You seem angry, and insistent that web tooling “sucks”. I disagree; though I tend to think any claim that some technology “sucks” is vacuous on its face anyway. Your story is not a useful data point to me, because you haven’t engaged with those questions that would make it so.
Same here, but there’s some condescending undertone here in everything you write, so quite frankly, you’re asking for it.
I’m interested in hearing about alternative solutions to the problem, not “HUH DID THEY EVEN USE IT RIGHT????”. See: people talking about ninja, build2, cmake, etc. Way more useful commentary.
More people should try Deno. It would wipe out at least two thirds of the frustrations in this list.
start a project
start another project
“Starting a project” in Deno is “create a file ending with .ts”. It’s basically free.
realize shared code between both projects, create another project
symlink to the shared code
typescript compiler rejects symlink code, “out of rootdir”
add rootdirs
Deno uses ESM for module resolution, so importing code from another project is just import * as shared from "../other_module/mod.ts";
tools like ts-node dont support –traceResolution to debug things
use tsc directly
Deno just runs .ts files, has debugger support, and stack traces pointing to your .ts source.
try workspaces, mull over the countless custom solutions (lerna, nx, …)
obviously nothing works, they all rely on symlinks
You don’t need workspaces if you don’t have a dependency chain of build steps or per-project node_modules.
And if you’re building web projects, you can use light tools like esbuild to create a bundle which runs in the browser. Never looking back!
This comment definitely piqued my interest in using Deno as a toolchain for things, be they web or not.
I’m also starting to see too-complex build processes as mostly self-inflicted wounds. The pain stops when you take steps to stop it and insist on tools that limit that complexity.
I’ve been using a pretty minimal setup lately with Ninja and esbuild. I still use tsc, but just for type checking. My builds are done faster than an npm run script can even start. The esbuild metafile gives you the build graph, and it’s straightforward to then generate dependencies like gcc -MD does. There’s a ‘ninja-builder’ package on NPM that makes building the rules really straightforward. Overall I’m really happy with the setup.
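To illustrate that last step, here is a hypothetical Python sketch (not the ninja-builder package) that turns an esbuild metafile into a gcc -MD-style depfile that Ninja can read via depfile =. It assumes esbuild was run with --metafile=meta.json, and the file names are made up:

import json

with open("meta.json") as f:
    meta = json.load(f)

# Each entry in "outputs" lists the input files that went into that output.
for out_path, info in meta["outputs"].items():
    deps = " ".join(info.get("inputs", {}))
    # One "target: prerequisites" line per output, like gcc -MD would emit.
    with open(out_path + ".d", "w") as dep:
        dep.write("%s: %s\n" % (out_path, deps))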