The “downsides” list is missing a bunch. I mean, I use Makefiles too, probably too much, but they do have some serious downsides, e.g.
The commands are interpreted first by make, then by $(SHELL), giving some awful escaping at times
If you need to do things differently on different platforms, or package things for distros, you pretty quickly have to learn autoconf or even automake, which adds greatly to the complexity (or reinvent the wheel and hope you didn’t forget some edge-case with DESTDIR installs or whatever that endless generated configure script is for)
The only way to safely (e.g. parallelizable) do multiple outputs is by using the GNU pattern match extension, which is extremely limited (rules with multiple inputs to multiple outputs are hard to write without lots of redundancy)
GNU make 4 has different features from macOS’s (pre-GPL3) make 3.8, which has different features from the various BSD makes
You really have to understand how make works to avoid doing things like possibly_failing_command | sed s/i/n/g > $@ (which will create $@ and trick make into thinking the rule succeeded because sed exited with 0 even though the first command failed). And do all your devs know how to have multiple goals that each depend on a temp dir existing, without breaking -j?
and there’s probably lots more. OTOH, make has been very useful to me over the years, I know its quirks, and it’s available on all kinds of systems, so it’s typically the first thing I reach for even though I’d love to have something that solves the above problems as well.
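The silent-failure pitfall above can be reproduced with a two-line rule (a sketch; `possibly_failing_command` stands in for any command that can fail mid-pipeline):

```makefile
# BROKEN: if possibly_failing_command fails, sed still exits 0, the shell
# reports success, and make believes $@ was built correctly.
out.txt: in.txt
	possibly_failing_command < in.txt | sed 's/old/new/g' > $@

# The half-written out.txt now has a fresh timestamp, so subsequent
# runs of make skip this rule entirely and build on top of bad output.
```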
Your additional downsides make it sound like maybe the world needs a modern make. Not a smarter build tool, but one with fewer 40-year-old-Unix design sensibilities: a nicer, more robust language; a (small!) handful of missing features; and possibly a library of common functionality to limit misimplementations and cut down on the degree to which every nontrivial build is a custom piece of software itself.
I think the same approach as Oil vs. bash is necessary: writing something highly compatible with Make, separating the good parts and bad parts, and fixing the bad parts.
Most of the “make replacements” I’ve seen make the same mistake: they are better than Make with respect to the author’s “pet peeve”, but worse in all other dimensions. So “real” projects that use GNU Make like the Linux kernel and Debian, Android, etc. can’t migrate to them.
To really rid ourselves of Make, you have to implement the whole thing and completely subsume it. [1]
I wrote about Make’s overlap with shell here [2] and some general observations here [3], echoing the grandparent comment – in particular how badly Make’s syntax collides with shell.
I would like for an expert in GNU Make to help me tackle that problem in Oil. Probably the first thing to do would be to test if real Makefiles like the ones in the Linux kernel can be statically parsed. The answer for shell is YES – real programs can be statically parsed, even though shell does dynamic parsing. But Make does more dynamic parsing than shell.
If there is a reasonable subset of Make that can be statically parsed, then it can be converted to a nicer language.
In particular, you already have the well-tested sh parser in OSH, and parsing Make’s syntax is 10x easier than that. It’s basically the target line, indentation, and $() substitution. And then some top-level constructs like define, if, include, etc.
One way to start would be with the “parser” in pymake [4]. I hacked on this project a little. There are some good things about it and some bad, but it could be a good place to start. I solved the problem of the Python dependency by bundling the Python interpreter. Although I haven’t solved the problem of speed, there is a plan for that. The idea of writing it in a high-level language is to actually figure out what the language is!
The equivalent of “spec tests” for Make would be a great help.
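For reference, a sketch of the handful of top-level constructs such a parser would need to recognize (target lines, tab-indented recipes, $() substitution, and directives like define/ifeq/include):

```makefile
include common.mk             # top-level directive

CFLAGS := -O2                 # assignment

ifeq ($(DEBUG),1)             # conditional
CFLAGS += -g
endif

define banner                 # multi-line variable definition
echo building $(1)
endef

prog: main.o                  # target line: targets, colon, prerequisites
	$(call banner,prog)   # recipe lines are tab-indented
	cc $(CFLAGS) -o $@ $^
```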
We need a modern make, not make-style tools. It needs to be mostly compatible so that someone familiar with make can use “modern make” without learning another tool.
Do most of these downsides also apply to the alternatives?
The cross platform support of grunt and gulp can be quite variable. Grunt and gulp and whatnot have different features. The make world is kinda fragmented, but the “not make” world is pretty fragmented, too.
My personal experience with javascript ecosystem is nil, but during my foray into ruby I found tons of rakefiles that managed to be linux specific, or Mac specific, or whatever, but definitely not universal.
I recommend looking at BSD make as its own tool, rather than ‘like gmake but missing this one feature I really wanted’. It does a lot of things people want without an extra layer of confusion (automake).
Typical bmake-only makefiles rarely include shell-script fragments piping output around; instead they will use ${VAR:S/old/new/} or match contents with ${VAR:Mmything*}. You can also test whether a variable is ‘empty’ (string) or a file ‘exists’.
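A small sketch of those bmake idioms (variable names are illustrative):

```makefile
SRCS=	foo.c bar.c baz.c

# :S substitutes within each word, :M keeps words matching a glob --
# no shell pipeline (sed/grep) needed.
OBJS=	${SRCS:S/.c/.o/}    # foo.o bar.o baz.o
BFILES=	${SRCS:Mba*}        # bar.c baz.c

.if empty(EXTRA_SRCS)
# handle the no-extra-sources case
.endif

.if exists(/etc/mk.conf)
.include "/etc/mk.conf"
.endif
```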
Deduplication is good, and good mk fragments exist. Here’s an example done with bsd.prog.mk; this one’s from pkgsrc, which is a package manager written primarily in bmake.
Hey! Original author here :). Thanks a bunch for this feedback. I’m pretty much a Make noob still, so getting this type of feedback from folks with more experience is awesome to have!
You really have to understand how make works to avoid doing things like possibly_failing_command | sed s/i/n/g > $@ (which will create $@ and trick make into thinking the rule succeeded because sed exited with 0 even though the first command failed).
Two things you need to add to your Makefile to remedy this situation:

1. SHELL := bash -o pipefail. Otherwise, the exit status of a shell pipeline is the exit status of the last element of the pipeline, not the exit status of the first element that failed. ksh would work here too, but the default shell for make, /bin/sh, won’t cut it – it lacks pipefail.

2. .DELETE_ON_ERROR:. This is a GNU Make extension that causes failed targets to be deleted. I agree with @andyc that this behavior should be the default. It’s surprising that it isn’t.

Finally, for total safety you’d want make to write to .$@.$randomness.tmp and use an atomic rename if the rule succeeded, but afaik there’s no support in make for that.
So yes, “you really have to understand how make works [to avoid very problematic behavior]” is an accurate assessment of the state of the things.
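Putting those remedies together might look like this (a sketch in GNU Make syntax; note .SHELLFLAGS requires GNU Make 3.82 or newer, and the temp-file rename has to be done by hand in the recipe):

```makefile
SHELL := bash
.SHELLFLAGS := -o pipefail -c   # fail the recipe if any pipeline element fails
.DELETE_ON_ERROR:               # GNU extension: delete targets whose recipe failed

# Poor man's atomic write: build into a temp file, rename on success.
# (mv within one filesystem is atomic; make itself has no built-in support.)
out.txt: in.txt
	possibly_failing_command < in.txt | sed 's/old/new/g' > $@.tmp
	mv $@.tmp $@
```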
Your temp-directories dependency problem makes me think a GUI to create and drag-and-drop your rules around could be useful. It could have “branching” and “merging” steps that indicate parallelism and joining too.
No. It lacks proper abstractions for a semi-decent build system, and has lots of questionable features (see how large the GNU make docs are: https://www.gnu.org/software/make/manual/html_node/index.html) that are inconsistent across the multiple implementations. If you need something similar to Make, use Ninja.
I see the main advantage of Make in being a common interface for building, testing and running projects written in different languages; e.g. GitHub uses a set of shell scripts for this use case.
If a project provides a Makefile then I don’t have to remember how to call npm, mvn, gradle, cargo, go, ... or whatever there is, which makes my life as a developer a bit easier.
Maybe Make has been perceived as a C/C++ build tool by the JS community? Seeing the GNUstep make-based build system convinced me it can be used to create clean, expressive, declarative builds.
Yes, the above example was for building a very small project. But just because the project becomes larger, doesn’t mean the user experience of Make diminishes.
Historically, no large project’s Makefile agrees with this assertion. Make scales very poorly to complex build scenarios, and once you need to build on platforms with any differences between them, it’s an absolute shitshow.
This is an assertion that could benefit from more precision. OpenBSD is a fairly large and complex project, runs on many platforms with quite some differences, and uses make pretty much exclusively for building.
It calls clean when you run “make build”, but you can also just run make. You’re right it shouldn’t be necessary, but there’s not much inclination to fix it because the point of make build is “from scratch”.
A bigger complaint I have is that recursive make slows down parallel builds when it gets to one-source-file utilities.
My only real use-cases for make have been rendering LaTeX to PDF, and for building my static site. These days I’ve migrated all of my make-using projects to Nix; whilst Nix is mostly used as a packaging tool which invokes make and friends, I’ve found its simple, clean DSL to be much nicer than the mess of escaping that I end up needing with make.
The nice thing about make (vs some other arguably better tool that needs to be installed) is that it comes installed just about everywhere.
I’ve been using Makefiles for my Go projects. They are simple, just wrapping gb as well as some build dependencies (i.e. golang and gb), so I don’t need to worry too much about portability, and since make comes installed, my projects are eas(y:ier) to bootstrap.
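Such a wrapper can stay tiny. A sketch (target names are hypothetical; gb is the vendoring build tool mentioned above):

```makefile
.PHONY: all deps build test

all: build

deps:                 # bootstrap the build dependency
	go get github.com/constabulary/gb/...

build: deps
	gb build all

test: deps
	gb test all
```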
I’ll have to check out some of the options mentioned here, especially ninja and cmake… But typing make is very hardwired into my muscle memory ;)
Sure, but it doesn’t have Visual Studio installed by default either. I would prefer something like cmake, however, that can create either a Makefile or a Visual Studio project (and more).
Hey all! Original author here! Flattered this blog post found its way to discussion on this site. In fact, I’m ashamed to admit I wasn’t familiar with this site before. Seeing this comment thread (and other articles being posted), I love what I’m seeing. There is some great conversation here that will help me clear up my own ignorance and build on the knowledge of folks who have much more experience with this tooling than me. So thanks again :D.
You can see the easy integration into existing tools like tsc and npm. I didn’t need to wait until a wrapper was created (or to create my own wrapper) in a code-based build tool.
You now have zero documentation of which versions of those tools you were using, or even which tools have to be installed at all. There’s actually a good reason for those “wrappers”.
Declarative and incremental builds are a good idea, but make isn’t them; it can’t describe things that should happen, only command lines to run, and that only if the intermediate states are completely represented by files on the filesystem.
I have been having a lot of success using Make to build my Docker-based microservices ecosystem of about a dozen services and about half a dozen backend servers. I also use Make to remember how to build and tag my containers to push to AWS ECR for deployment, which I would never remember how to do if I had to type it out every day.
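A sketch of what such a Makefile might look like (all names, the registry URL, and the tagging scheme are placeholders, not the commenter’s actual setup):

```makefile
REGISTRY = 123456789012.dkr.ecr.us-east-1.amazonaws.com
SERVICE  = my-service
TAG      = $(shell git rev-parse --short HEAD)

.PHONY: build push

build:
	docker build -t $(SERVICE):$(TAG) .

push: build                     # tag for ECR and push for deployment
	docker tag $(SERVICE):$(TAG) $(REGISTRY)/$(SERVICE):$(TAG)
	docker push $(REGISTRY)/$(SERVICE):$(TAG)
```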
You should probably be using Terraform for this, though the learning curve is a bit steep at first. Well worth it though to coordinate distributed infrastructure idempotently.
Since nobody else pointed it out: the argument for make in this article seems to be merely that it’s old. I’m all for learning from history, but blindly using shit just because it’s old doesn’t seem like an improvement compared to being ignorant of the past. Both are different flavors of cargo-culting. Getting past cargo-culting requires a more nuanced synthesis of old and new, an awareness of past tools and an attempt to understand what each tried to achieve, and what wrong turns they took in their design. And OP doesn’t even mention ant or ninja or cmake!

Oh, also: those who don’t understand the limitations of recursive makefiles are doomed to succumb to their siren song.
I think if this guy said, “Enough with the dumb javascript build tools already, just use something that already exists,” his point would have come across much better. Nothing wrong with make in my opinion, but I can’t stand learning a new weird JS library to write build scripts.
I use Makefiles for every Python project, personal and professional. Maybe there’s a better way, but it works for me and my teammates quite well. Here’s a redacted example from a recent one:
So now all common tasks are at your fingertips, make env, make setup, make lint, make test and bumping the version with make version. Usually have a bunch of other stuff in there too once the project gets going, maybe make server and stuff like that.
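The redacted example itself isn’t shown, but a generic version of the targets named above might look like this (a sketch; venv, flake8, pytest, and bumpversion are hypothetical tool choices, not the commenter’s):

```makefile
VENV = .venv

.PHONY: env setup lint test version

env:                      # create the virtualenv
	python3 -m venv $(VENV)

setup: env                # install dependencies into it
	$(VENV)/bin/pip install -r requirements.txt

lint:
	$(VENV)/bin/flake8 src tests

test:
	$(VENV)/bin/pytest tests

version:                  # bump and tag a release
	$(VENV)/bin/bumpversion patch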
mk?

i’ve also thought of that! for reference: https://9fans.github.io/plan9port/man/man1/mk.html
[1] https://lobste.rs/s/ofu5yh/dawn_new_command_line_interface#c_d0wjtb
[2] http://www.oilshell.org/blog/2016/11/14.html
[3] http://www.oilshell.org/blog/2017/05/31.html
[4] https://github.com/mozilla/pymake
Several more modern make-style tools exist – e.g. ninja, tup, and redo.

I think anything compatible enough with make to not require learning the new tool would find it very hard to avoid recreating the same problems.

The world does, but s/standards/modern make replacements/g
The first thing you learn about make is that there is no make. There is a bunch of makes, all with their own features and quirks.
I do have one minor jab to make at OpenBSD’s use of makefiles. (footnote 3) Reactions/corrections appreciated.

Cheerfully amended to: only one large project in the default install, in a heck of a long time! (Although recursive make considered harmful, and all that.)
Except for the OS used by almost 50% of developers.
Declarative and incremental builds are a good idea, but make isn’t them; it can’t describe things that should happen, only command lines to run, and that only if the intermediate states are completely represented by files on the filesystem.
I have nothing to add to this, but I want to highlight it and second it.
The only “build tool” I find even worth looking at besides make is redo. And only if you’re sure POSIX make won’t do.

POSIX make very frequently won’t do. Take this for example:

In POSIX make, % doesn’t work like that, that’s a GNU extension.

What is the advantage of that over standard double-suffix rules? Such as this one that is built-in and specified by POSIX:

Personally, I dislike cluttering the source directory with a bunch of object files. My Makefiles therefore generally contain something like this:

Also, I’m not entirely sure how your example should work; I made a Makefile which looks like this:

with a main.c file in the same directory, and make just says “No rule to target ‘main.c.o’, needed by ‘foo’”. Your comment didn’t specify how that syntax works, and I always just use the % syntax anyways, so I’m curious how that syntax is supposed to work.

In fact, this rule is even in make by default without you needing to specify it :)
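For readers following along, the two styles being compared can be sketched like this (a hedged sketch; the POSIX built-in recipe is shown approximately):

```makefile
# POSIX: a double-suffix rule. The built-in .c.o rule is essentially:
.SUFFIXES: .c .o
.c.o:
	$(CC) $(CFLAGS) -c -o $@ $<

# GNU extension: the equivalent pattern rule. % is not POSIX.
%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

# Either way, the prerequisite is spelled main.o (not main.c.o);
# make infers main.o from main.c via the rule above.
foo: main.o
	$(CC) -o $@ main.o
```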
I’m pretty sure all the javascript build tools were made by windows developers who didn’t have make at the time.
tbh, I hate make and prefer a hand written ninja file atm.
Or if they had make, it didn’t work well or easily on windows.