1. 43

  2. 17

    Those who do not learn Make are doomed to reinvent it.

    Any project that grows in size is bound to outgrow Make but it’s a fantastic, ubiquitous starting point for simple things.

    1. 16

      I don’t think Make is fit for any particular purpose.

      • If you need a simple task launcher, use bash
      • If you need a working import system, use a language that has a working import system :-) or at least use a tool that has better dependency analysis than mtime.

      Make is not actually a lowest common denominator (that’s sh, which Make requires). It’s just very familiar to a lot of people as a bad import system, so it gets reused as a bad task launcher, but really you shouldn’t use it as either.

      1. 24

        Unless you need to … you know … make something, then it’s actually pretty effective. Of course, if you only consider it for tasks it wasn’t made for, it isn’t a very good fit, but if what you need is a build system rather than a task launcher, it has a lot going for it.

        1. 12

          It falls apart beyond the simplest cases. If you need to make something that isn’t a 1:1 mapping between files, now you have % rules, sentinel files, and recursive makefiles. If you need to configure what you’re building, you need GNU extensions, or external scripts, or makefile generators. If the build artifacts depend on the environment, correctly rebuilding on environment change is nearly hopeless, and generally you accept that it will be broken and will require make clean from time to time.

          1. 3

            Yeah, it’s not a good enough make system. Mtime sucks. A good make system should have content hashing at the least and probably a proper system for effects on top of that.
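            As a rough illustration of the difference, here is a minimal sketch in Python (helper names are hypothetical, not any particular tool’s API): the mtime check goes stale whenever an input looks newer, while the content check only goes stale when bytes actually change:

```python
import hashlib
import os


def stale_by_mtime(target, inputs):
    """mtime check: stale if the target is missing or any input is newer."""
    if not os.path.exists(target):
        return True
    t = os.path.getmtime(target)
    return any(os.path.getmtime(i) > t for i in inputs)


def file_digest(path):
    """Content hash of a file (sha256 hex digest)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def stale_by_content(inputs, recorded):
    """Content check: stale if any input's hash differs from the digest
    recorded after the last successful build."""
    return any(file_digest(i) != recorded.get(i) for i in inputs)
```

            Touching a file without changing it (or re-checking it out) triggers a rebuild under the first scheme but not the second.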

          2. 10

            Except if the things you need to make:

            • Have single build steps that produce more than one output,
            • Have dependencies that are dynamically discovered during a build step,
            • Have outputs whose freshness is not simply a function of the timestamp, or
            • Have build steps that may need to be run more than once to reach a fixed point.

            For example, Make cannot build LaTeX documents well. You need to run latex first to generate all of the cross-reference targets, then typically bibtex once to generate the bibliography with the targets for all of the cross-references, then latex once more to generate the cross references. The second latex invocation may generate more unresolved references (and more reference targets) and so you may need to rerun it. The latexmk tool can do this by parsing the output of the various tools to determine whether they need to be rerun, but make cannot, by itself.

            You can drive latexmk from make, but by the time it’s run for the first time, the rest of the dependency tree in make’s view of the world is already computed. So if the first run of pdflatex tells you that you need a PDF file, and make has a build rule that can construct a PDF from, say, an SVG file, then there’s no way for the build rule that invoked latexmk to add the dependency without resorting to hacks like modifying a file that is included by make (using either a non-standard GNU extension or a different non-standard BSD extension) and then re-invoking make with the top-level target.
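            The fixed-point behaviour latexmk implements can be sketched generically: rerun a step until two consecutive runs agree, and give up after a bounded number of tries. A rough Python sketch (hypothetical names; real latexmk parses the .log/.aux output rather than comparing opaque state):

```python
def run_to_fixed_point(step, max_runs=5):
    """Re-run `step` (a callable returning some comparable state, e.g. the
    contents of the .aux file) until two consecutive runs agree."""
    prev = None
    for i in range(max_runs):
        state = step()
        if state == prev:
            return state, i + 1  # converged: the last run changed nothing
        prev = state
    raise RuntimeError("no fixed point after %d runs" % max_runs)
```

            This is exactly the loop Make’s one-shot DAG evaluation cannot express without re-invoking itself.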

            1. 1

              I agree with your points. Is there any system that you would recommend as an alternative to make?

              1. 1

                Make’s DAG starts from outputs and works towards their inputs. For an example of a build system which works in reverse, I’d look at Tup, which I don’t use in any practical sense but found to have a lot of interesting ideas: statefulness, no PHONY targets, an input->output DAG, and a FUSE filesystem that it uses to track build dependencies by actively monitoring a build’s disk activity!

                Deal breaker for me was that, even with the Lua interface, it wasn’t very flexible in regards to directory structure: working with multiple Tupfiles or trying to write one that’s recursive was restrictive. But props to it for having an intended use case and sticking to it (see no PHONY targets).

          3. 13

            make is for making files – any files, with arbitrarily complex requirements.

            make is not and was never designed to run tasks (except for tasks which exist to make files). It can be used that way, with .PHONY targets, for example, but tools like just, or plain shell scripts, are better suited.

            make does what it was meant to do pretty well. We run into problems with it often because we are trying to force it to do everything else.

            1. 2

              If it doesn’t output a file you just touch $@. In that case make works great if what you need is an execution plan.

            2. 7

              I think this misses a big point. When you say use a language that has a working import system, I take it you mean rather than use Make to build software. And I agree: Building software is no longer Make’s strong point, because more specialized tools exist. Make makes you write custom rules for everything, after all. But that’s the difference: Make is not actually a build system, it’s a language! That makes it universal – useful for any purpose (other than those particular ones).

              For example, I’ve used Make to evaluate different video and image encoders on different input. This is an attractive use case for Make, because of dependency chains (encoding, decoding, analysis), easily expressible combinatorial test matrix and implicit parallelism. As for its mtime-based dependency tracking, it’s flawless and not a limiting factor.

              1. 5

                Make is not actually a build system, it’s a language!

                But is it any good as a language? FWIW I looked at Make as a language several years ago

                and then I wrote 3 substantial makefiles from scratch, and came to the conclusion that it’s a bad language.

                Shell is a much better language that can be “rescued” with some good practices. But I don’t think Make can be rescued (one reason is that it implicitly shells out so much!).

                Have you tried Ninja? For the video encoding use case it might work great. I described how I used it here: https://news.ycombinator.com/item?id=32307188

                1. 4

                  Oh, Make is indeed a flawed and broken language to its core. That’s the real critique! But has anyone tried to make a proper once-and-for-all Make killer successor language (beyond just a command runner) that’s as expressive (not just meant to be generated)?

                  I’m not talking about tabs: That’s but a syntactic surprise, not a flaw of the kind that makes it impossible to use correctly. For that, one only needs to think of what flaw it shares with the POSIX shell: Lack of a proper list datatype. That’s impressively broken for an implicitly parallel language whose reason to exist is to express relations between files in plural. I also agree about the implicit shelling out. And recursive Make needs to be fundamentally rethought if correctness is to be preserved.

                  I haven’t actually tried writing Ninja, although I use it more than Make these days (as a CMake/Meson backend). I know it can express things that Make can’t, such as rules with multiple outputs, but I get the impression that it’s not as expressive and is really meant to be generated. I would rather write 1 generic rule in Make than N special rules in Ninja.

                  1. 1

                    Yup, all agreed. But you should try generating Ninja :) Ninja is a good replacement for Make because most people generate Make anyway – like autotools does, CMake does, and kconfig does for the Linux kernel.

                    Make isn’t powerful enough on its own. It lulls you into thinking you can use one tool, and then you end up with a mess :)

                    I just copied Ninja’s own 197-line wrapper API, which makes it perfectly reasonable to do in Python:


                    There are basically only two functions to learn – rule() and build(), which generate the rule and build statements in Ninja. That’s it :) You can learn Ninja in 20 minutes. A typical way I use it is something like this:

                    # n is a ninja_syntax.Writer wrapping the output build.ninja file
                    for compiler in ['cxx', 'clang']:
                      for variant in ['dbg', 'opt', 'asan', 'coverage']:
                        out = f'_obj/{compiler}-{variant}/foo.o'
                        v = [('compiler', compiler), ('variant', variant)]
                        n.build(out, 'compile-one', ['foo.c'], variables=v)

                    (not tested)

                    So it’s a simple nested loop that generates a build rule on every iteration.

                    It’s trivial in Python, but doing that in GNU make is a HUGE hassle!

                    And this is extremely useful because you need clang and ASAN/UBSAN to help with your C code :)

                    I hope to write a blog post about this, because it’s come up a lot lately …

                    The generation can make more sense if you think of it as a “staged execution model” – you’re using imperative code to generate a parallel graph. That’s how GPU code and AI code works too. (And it’s how Bazel works too – it “lowers” your code to a target graph)


                    This model is how the recent Hay feature of Oil works, and I explicitly mentioned CMake/Ninja there:


                    The generator / graph split can make sense for other reasons too. One thing I am going to do is experiment with ./NINJA_config.p --sandbox=X, which may help accomplish what the recent “Landlock make” is doing: https://news.ycombinator.com/item?id=32377264

                    1. 1

                      you’re using imperative code to generate a parallel graph. That’s how GPU code and AI code works too.

                      So you’re saying we should use the GPU to run parallel builds? Only semi joking.

                      1. 1

                        Would be nice :)

                        But yeah, it’s fine-grained instruction-level GPU parallelism vs. coarse-grained process-level parallelism

                        You do see the same thing in AI frameworks – they have both fine-grained and coarse-grained parallel execution models

                    2. 1

                      But has anyone tried to make a proper once-and-for-all Make killer successor language (beyond just a command runner) that’s as expressive (not just meant to be generated)?

                      Yes, we are attempting exactly that in build2. Can’t guarantee we will achieve the “once-and-for-all” part, but we (and a growing number of users) are finding it quite productive for handling complex projects (e.g., Boost, Qt) without having to generate the buildfiles.

              2. 7

                Use of tools that were still wet was part of the culture.

                But it hardened incredibly quickly. Why do Makefile directives have to start with a tab? It was an easy hack, and by the time the author realized it was a bad idea, dozens of people were reliant on it!

                Tabs and Makefile (2015)

                1. 3

                  I interpreted that claim about “use of tools that were still wet” as an admission of this very fact; saying that it still had a very bad idea in it (tabs) but people used it anyway.

                  1. 1

                    You don’t have to use tabs. Just override .RECIPEPREFIX

                    1. 2

                      Now I’m itching to write a makefile using emojis instead of tabs. Or one of those weird invisible Unicode characters.

                  2. 7

                    I’ve been talking a lot about make lately, it seems, and I kinda feel sometimes like I’m a “Make developer”, the way some folks call themselves a “Java developer” or “Go developer” these days. That’s because I’ve spent so much time making my codebases easier to use by using Make as the interface, both for one-off tasks without dependencies or file generation (“phony” in Make parlance) and for the textbook use with output files and dependent input files.

                    Just on Lobsters alone:

                    1. 3

                      At my last job I worked with the developer of the Procfile format: https://devcenter.heroku.com/articles/procfile

                      I asked him a few times to explain what Procfiles were for that couldn’t be done equally well with existing tooling using Makefiles and phony targets.

                      Never got a straight answer. And now we’re stuck with yet another Filefile entry: https://github.com/cobyism/Filefile

                      1. 2

                        :) I find phony targets to be not as simple as a Procfile (which I’ve accidentally memorized). Makefile I think fits nicely with make just as Procfile fits nicely with ps if you can imagine that heroku is starting a process for you. But it’s a valid point, I’m not arguing. They do similar things in different ways. Most of Heroku’s “ahead of its time”-ness was on the massive effort to detect your app. I kind of accepted touching an empty Procfile while I was amazed at what Heroku was doing near launch.

                        The Filefile thing is funny and I starred that repo way back when. It’s odd to see them all collected up even if some of those tools aren’t as recognizable anymore. It’s rare for a project to have all those things in the root and they’d be broken up by a similar troupe of cargo.lock, yarn.lock, package-lock.json, Gemfile.lock, poetry.lock.

                        1. 2

                          I asked him a few times to explain what Procfiles were for that couldn’t be done equally well with existing tooling using Makefiles and phony targets

                          1. Have a target named web and a web process type. You’d at least need 2 separate Makefiles then.

                          2. As far as I can tell you’d have to implement a Makefile parser to implement the scaling UI. There might be a way to get make(1) to tell you the phony targets but if so it is not obvious to me how. That seems awfully complicated for the use case.

                          1. 1

                            There might be a way to get make(1) to tell you the phony targets but if so it is not obvious to me how.

                            You would use the same mechanism that bash uses to determine tab-completions.

                            Have a target named web and a web process type. You’d at least need 2 separate Makefiles then.

                            Seems like using a sledgehammer to squash a fly if you ask me.

                            1. 1

                              You would use the same mechanism that bash uses to determine tab-completions.

                              I wasn’t aware that it only completed phony targets?

                              Seems like using a sledgehammer to squash a fly if you ask me.

                              Makefiles have power in excess of what Procfiles are used for. There are no dependencies, no patterns, no inclusion, etc.

                              A well known directory (call it “procs”, or hell “bin”) with executables in it would be a vastly simpler replacement for Procfiles than a Makefile.

                              1. 1

                                A well known directory (call it “procs”, or hell “bin”) with executables in it would be a vastly simpler replacement

                                TBH I first asked him why make the Procfile over a bin directory and never got a straight answer for that either.

                          2. 1

                            I thought Procfile was more for running a number of services together. I also thought it came from Foreman, but perhaps I’m wrong about that?


                          3. 1

                            I wonder if you’ve considered shell for those tasks, and if so what the pros/cons you see are?

                            If you have a bunch of shell commands you want to put in a single file, I call that the “Taskfile” pattern, and funnily enough you can use either make or shell for “Task files”.


                            In shell, I just use a function for each task:

                            build() {
                              ./setup.py --quiet build_ext --inplace
                            }
                            test() {
                              for t in *_test.py; do python "$t"; done
                            }
                            "$@"   # dispatch to the function named by the first argument
                            or you can also use a case statement.

                            Here are some pros and cons of make vs. shell I see:

                            • With make, on most distros you get shell autocompletion of targets. I have a bash completion script to do the same thing with shell functions, but most people don’t have it.
                            • Make automatically dispatches to the “verb” – with shell you need a bit of extra code. (I use "$@" but it doesn’t provide great error messages.)
                            • In shell you have to remember “unofficial strict mode” to get good error handling. I think Make has similar problems, although maybe the default is slightly safer.

                            Make downsides:

                            • Syntax conflicts because Make embeds shell:
                              • $(VAR) conflicts with shell variables, so for shell variables you need something like $$my_shell_var
                              • How do you write a for loop? You have to put an extra \ at the end of every line in a Makefile, which is very ugly
                            • You have to remember to make everything .PHONY; otherwise it’s a subtle bug

                            Any others? I prefer shell but I can see why people choose Make. Really it is the language “mashup” that really bothers me :-/

                            BTW I take Task files to an extreme and Oil has over 10,000 lines of shell automation


                            e.g. to generate all the tests, benchmarks, and metrics here: https://www.oilshell.org/release/0.12.3/quality.html

                            Related story from 5 months ago: https://lobste.rs/s/lob0rw/replacing_make_with_shell_script_for

                            Other comment: https://news.ycombinator.com/item?id=23195313 – OK another issue is listing the “tasks”, which is related to the autocompletion issue. Make doesn’t have this by default either

                            1. 1

                              Make is a tool for managing a directed acyclic graph of commands. So I’m not sure why you’d compare it to bash. Make is a wrapper for bash lines that defines the relationships between your bash code.

                              1. 1

                                I understand that theory, but that’s not what the OP is talking about using it for. Look at one comment he linked:


                                Those are six .PHONY verbs, not nouns. Even build is a verb.

                                So they’re using make as a “Task runner” (verbs), not as a build system (to “demand” nouns). (FWIW Oil’s Ninja build has hundreds of nouns, mainly for tests and several build variants: ASAN, UBSAN, coverage, etc.)

                                Make isn’t great as a build system for all the reasons discussed here and many other places: https://lobste.rs/s/sq9h3p/unreasonable_effectiveness_makefiles#c_v7pkr0

                                As mentioned, I wrote 3 Makefiles from scratch starting from 2017 and concluded it was a big mistake (and I’m still maintaining them).

                                1. One was rewriting Python’s build system from scratch (which is still being used by Oil today, but needs to go away)
                                2. Apache-style Log analysis (since I don’t use Google Analytics or any hosted service). Requires dynamic/globbed rules.
                                3. Building the blog (which is surprisingly speed sensitive). Requires dynamic/globbed rules.

                                For those use cases (and I think most), Python/Ninja is way better, and similar to Bazel, but much lighter. The sandboxing like Landlock Make would be great though – that is a real problem.

                                I think you forked Make and made it a good build system for your use cases, but that doesn’t mean it’s good in general :)

                                1. 1

                                  Where is your Python Makefile? If your effort to write a Makefile for Python didn’t work out, that doesn’t make it Make’s fault. There were probably just some things you failed to consider. I wrote a Makefile for Python about a year ago: https://github.com/jart/cosmopolitan/blob/master/third_party/python/python.mk If I build Cosmopolitan, then rm -rf o//third_party/python, and then time make -j16 o//third_party/python, it takes 17 seconds to compile Python and run the majority of its tests. The build is sandboxed and pledged. It doesn’t do things like have multiple outputs. We removed all the code that does things like communicate with the Internet while tests are running.

                                  1. 1

                                    It starts here: https://github.com/oilshell/oil/blob/master/Makefile

                                    It definitely works, but doesn’t do all the stuff I want.

                                    Where Make falls down is having any kind of abstraction or reuse. That seems to show in your lengthy ~4500 line Makefile. (If it works for you, great, but I wouldn’t want to maintain the repetition!)

                                    1. For the Python makefile, I want to build oil.ovm, opy.ovm, hello.ovm, i.e. three different apps. I’m using the % pattern for that. If I want to add a second dimension like ASAN/UBSAN/coverage, in my experience that was difficult and fragile

                                    2. For the log analysis, I want to dynamically create rules for YYYY-MM-DD-accesslog.tar.gz.

                                    3. For the blog, I want to dynamically make rules for blog/*/*/*.md, and more.

                                    That pattern interacts poorly with all the other features of Make, including deps files with gcc -M, build variants, etc.

                                    In contrast, it’s trivial with a script generating Ninja.
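                                    As a concrete (hypothetical) sketch of that: a few lines of Python can take a list of globbed inputs and emit one Ninja build statement per file, which is exactly the dynamic-rule pattern that fights Make’s %:

```python
def gen_ninja(log_files):
    """Emit a Ninja rule plus one build statement per input file."""
    lines = [
        "rule report",
        "  command = ./analyze.sh $in > $out",  # hypothetical analysis script
    ]
    for src in sorted(log_files):
        lines.append("build _out/%s.report: report %s" % (src, src))
    return "\n".join(lines) + "\n"
```

                                    Fed with glob.glob('*-accesslog.tar.gz') and written out as build.ninja, Ninja then handles the incrementality and parallelism.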

                                    But I don’t think even those use cases are necessary to justify my opinion; there are dozens of critiques of Make that are 10 years old and based on lots of experience. And comments like this right in the thread:


                                    I have looked quite deeply into Make, and used it on a variety of problems, so I doubt it will change my mind, e.g.


                                    Remember when I say “make is a bad language”, this is coming from the person who spent years reimplementing most of bash and more :) i.e. I don’t really have any problem with “bad” or “weird” or “string-ish” languages, at least if they have decent / salvageable semantics. And I don’t think Make does for MANY common build problems.

                                    The sandbox/pledge stuff you added to Make is very cool, and I would like something like that, and hopefully will get time to experiment with it.

                                    1. 1

                                      For the Python makefile, I want to build oil.ovm, opy.ovm, hello.ovm, i.e. three different apps. I’m using the % pattern for that. If I want to add a second dimension like ASAN/UBSAN/coverage, in my experience that was difficult and fragile

                                      Consider using o/$(MODE)/%.o: %.c pattern rules. Then you can say make MODE=asan.

                                      Remember when I say “make is a bad language”, this is coming from the person who spent years reimplementing most of bash and more :)

                                      I don’t doubt you’re an expert on shells. Being good at shells is a different skillset from directed acyclic graphs.

                                      That seems to show in your lengthy ~4500 line Makefile. (If it works for you, great, but I wouldn’t want to maintain the repetition!

                                      If by repetition you mean my makefile code is unfancy and lacks clever abstractions, then I’ll take it as a compliment. Python has 468,243 lines of code. I’m surprised it only took me 4k lines of build code to have a fast parallelized build for it that compiles and runs tests in 15 seconds. Even the Python devs haven’t figured that out yet, since their build takes more like 15 minutes. I believe having fast build times with hermeticity guarantees is more important than language features like rich globbing, which can make things go slow.

                          4. 6

                            Everyone in the comments so far seems to be accepting without question the claim that using tabs for recipes was a mistake. I push back on that opinion pretty hard. I think from a language design perspective, using tabs makes a whole lot of sense.

                            I feel that much of the anger toward tabs is misdirected runoff from a legitimate complaint about make, its very poor error messages:

                            makefile:2: *** missing separator.  Stop.

                            I don’t think there’s any excusing this. Distinct characters for recipe lines should make it easier to print helpful messages. Just tell me that I have a line which begins with a space.

                            Anyway, as is often the case, modern tooling makes make much easier to work with:

                            • syntax highlighting will show me when I have a problem line before even running it.
                            • editors can automatically handle working with tab characters (even invisibly to the programmer, if you want).
                            • if I really hated invisible characters, I could render tabs in a variety of ways.

                            My expressed opinion here is that the drawbacks of the tab character are exaggerated, and are outweighed by the benefits. Indentation is what the \t character exists for (and nothing else should be indented in a makefile). Because it is easier to parse, it cooperates better with tooling, which is especially important when it comes to accessibility tools like screen-readers.

                            Change my view :^)

                            1. 4

                              I agree the error messages are terrible, but you shouldn’t have to run the program to be able to spot that it has a syntax error in it. If you can have two files that are indistinguishable when viewed with cat but one is valid and the other has errors, you’ve made a terrible mistake as a language designer. Yes, you can teach your editor to be smarter about it, but there’s no reason that should be required, especially when the decision to use tabs doesn’t convey any benefit to the end user.

                              1. 2

                                A lot of the hate for tabs comes from not having it visible in text editors by default. I’ve been forced into using spaces in Python through autoformatting tools, and now need vertical guides to make sure I’m not one space off in my narrow font.

                              2. 1

                                Genuine question from a newbie: I’ve mostly seen CMake being used in projects, is Make still useful to learn?

                                1. 13

                                  Make is nice for very simple things, but as your project grows and requires non-trivial things, the Makefile becomes a patchwork of tricks and hacks with obscure syntax that’s impossible to google, and it’s an endless maintenance burden.

                                  I recommend sticking to basics and treating make as a launcher for scripts. If you need more, don’t dig deeper, but run away to a more maintainable system.

                                  1. 1

                                    Good advice, thanks.

                                  2. 12

                                    Make is OK for simple projects, but its lack of dependency analysis becomes really painful, and the need to manually update a file’s dependencies in the makefile leads to exactly those stale-binary bugs described in that article’s intro quote.

                                    In more detail: Say you have a C project. You add each of your .c files to the makefile. But for each .c file you have to list all the .h files it includes, as dependencies. Worse, this applies transitively, so you also have to list all the .h files included by those .h files, and so on.

                                    Any time you add/remove an include directive in your source code, you have to update the makefile accordingly. If you don’t, you can end up in a state where your binary contains stale code, which can be an absolute nightmare to debug.

                                    (This is also necessary with languages that use other mechanisms than direct inclusion. Basically you have to follow all the imports and duplicate them in the makefile.)

                                    Every non-tiny project I’ve worked on that used make also used some adjunct tool that scanned all the source files to identify dependencies and wrote the result as a makefile to be used by the build system. This was fairly kludgy. Eventually these tools became flexible enough that they took over the job — as with CMake, you work only with the more powerful tool and let it generate and run a makefile for you.
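                                    Those adjunct tools mostly wrap compiler support such as gcc -M, which emits each source file’s dependencies as a make-syntax rule fragment. A rough sketch of consuming one of those fragments (sample text hypothetical):

```python
def parse_depfile(text):
    """Parse a gcc -M style fragment: 'foo.o: foo.c foo.h \\' + newline + ' bar.h'."""
    joined = text.replace("\\\n", " ")  # join backslash-continued lines
    target, _, deps = joined.partition(":")
    return target.strip(), deps.split()
```

                                    Build systems either splice the result back into the makefile (the classic makedepend dance) or, like CMake and Ninja, consume the .d files directly.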

                                    1. 7

                                      I think it really is worth noting the distinction between:

                                      • make as interface to compiler/linker toolchain where the make targets are specific files to build/link
                                      • make as generic nearly-universally-available task runner not tied to compilation or even to languages which have explicit compile steps

                                      The latter is what a lot of modern “use Makefiles” articles are actually doing.

                                      1. 2

                                        Except make isn’t actually installed by default on most desktop or server operating systems.

                                        It’s an additional package, which just means it has been arbitrarily chosen and isn’t actually a lowest common denominator. It might as well have been bazel (bloated) or Procfiles

                                        1. 2

                                          I don’t know the situation on Windows, but on Linux and on macOS desktop, just bootstrapping a dev environment for lots of different languages will basically always pull in some form of make as a transitive dependency. I know, for example, that basically every language’s “set up dev environment on macOS” instructions begin with xcode-select --install, which I believe will implicitly install make.

                                        2. 1

                                          But is make any better for that than more modern tools?

                                        3. 2

                                          Thanks for the advice. I’ve noticed CMake is popular in projects that have a lot of libraries; I guess that’s why.

                                          1. 2

                                            Yes, CMake does all the dependency checking for you. It does a ton of other stuff too, but the drawback is it’s quite complex. Also it’s sort of a programming language, and a rather terrible one with even worse syntax than Perl.

                                            I keep meaning to look into other tools like Ninja…

                                          2. 2

                                            I feel like most of the criticisms of Make that I’ve read are really just criticisms of using Make to overcome the lack of a module system in C. The root cause of these problems doesn’t really have anything to do with Make, just bad language design.

                                            edit: not that I’m convinced Make would work smoothly for larger projects in languages that have a module system, but more that I’ve never seen any argument or evidence to the contrary.

                                            1. 2

                                              If C had modules, but cc still compiled a single source file like today, it would still have exactly the same problems with make. If you edit the source code of module foo, and module bar imports foo, you still have to tell make explicitly that bar.c depends on foo.c.

                                              I think what you are thinking of as a module system would encompass the C compiler itself detecting which source files are out of date and recompiling them, the way (say) javac does. But if it did that, you really wouldn’t need make; the compiler would in effect be running it for you, plus all the dependency analysis.

                                              In general I think make would be even worse for a language like the C in my paragraph 1 than it would be for regular C. That’s because without the source/header separation, any change to a module source file would have to trigger rebuilding all modules that import it. Make doesn’t know that you didn’t alter the module’s interface, it just knows the file’s mod date changed. This would result in much longer build times.
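To make the problem concrete, here’s a sketch of what that hand-maintained import graph looks like (file and target names are made up for illustration):

```make
# bar imports foo, but make only knows that because we wrote it down here.
# Any edit to foo.c, interface change or not, forces bar.o to rebuild,
# because make compares only modification times.
bar.o: bar.c foo.c
	cc -c bar.c

foo.o: foo.c
	cc -c foo.c

prog: foo.o bar.o
	cc -o prog foo.o bar.o
```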

                                              1. 1

                                                But if it did that, you really wouldn’t need make; the compiler would in effect be running it for you, plus all the dependency analysis.

                                                Sure you do! You still need somewhere to keep the specific commands and flags that you want to invoke the compiler with, and somewhere to list all the different cross-compilation targets which need to be invoked in slightly different ways if your project supports multiple platforms.

                                                This is what “use make for what it’s good at” means to me. Yes, it’s unlikely your makefiles will ever grow beyond two or three pages. This is a really good thing.
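A minimal sketch of that role (the target names and the mingw toolchain prefix are assumptions, not from any particular project):

```make
CC     ?= cc
CFLAGS ?= -O2 -Wall -Wextra

# The canonical record of how this project is compiled.
app: main.c
	$(CC) $(CFLAGS) -o $@ $<

# Cross-compilation target: same rule, different toolchain.
app.exe: main.c
	x86_64-w64-mingw32-gcc $(CFLAGS) -o $@ $<

.PHONY: clean
clean:
	rm -f app app.exe
```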

                                                1. 1

                                                  So, make as simply a top-level launcher for build scripts? Like, so I can type make tests and it runs the shell commands under the tests: target? I guess that’s marginally cleaner than having a test.sh script as is my wont…

                                              2. 1

                                                What would Make be useful for in languages that have a module system?

                                                1. 3

                                                  Feels like a trick question, but, uh… to make compiled artifacts?

                                                  I use it with all my Fennel programs:
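(A representative sketch; the file layout is hypothetical, but `fennel --compile`, which writes Lua to stdout, is the standard CLI invocation.)

```make
SRC := $(wildcard *.fnl)
OUT := $(SRC:.fnl=.lua)

all: $(OUT)

# Each .fnl source compiles to a .lua artifact.
%.lua: %.fnl
	fennel --compile $< > $@

.PHONY: all clean
clean:
	rm -f $(OUT)
```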

                                                  Often with phony targets for running tests and uploading artifacts to a release site.

                                                  It’s rare to have a Makefile over 2 pages in length, even when creating binaries for linux/windows/mac, but it’s still extremely useful.

                                                  1. 3

                                                    For one example, NodeJS has a module system and dependency management and all the other modern amenities, but its built-in “define some common tasks” story (for, say, gathering distributables, compiling templates, running multi-stage test workflows) involves stringing shell commands together in JSON. Make is a huge improvement.
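For instance, instead of cramming a multi-step workflow into one-line strings in package.json’s “scripts” block, the same tasks read much better as targets with declared ordering (the task and script names here are made up):

```make
.PHONY: dist templates test

# dist depends on templates, so make runs the steps in order.
dist: templates
	mkdir -p dist
	cp -r public/. dist/

templates:
	node build-templates.js

test: templates
	node --test
```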

                                              3. 2

Depends what your goal is. If you’re interfacing with / building other projects, then definitely. You seem to be in a happy environment where people use CMake, but globally we’re still 95% on autotools and make in the C/C++ world. But if you only work with your own things, yeah, maybe you can get away without learning makefiles.

                                                (just keep in mind, you can learn 90% of makefile functionality in a couple hours - basic functional knowledge is not a huge investment here)

                                                1. 1

Thanks. Could probably switch my project from CMake to plain Makefiles and see how it works out.

Edit: Now that I think of it, I may be having an easier time with CMake on this project because it uses external libraries, and CMake makes that pretty easy. Could try Makefiles with a smaller project without external libraries.

                                                  1. 2

You should also have a look at Meson; I’ve had pleasant experiences with it, especially coming from autotools.

                                                    1. 1

                                                      Yeah I’ve seen Meson used around, might use it for my next project(s)! :)

                                              4. 1

I’d guess that make’s secret sauce is not its syntax. So if you found a modern replacement, it would slot in and be nicer, with the trade-off that your new thing isn’t installed as often or as well known as make.

Just’s dotenv feature is quite a surprise; I just haven’t gotten around to using it yet, but I have a growing list of system replacements.