  1. 9

    If you don’t want to bother with .PHONY targets, tabs, weird syntax, and arcane built-in assumptions about build processes, check out Just, which is Make-inspired, but firmly focused on running commands, not producing files.

    1. 10

      I have a really hard time imagining what problems this solves that aren’t already solved by “a bin/ directory”?

      1. 1

        For example, just recursively walks up the directory tree searching for a justfile, so you can be deep inside the project and still run just build, without clobbering PATH.

        Consider the case where there are multiple projects: you’d have to use relative paths, insist on unique script names across all projects, or constantly reset PATH to each project’s bin.
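
        A minimal justfile sketch for that workflow (recipe names and bodies are made up for illustration):

        ```just
        # justfile at the project root; `just build` works from any subdirectory
        build:
            echo "building"

        test: build
            echo "testing"
        ```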

        1. 2

          I imagine you could use direnv for this, too—you could configure it so that whenever you enter a directory within ~/Projects/someproject it adds ~/Projects/someproject/bin to your PATH, and it would undo the change if you entered some other hierarchy. If you’re collaborating with others then I imagine that getting them to install Just would be easier than getting them to install and configure direnv, though.
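
          For what it’s worth, the direnv side of that is a one-line .envrc, since PATH_add is part of direnv’s standard library:

          ```shell
          # ~/Projects/someproject/.envrc
          # direnv loads this on entering the directory and reverts the
          # PATH change when you leave the hierarchy.
          PATH_add bin
          ```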

          1. 1

            I solve this problem a simpler way; I always have a shell open in the root of any project I’m ever working on, so bin scripts are very easy to use.

      2. 7

        tabs

        yeah because that’s the problem with Make, it doesn’t use spaces.

        1. 1

          I have seen a few people mention Just. It looks like, while it does have a concept of dependencies, it doesn’t have a way to track if that dependency is satisfied or not, rather it just always runs all dependencies. Does it have a way to detect if the dependency is satisfied? In a Makefile, this is where the file timestamps play a role (and as I showed, we can use this even for tasks that don’t produce files).

          1. 1

            AFAIR, Just always runs the dependencies; it is a simpler mental model. This issue recommends using make in tandem with just when you want incremental runs.
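
            The usual Make-side idiom for making a command-style task incremental is a stamp file; a hedged sketch, borrowing the thread’s Ruby example (file and command names are illustrative):

            ```make
            # Re-run the dependency install only when Gemfile.lock changes.
            .deps-stamp: Gemfile.lock
            	bundle install
            	touch .deps-stamp

            # `make build` skips the install step while .deps-stamp is newer.
            build: .deps-stamp
            	bin/build
            ```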

        1. 2

          One of my favorite things about (GNU?) Make is that it already comes with a bunch of rules, so for a C project you can just do something like this:

          CFLAGS = -std=c11 -O2 -Wall -Wextra
          
          foo: foo.o bar.o
          

          I didn’t like the author’s Ruby Makefile at the bottom of the post, which was just filled with phony rules. I think that a small shell script would be nicer:

          #!/bin/sh

          case "$1" in
              "build")
                  echo "wowee building the program"
                  npm rake gulp grunt npx build
                  ;;
              "deploy")
                  echo "deploying the cool program"
                  scp -r src/ remote:/srv/program
                  ;;
              *) echo "Unknown command $1" ;;
          esac
          
          1. 7

            It’s a trap.

            Every makefile starts beautifully clean. And then you need dependencies. Which are slightly different between OSes. And then you want an out-of-tree build. And then you want to cross-compile. And then… all built-in rules work against you, and the whole thing is a mess.

            1. 2

              That’s true :( In my experience cross-compiling works quite well, just setting CC and adding --sysroot=/blah/ to the CFLAGS, but I usually end up switching to CMake anyway.

              Maybe if you wanted to keep it simple you could write a configure script in shell or Python to generate your Makefile without conditionals and other funkiness.

              1. 3

                Then you would reimplement what’s already written – CMake. But if someone thinks CMake is “too big”, then a good alternative would be Meson, which would be a configure-like (or CMake-like) script written in Python.

                I don’t really see any real arguments for writing a custom configure script, unless the project absolutely needs a custom one, but 99.9% of projects are not that custom.

            2. 3

              Most systems have tab completion for Makefiles.

              1. 1

                That’s the main reason for me to use them.

              2. 3
                1. Those rules do not track header dependencies for C files. So if your project uses any local headers, changing them will not trigger a recompile, which will only confuse people who expect dependency tracking to include headers.

                2. Your shell script will ignore errors in commands, while the Makefile will fail entirely if one command fails. To make the shell script behave like a Makefile, add “set -eu” at the top of the file (“set -euo pipefail” if it’s bash; pipefail isn’t POSIX sh). But then be prepared for people to be confused about how the shell script works, because they won’t notice the “set” directive ;)
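
                On point 1: GCC and Clang can generate those header dependencies for Make automatically; a sketch building on the earlier example (-MMD and -MP are standard GCC/Clang flags):

                ```make
                # -MMD writes a .d fragment per object listing the headers it
                # includes; -MP adds phony targets so deleted headers don't
                # break the build.
                CFLAGS = -std=c11 -O2 -Wall -Wextra -MMD -MP

                foo: foo.o bar.o

                # Pull in whatever dependency fragments exist from prior builds.
                -include $(wildcard *.d)
                ```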

                1. 1

                  But this doesn’t manage the dependency relationship between those. If you just write the shell script, you will have to re-resolve the ruby dependencies, re-resolve the node dependencies, and re-compile the javascript every time you build, all of which will greatly increase the build time. With the Makefile, we detect if those steps are needed or not. Take another look.

                1. 3

                  Makefiles are awesome! They’re one of those rare “learn it for life” tools. You may use meson, npm, or rake now, but who knows in 5 years. Make will likely be around forever.

                  1. 2

                    Yes! Exactly that!

                  1. 4

                    Every time I start out with a Makefile, I end up instead with a bin/ directory full of shell scripts that work much better.

                    1. 2

                      But, how do you manage the dependencies between those scripts? You could still use a Makefile that calls those scripts, and then you get to define the dependency relationship between the scripts.

                      1. 1

                        It usually turns out that the dependencies aren’t usefully managed by a Makefile anyhow, and it’s less painful to just blow everything away and recompute each time if there are any.

                    1. 7

                      I’ve never really seen the point of these kinds of Makefiles, to be honest: they don’t use any of make’s dependency-resolving features (“rebuild only the objects that depend on b.c when b.c changes”) and are essentially just a shell script with extra steps and extra clunky syntax. I’d rather write just a shell script.

                      1. 1

                        What kinds don’t use make’s dependencies? The example I gave does not re-resolve dependencies with bundle unless the file that declares the dependencies has changed. It also does not re-compile the javascript unless any of the javascript has changed. This is huge, in particular because the js and ruby tools on their own do not do this.

                      1. 2

                        Using Makefiles to automate your shell tasks is fine, but please don’t use them to build software applications; in other words, don’t use them in projects you intend others to use at some point.

                        Related article about it: http://anadoxin.org/blog/is-it-worth-using-make.html

                        1. 2

                          I’m not sure what it means for a Makefile to “support Visual Studio”. This type of thinking seems to put the requirement in the wrong place. Visual Studio, or whatever IDE, should be able to run an arbitrary command at least. But, the question that seems worth asking is why doesn’t Visual Studio support Makefiles? I mean, look, if someone is working in a .NET shop, of course just use the tooling that is available and out of the box. But this doesn’t seem to be an argument against makefiles in general.

                          1. 1

                            It means to be able to use the tool with Visual Studio.

                            Even if VS could run a custom command, that’s still not enough. Running a custom command doesn’t inform the IDE about any definitions, include directories, or relationships between files; it’s all hidden in the Makefile. So VS would need to interpret the Makefile directly in order to extract the include directories, defined symbols, linked system libraries, and so on.

                            And CMake supports VS by having the ability to generate *.vcxproj. Those project files contain all necessary information so that VS can have all intellisense features enabled.

                            I agree that make and CMake are different tools, and I’m not requiring that make should support Visual Studio. I’m just saying that it doesn’t, and that for some people it’s a problem.

                            Would it be cool if VS supported Makefiles? Sure. But it doesn’t.

                        1. 1

                          Make isn’t great for running commands, because multi-line commands are difficult, and it can’t modify the environment (e.g. change directories or set environment variables)

                          So I made .go.sh, which I use heavily. It supports both shell functions and Python functions, and M4 macros. See here for a bunch of examples.

                          Make is great for what it’s designed for, though. I (ab)use it plenty; see any of the Makefiles in PoprC.
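
                          For readers who want the flavor of the shell-function approach without the M4 part, the dispatch pattern can be sketched in plain POSIX sh (this is a generic illustration, not .go.sh itself; task names are made up):

                          ```shell
                          #!/bin/sh
                          # tasks.sh -- generic task-runner sketch: ./tasks.sh <task> [args...]
                          set -eu

                          build() {
                              echo "building"
                          }

                          deploy() {
                              echo "deploying to $1"
                          }

                          # Dispatch: the first argument names the shell function to run;
                          # default to "build" when no task is given.
                          if [ "$#" -gt 0 ]; then
                              task=$1
                              shift
                          else
                              task=build
                          fi
                          "$task" "$@"
                          ```

                          Because tasks are just functions, shellcheck can lint the whole file, and you can paste any function straight into an interactive shell to experiment.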

                          1. 3

                            See the .ONESHELL special target. Is this what you want? (i.e. multi-line recipes and cd)

                            1. 1

                              That doesn’t work because it creates a new shell, but I want to modify the existing shell.

                              1. 1

                                I am not sure I understand; are you talking about setting environment variables? That can be done with export in GNU Make.
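
                                A minimal sketch of that (variable and target names are illustrative):

                                ```make
                                # export makes GREETING visible in the environment
                                # of every recipe sub-shell.
                                export GREETING = hello

                                show:
                                	@echo "$$GREETING"
                                ```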

                            2. 2

                              Yeah, it’s not the most intuitive tool, of course, and it does have some peculiarities, but I disagree on the point that you can’t modify the environment.

                              If you need to cd into a directory, you can write that as part of the recipe. And for environment variables, this is also straightforward (note that a leading VAR=value assignment only applies to the single command it prefixes):

                              mytarget:
                                cd some/directory && MY_ENV=$(MY_ENV) ls -ahl
                              
                            1. 1

                              I think Make is a poor choice for a simple wrapper script, and a disastrous choice for a sophisticated one. Your make tasks are shell code. If you don’t need the dependency invalidation, it is better to write a shell script so you only have to deal with one layer of arcane syntax instead of two. You can e.g. run a linter on your Bash inside a bash script, but you don’t get any static analysis of the little snippets of bash you embed in your Makefile. Bash will let you break your code up into functions. The closest thing Make gives you to that is macros, which are an absolute nightmare. If you want to experiment, you can copy and paste directly from your script into the bash prompt. With Make, there is no REPL.

                              If you do need the dependency tracking, Make is also a bad tool. It has two modes: fail silently without any explanation of what tasks it performed and why, or an extremely verbose mode where it will emit millions of lines and it’s up to you to weed out the interesting stuff. Again, you can’t write functions, so there is no capacity for abstraction or code re-use. Also, using timestamps instead of checksums for invalidation will break spectacularly if you are trying to have any sort of build cache over a network.

                              Make is fine until it gets bigger than, say, 200 lines, or until you have to reach for the manual. After that, I would rewrite in Shake, Bazel, or… anything else.

                              1. 1

                                Your points are valid. In my experience, having a dependency tree and a tool that is already available on all systems is worth the trade-offs. Absolutely, there are poor Makefiles and hugely complex Makefiles, but as a wrapper for the common case of node / ruby / elixir projects, where you are just calling out to docker and/or the build tool, plus maybe some ad-hoc commands, a Makefile has the right amount of simplicity to be productive.

                                1. 1

                                  Yeah there’s definitely a sweet spot. I just broke my rule and introduced a 50-line Makefile for a 48-hour hackathon project that relied on a dependent series of JSON files of data scraped from Wikipedia.

                                  (I would have used shake but 48 hour project is not the time to thrust Haskell upon 2 teammates)

                                  But the space of “has a dependency graph, so Bash won’t do” and “not complicated or important enough that it’s worth sacrificing ubiquity and familiarity for a more disciplined tool” is relatively small.

                                  1. 1

                                    And I mean, I wasted probably 20 minutes total dealing with stupid mistakes while putting together and working with even this 50-line Makefile that uses nothing advanced, just because Make has no linter and doesn’t allow the introduction of basic programming practices like functions and variables.

                              1. 2

                                The example is missing a prerequisite:

                                file_1:
                                    touch file_1
                                
                                file_2: <-- it should include "file_1" in here
                                    touch file_2
                                

                                It seems like a typo.

                                1. 1

                                  Thanks, I fixed this!