1. 18

  2. 10

    Actually reproducible Docker build:

    $ niv init
    $ niv update nixpkgs -b nixos-20.03
    

    Create a file default.nix with the following contents:

    with import (import nix/sources.nix).nixpkgs {};
    
    # myapp here would be your own derivation (defined via an overlay or a let binding)
    dockerTools.buildLayeredImage {
      name = "myapp";
      config.Cmd = [ "${myapp}/bin/myapp" ];
    }
    

    Then:

    $ nix-build
    

    And you have a Docker image. This will pin all dependencies, including libc, your Python interpreter, etc. Of course, you can extend it from there with your own derivations, overrides, etc.
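
    nix-build leaves a result symlink pointing at the image archive, which you can load straight into Docker, e.g.:

    $ docker load < result
    $ docker run myapp:<tag printed by docker load>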

    1. 4

      Or without using niv, use this default.nix:

      { pkgs ? import (fetchTarball {
          url = "https://github.com/NixOS/nixpkgs/archive/20.03.tar.gz";
          /* generated with nix-prefetch-url --unpack (channel URL) */
          sha256 = "0182ys095dfx02vl2a20j1hz92dx3mfgz2a6fhn31bqlp1wa8hlq";
        }) { /* any import options, left empty by default */ }
      }:
      
      with pkgs;
      
      let
        myapp = stdenv.mkDerivation {
          name = "myapp";
          src = ./myapp;
          buildInputs = [ bash ];
          installPhase = ''
            mkdir -p $out/bin
            cp bin/myapp $out/bin
          '';
        };
      in
      dockerTools.buildLayeredImage {
        name = "myapp";
        config.Cmd = [ "${myapp}/bin/myapp" ];
      }
      

      Note that myapp/bin/myapp is just

      #!/bin/sh
      
      echo "Hello, world!"
      

      Also note that mkDerivation will automatically patch myapp’s interpreter to point into the Nix store:

      /nix/store/j6l0bfcylln7s74s4jvz1mbd91nzgcsg-myapp/bin/myapp: interpreter directive changed from "/bin/sh" to "/nix/store/7ygifkrn7sbirhi53rh1bpq9cym6mjy9-bash-4.4-p23/bin/sh"
      
      1. 1

        Won’t that fail when the channel updates and its sha changes? I normally pull a specific rev tarball from the nixpkgs github.

        1. 1

          In this case 20.03 is a tag that only gets made when 20.03 is released, so it functions identically to pinning a revision; I don’t think they ever force-push tags. I think you’re thinking of the release-20.03 branch, which is what the channel tracks.

    2. 6

      This has the same problem as referencing a Docker image by hash:

      RUN apt-get install -y nginx=1.14.2-2+deb10u1
      

      There will eventually be another nginx release, and eventually the version you pinned will disappear from the repository. To make this work you’ll need to maintain an apt repository that retains every version you’ve ever pinned.

      The “starting point” advice at the bottom is good, however.

      1. 2

        The “starting point” advice at the bottom is good, however.

        I’m not sure if it is, tbh. Yes, pinning versions of packages in pip’s requirements.txt is good advice if you want to rebuild something and know it will work. But it’s bad advice if you care about the security fixes and bug fixes that newer versions can bring (along with the risk of breaking changes/new bugs).
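
        For instance (package name illustrative), the two styles of requirements.txt entry trade off differently:

        requests==2.25.1   # pinned: reproducible rebuilds, but frozen until you re-pin
        requests>=2.25     # ranged: picks up fixes automatically, but builds can change under you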

        There’s a balance somewhere in there between “never update anything, ever” and “edge or GTFO”. How do you find it?

        1. 3

          Finding that balance is definitely tough.

          I think pinning is good for reproducibility, but it’s important to bump your dependencies to whatever is getting security fixes. Unfortunately not every package has a long-term-support channel.

          1. 2

            Yes, you have to periodically re-pin at the least.

            Also, just having a full list of your dependencies is useful. For example, GitHub can generate security notices if you’ve pinned a known-vulnerable version. Dependabot knows how to automatically issue update PRs when new releases occur.
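
            A minimal .github/dependabot.yml along these lines is enough to get those update PRs (the ecosystem and schedule here are just examples):

            version: 2
            updates:
              - package-ecosystem: "pip"
                directory: "/"
                schedule:
                  interval: "weekly"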

            1. 2

              That’s why you need tests, and lots of them: then you can safely bump dependencies and have your CI pipelines catch issues that arise (if you are using GitHub, Dependabot’s PRs will get automatically tested).

              Of course, some dependencies (those interfacing with outside systems) will be mostly mocked out in tests, so you will still need to do some manual testing if those get updated.

            2. 2

              The longer I do this, the more I think I need a way to get notified by dependencies when they patch a security issue, so I can make sure to apply it.

              That’s my path to get off the treadmill of edge while keeping security patches applied.

              I’m a fan of the counterintuitive approach proposed by one of the Go package managers: installing the oldest version that matches your dependency list instead of the newest.

              This reduces the need for a lock file (older versions do not suddenly appear) and tends to give you combinations of software that are compatible with each other.

              The downside is that you now need a way to figure out which versions you can/should bump, and tooling to apply those changes to your dependency list. However, that tooling exists.
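
              In Go module terms (module path illustrative), a requirement is treated as a minimum, and minimal version selection resolves it to exactly that version rather than whatever the latest v1.x happens to be:

              module example.com/myapp

              go 1.21

              // resolves to exactly v1.2.0 unless some other dependency requires a newer one
              require example.com/somelib v1.2.0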

              1. 2

                GitHub will do this for you if your dependency-spec file (for any of various languages) is checked in.

                For Python specifically, there are tools like pyup and requires.io that will notify you of new releases of dependencies (and which ones are security issues). I’m sure there are similar services for other languages.

              2. 2

                I recommend pinning everything and having automation to update those pins. A tool like https://github.com/renovatebot/renovate can automatically create pull/merge requests and merge them if the tests pass.
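
                A small renovate.json along those lines might look like this (the automerge rule is just one way to set it up):

                {
                  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
                  "extends": ["config:recommended"],
                  "packageRules": [
                    {
                      "matchUpdateTypes": ["minor", "patch"],
                      "automerge": true
                    }
                  ]
                }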

                1. 1

                  There are other articles on the site about doing security updates and the like, but it’s true it’s not mentioned there. I’ll try to update it with some links tomorrow. Or, perhaps, that may be a whole new article to write…

                2. 1

                  I like to use https://hub.docker.com/r/debian/snapshot, which is based on a specific timestamp of snapshot.debian.org.
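
                  Something along these lines (the tag is illustrative; the repo publishes date-stamped tags per Debian release):

                  FROM debian/snapshot:bullseye-20210101
                  # apt inside the image points at snapshot.debian.org for that date,
                  # so pinned versions like the nginx example above stay resolvable
                  RUN apt-get update && apt-get install -y nginx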