1. 20

  2. 8

    I think the convergence is happening and I’m largely unhappy with it so far.

    I’m not necessarily unhappy with the idea of all of these things together, but I’m unhappy that each language is doing it themselves and doing it poorly. Maven is a nightmare of pain. PIP is stupid. Gems are hairballs. Cabal is hell. Rebar is just crap. Go’s package manager is a joke.

    If one does any polyglot development (microservices make that OK, right?), one has to learn a bunch of tools that are all pretty broken in unique and terrifying ways. On top of that, it’s not possible to depend on the output of another language, for example writing a program that runs a program written in a different language. Part of the problem is up top: every OS has its own package manager, development is entirely language-driven, individual developers rarely write more than one language, and they don’t want to learn different package managers for everywhere they might deploy it.

    So if we’re going to break down these walls, can we at least do it right, and solve it in a language-agnostic way? The development tools I’ve built are really just translations to Makefiles. They take a project as input and produce a Makefile that can build it and that composes with other Makefiles. Makefiles are not perfect, and I’d be happy to produce something else, but they are the best I have seen so far. The value is that if I have N languages, and all of them produce Makefiles, I can build the whole thing together and it Just Works.
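
    To make that concrete, here is a minimal sketch of the composition idea (the fragment names and targets below are invented for illustration, not taken from any real tooling):

    ```make
    # each language's tool emits a Makefile fragment for its part of the tree
    include frontend/js.mk    # hypothetical fragment generated for the JS code
    include backend/go.mk     # hypothetical fragment generated for the Go code

    # because it is all one graph, a target here can depend on targets that
    # another language's fragment knows how to build
    release: frontend/bundle.js backend/server
    	tar czf release.tar.gz frontend/bundle.js backend/server
    ```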

    1. 7

      Go doesn’t have a package manager. They punted on it completely since it wasn’t in their problem domain. They have an abstraction over VCS which is convenient but not a package manager. This was probably smart on their part.

      The Go community has put out a few package managers, but it hasn’t really settled on one that wins yet. I think that’s partly because building everything from tip worked remarkably well for a long time. For most of my projects it still works.

      The problem with a language solving the whole package-management thing is that almost no project or app is a single-language application anymore. Which means that in the context of an application, what you really need is a package manager that understands JavaScript, Go, Python, … And when you widen the scope to whole systems it gets even crazier.

      At some point you end up with a package manager that overlays the language package managers and tries to resolve the dependency graphs between the Ruby gems and the npm packages. And neither RubyGems nor npm really knows anything about the other’s dependency graphs.

      This same principle holds for build systems as well. For an application you need to build both the JavaScript and the {Go,Ruby,Python,…}. Indeed, in the Ruby and Python case, the “build” and the “packaging” are in some sense the same thing, since they aren’t compiled.

      1. 1
        1. 2

          This is a case of ambiguous terminology. The package here is the one defined by the package keyword, which is not at all the same thing as Python’s pip, Ruby’s gems, or Node’s npm. It’s not a manager; it’s a convenient fetcher, if you happen to have stored your Go package’s source at the correct location and named your import path correctly. But it is not a package manager in the sense that this article uses the term.

          1. 3

            You say tomato, I say tomaaaato. I don’t really see the difference between what this article talks about and go get. go get downloads dependencies and builds them. This is a primary function of most package managers. It happens to be really terrible at the job, but it’s still serving the same function.

            1. 2

              It’s really good at downloading and building the dependencies of a package. What it isn’t good at, and in fact doesn’t do, is versioning and resolving dependency conflicts, which I think is the piece that makes a package manager useful for most people. Otherwise all package managers would just be a thin veneer over wget and the compiler.

              go get is a thin veneer over wget and the compiler, and I believe it was specifically made to be no more than that. I.e., the Go developers didn’t want to solve the package-manager problem with go get. They just wanted to make it easy to fetch and compile head for a Go package, in the language-spec sense of the term.
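
              As a rough sketch (the import path is made up, and this elides go get’s recursive fetching of a package’s own dependencies), the whole thing boils down to something like:

              ```make
              # approximately what `go get github.com/user/pkg` amounts to:
              # derive a VCS URL from the import path, clone into the matching
              # directory under GOPATH, then compile and install
              PKG = github.com/user/pkg
              get:
              	git clone https://$(PKG) $(GOPATH)/src/$(PKG)
              	go install $(PKG)
              ```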

              1. 2

                I just don’t think you’re making a very meaningful or useful distinction. But to each their own.

      2. 5

        [..] I’m unhappy that each language is doing it themselves and doing it poorly [..]

        [..] can we at least do it right, and solve it in a language-agnostic way [..]

        See nix: http://nixos.org/nix/

        I already used it to replace most of the package managers you’ve listed on my system. I’m also looking into using it to replace npm via https://github.com/adnelson/nixfromnpm (generating flat dependency trees instead of the insane thing npm does)

        Not to mention the Haskell infrastructure is kickass, and way beyond what cabal and stack can do.

        For people who use nix, this is already a solved problem! Just need more manpower to bring all of the package sets up to the quality of the Haskell ones.

        1. 1

          Yep, Nix is great. While it is not a build system, it at least makes it easy to call out to anything that builds. I really need to work on getting Nix to work on FreeBSD again :(

        2. 4

          Bazel is a polyglot build system.

          1. 1

            I love Bazel. I recently contributed Mono C# support to it, since I’ve been doing .NET stuff lately and the msbuild tooling sucks.

          2. 1

            I’m happier with the state of all those ecosystems than I am with C’s. I think I’m largely happy with it.

            Nix could be nice if I could use it without changing OSes, and if I could trust that I wasn’t going to run into something they haven’t covered.

          3. 4

            I think the convergence is happening and I’m largely happy with it. I don’t see a lot of value in this kind of separation.

            If you do want to be able to do this kind of thing I think you have to limit how much circularity is allowed. E.g. perhaps only types within the same package/module may circularly depend on each other, and the dependencies between packages/modules are required to form a DAG. (Most build systems already require that the dependencies between build-level modules/projects form a DAG). Then an IDE can make a call to rebuild a module and all downstream modules.
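
            For what it’s worth, make already behaves this way at the target level: hand GNU make a cycle and it drops an edge with a “Circular … dependency dropped” warning rather than refusing to build. A contrived sketch:

            ```make
            # a deliberate cycle between two targets; GNU make warns that the
            # circular dependency is dropped and proceeds with the remaining DAG
            a: b
            	touch a
            b: a
            	touch b
            ```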

            But even then you end up duplicating the dependency-resolution logic, because dependency-resolution of source files within a build project is substantially the same problem as dependency-resolution of multiple projects in a build. And what does it gain you?

            Look at it from the other side: what are the natural split points? Maybe a code module or even an individual class could be the same thing as a project. Maybe every build could be a release, every class or module having its own version number - compare how, in the age of git, what would have been a single commit in CVS is now more likely to be a branch, and almost every edit becomes its own commit.

            At that point there’s no build system per se - every build is managed by the package manager, and the only split is between “compile single package” and “manage dependency graph and release process”. That can fit with the IDE. The main problem is the practicality of naming things, but I can see that working again in git-like fashion - every build has a hash ID, and the ones we want to “release” can be given meaningful names via tags/branches.

            1. 2

              But even then you end up duplicating the dependency-resolution logic, because dependency-resolution of source files within a build project is substantially the same problem as dependency-resolution of multiple projects in a build. And what does it gain you?

              I don’t really understand what you’re saying here. We can split the problem into two subproblems:

              1. Walking a dependency graph and performing some work.
              2. Discovering the dependencies of something.

              The first problem can be solved entirely generically. The second one is up to a language or system to be able to generate. There is no duplication of logic here.
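
              make is an existence proof of that split: the walker doesn’t care what language a node is in, only what its prerequisites are and what command produces it. A hypothetical two-language graph (the paths and commands are invented for illustration):

              ```make
              # one generic walker, two languages; only the recipes differ
              app: web/bundle.js srv/server
              	./package.sh web/bundle.js srv/server   # hypothetical packaging script

              web/bundle.js: $(wildcard web/src/*.js)
              	cd web && npm run build

              srv/server: $(wildcard srv/*.go)
              	go build -o srv/server ./srv
              ```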

              1. 1

                The second one is up to a language or system to be able to generate.

                Sure - but drawing a distinction between solving it for local modules and solving it for remote ones (i.e. between a build system and a language package manager - I’m not really interested in OS package managers one way or the other) is, I think, artificial.

                1. 1

                  In what way? Are you assuming local and remote are in the same language? If they are in different languages, is there not a meaningful distinction?

                  1. 1

                    Are you assuming local and remote are in the same language?

                    Yes - I’m thinking of something like maven or npm or pip.

                    1. 1

                      Do you still think it’s artificial if your packages are in different languages? Or if they depend on binary packages? I write a lot of software that is in language X but needs binary package Y installed, or depends on something from another language. Examples could be a Python program that needs a Java program installed to work, or something like libpcre.

                      1. 2

                        I think on a theoretical level it’s still artificial - e.g. if you update the version of Y in your IDE then ideally you want it to be able to run only the part of your testsuite for your X program that depends on Y. In practice I can see using a clever language-aware dependency manager when building your X program and falling back to a lowest-common-denominator tool like make for bridging the gap between X and Y, but having more information than make gives is valuable and we should be looking to extend that cross-language rather than reducing everything to the level of make.

                        I think this proposal enshrines the “files = build targets” notion of make, which is the wrong abstraction, and isn’t structured enough to be truly useful to IDEs. If I were writing my own strawman I’d look to decouple the build definition from the implementation - so the definition would have something like “target foo is built from foo.c and bar.c as step 12345”, and 12345 would be an abstract entry in a registry representing the concept of “compile as C99”. An IDE might know that it had a native implementation of 12345, or it might shell out to some shell-based implementation (perhaps there would be some kind of registry of shell implementations, and a parallel registry of windows-based implementations - of course custom build steps might only be implemented on one platform, or might only have an implementation in some company’s internal systems).

                        Hopefully in this model it would be possible to separate “build this executable and run it to generate some source” from “build this executable as an end target”, which could make cross-compilation less of a mess. But I haven’t thought about this too much.
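
                        Abusing make syntax just to sketch the decoupling (step_12345 and its binding here are invented for illustration): the definition names only an abstract step, and the implementation is bound separately, so an IDE could swap in a native one.

                        ```make
                        # the registry: one possible binding of abstract step 12345 ("compile as C99")
                        step_12345 = cc -std=c99 -o $(1) $(2)

                        # the definition: says only what foo is built from and which step builds it
                        foo: foo.c bar.c
                        	$(call step_12345,$@,$^)
                        ```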