I started reading this long before I saw who the author was. I’ve followed Raph’s work for some time — this is the most puzzling thing I’ve read. With the mix of obviously true statements and less clear ones (and his comment here) it still isn’t clear to me if this is satire or not.
I think it’s likely true he is a good enough programmer to do good work in a language without memory safety. I am not. Using Rust has taught me a lot and the compiler being so picky has greatly improved the quality of the apps I’ve written in it.
The complaints about the Cargo monoculture are spot on though. It does a good job of some things, but it is terrible at others. It is also openly hostile to efforts to make it play nicer with real whole-project build tooling.
Ok, I’ll come clean. It is satire, but intentionally written in a way to be very persuasive, including starting with some true things. I am not a smart enough programmer to avoid memory errors, though I am interested in having tools where I’d be able to prove the necessary invariants.
The Cargo monoculture is a complex topic. For small projects and new programmers, Cargo is amazing, and there are few tools in its class. For larger projects where you need to compile multiple languages, it gets limiting fast. I think Bazel is a good step in the right direction, but not friendly enough to be used effectively at small scales.
It is satire, but intentionally written in a way to be very persuasive […]
In that case, mission accomplished.
I started reading it not realizing the date and everything made sense…until it didn’t and I wasn’t sure if you’d lost your marbles or I was totally out of the loop. Then I saw the date and realized I’d been bamboozled. Then I started doubting that conclusion all over again since there was more true stuff in the middle and end too.
thanks for owning up to that <3

yeah the Cargo monoculture worries me a bunch. I use it via nixpkgs, which pulls it apart and recasts it in a frame where I can substitute things uniformly when I really need to, but even so, this whole thing about language-specific package ecosystems feels very… isolationist
I think Bazel is a good step in the right direction, but not friendly enough to be used effectively at small scales.
I would argue an even bigger issue with Bazel, when trying to use it instead of Cargo, is that it’s a monorepo-centric build system without a package manager, where you are expected to vendor all your dependencies.
What I believe is needed is a real build system that is capable of building foundational C/C++ libraries (to replace build.rs) plus a language-agnostic/multi-language package manager to provide the functionality of Cargo.
a language-agnostic/multi-language package manager
I am not sure that’s possible, if you also want the thing to be more specific than “runs arbitrary code”. Languages are far too different: a Rust library, a C library, and a Java library are very different things, and shoving one into the other is not possible without fundamentally hacky solutions (speaking from experience: I tried shoehorning Rust’s model into IntelliJ’s JVM model, and Kotlin into CMake).
I would say that language-specific package managers should exist, but a stricter separation between “package manager binary” and “package specification” should be made. For Rust, that means that the contents and semantics of .crate archives, the registry API, version resolution, and the build process should be documented public API, such that it’s easy to provide alternative implementations. Cargo itself should be seen as one of the consumers of packages, rather than a thing that defines what the package is.
This is a very good point—you explain well why language-specific package managers are a good and necessary thing.
I think there is a different problem here that people are conflating too: package managers and packaging tools. Actually there are three parts: a tool to download dependencies for a project, a build system that assembles them all, and a packager that bundles the results up in a way that can be consumed. I think cargo gets in trouble by trying to do all three, and in reality it only does the first two well and the third terribly. Because they are really stuck on 1 being the same as 3, they refuse to improve.
I think language-specific tooling for the first two is generally good. And for libraries, the third isn’t really needed. But when the project is an app or service, assuming Rust is the only language involved and hence Cargo should be able to do everything is a bad move. My apps may have lots of other resources besides a single binary that need to be installed, and shipping that in a way that is easy both for end users to install from source and for distros to bundle into a package is hard. Rust projects make it extra hard.
shoving one into the other is not possible without fundamentally hacky solutions
Why do we need to shove one into the other? I don’t think making, say, a library for every language look the same is a requirement. The build system/package manager just needs to have a general enough model to allow building things for various languages. There are some tricky aspects, I am not denying it. For example, Rust allows multiple versions of the same crate to exist in a single build while C/C++ (normally) does not. But I don’t think this is insurmountable.
I would agree though that trying to generalize something like CMake into supporting this is most likely futile; something new needs to be designed with this as a requirement from the ground up. Which is what we’ve been doing in build2. We have fairly complete support for C/C++ (we are able to build libraries like Boost and Qt along with their dependencies on all the major platforms), we have support for “bash modules” and have bash packages published, and we have nascent Rust support. So far, I don’t see why a language-agnostic/multi-language build system and package manager would be fundamentally impossible.
I firmly believe that programming languages should be free of politics, focused strictly on the technology itself. In computer science, we have moved beyond prejudice and discrimination. I understand how some identity groups may desire more representation, but I feel it is not necessary.
This is a really harmful way to end a satirical blog post.
After posting this, I realize that the timing is unfortunate; I think a lot of people are going to assume it’s an April Fools piece, just because of the date.

Is it not?

It is. The author admitted it a few hours later.

[Comment removed by author]

Thanks, I hate it.
I’m trying this as well. Looked at conan, hunter, vcpkg, bazel, meson, cmake, etc. Then I wondered: why complicate things? Why not just install whatever the OS package manager gives me? Why not create a simple makefile to build the whole project and ccache to speed it up?
Why not just install whatever the OS package manager gives me?
Because that doesn’t work on OSes that don’t come with a package manager (saying they suck so far hasn’t solved the problem nor reduced their popularity).
Combined with a “simple makefile” it’s also not very robust on OSes that have some package manager, but don’t use /usr/lib and/or pkg-config or whatever the Makefile assumes.
It makes the build require manual steps, which can’t be documented in a straightforward way, because they’re OS-dependent.
It gets even trickier when the code is being compiled for a different OS than the host.
yeah, it’s a serious problem

there are several package managers these days that exist separately from the OS (homebrew, cygwin, nixpkgs) which is certainly a nice solution when it works, but they all have their own limitations and none are fully cross-platform, as yet
I agree with all of this. Previously I had a provision_.sh for each OS and variant I needed to target. It works for internal projects that run on a few computers.
Every OS comes with a package manager. At least all the popular ones.

Windows, Android, iOS, and macOS add up to pretty much all of the client OS market. None of these provide a package manager that can install shared libraries out of the box and two of them have application sandboxing models that mean that there aren’t usable add-on ones that do.
Exactly. This blind spot for Windows (“what do you mean there’s no pkg-config!? Why don’t you like cygwin!?”) is why I roll my eyes when people praise C for portability. It can run on every computer in a museum, but half of software doesn’t build on the most popular desktop OS.
CMake with vcpkg has worked well for me (I presume Conan is similar). It will build with the Visual Studio toolchain and it builds all of the dependencies with the same toolchain so that you don’t get ABI mismatches (on Windows, there’s no stable C++ ABI). Writing a Makefile by hand that does this is a problem: the flags for the compiler are different, and even installing a POSIX Make is non-trivial (and you are probably using some GNU Make extensions if you want incremental builds to work right).

[Comment removed by author]

That’s why I said:

Both vcpkg and Conan are package managers.
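(For reference, a rough sketch of what the CMake side of that vcpkg setup can look like; fmt is just a placeholder dependency here, and the vcpkg path stands in for wherever your checkout lives:)

    # Configure with vcpkg's toolchain file, e.g.:
    #   cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg>/scripts/buildsystems/vcpkg.cmake
    # Packages installed by vcpkg then resolve through ordinary find_package().
    cmake_minimum_required(VERSION 3.16)
    project(myapp CXX)

    find_package(fmt CONFIG REQUIRED)

    add_executable(myapp src/main.cpp)
    target_link_libraries(myapp PRIVATE fmt::fmt)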
macOS has .pkg files (see: https://www.manpagez.com/man/8/installer/osx-10.4.php), Windows uses MSI packages and they’re managed through the Add/Remove Software UI. Both of these can be (and are) used to manage the installation of shared libraries.
Android and iOS don’t support end-user installed shared libraries.
I don’t count Apple’s pkg files as a package manager because they don’t track dependencies and, crucially, they don’t have an uninstaller (some folks have tried writing third-party ones). MSIs, similarly, don’t have dependency tracking. I can’t say ‘I depend on package X’ and have the MSI infrastructure fetch it. I can package things as MSIs, but there’s no mechanism to say ‘I depend on libfoo, so please make sure libfoo version >= 4 is installed’.
In contrast, with FreeBSD’s pkg, Debian’s apt, RedHat’s yum, and so on I can do exactly that and ship a .pkg / .deb / .rpm that pulls in my other dependencies from either the main repo (if they exist) or my repo (if I provide one). On macOS, I can do this via Homebrew, but it’s not part of the core OS. I am not sure if winget supports this on Windows.
I can use vcpkg or Conan on all of these platforms to build libraries that I can bundle with / link against my application, but then I’m not able to use the same shared library that other things may be using.
Lately, the thing that has worked best for me is CMake + submodules, where the submodules also use CMake, and I do add_subdirectory in CMakeLists.txt. I also have a little ‘run.sh’ script in the root that does the CMake invocations or other things.
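(A minimal sketch of that layout, assuming a submodule under vendor/ that ships its own CMakeLists.txt and exports a raylib target; the project and path names are placeholders, not taken from the thread:)

    cmake_minimum_required(VERSION 3.16)
    project(myapp C)

    # Each dependency is a git submodule with its own CMakeLists.txt;
    # add_subdirectory() pulls its targets into this build.
    add_subdirectory(vendor/raylib)

    add_executable(myapp src/main.c)
    target_link_libraries(myapp PRIVATE raylib)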
CMake confuses me more than autotools. I can’t even grok the basic grammar of CMakeLists.txt. And why is it a .txt? And what is it a list of?

So beyond me.

The CMake language was originally a very simple macro language. It was translated with simple macro expansion and so didn’t even have a proper concept of scopes. This is how you ended up with things like endif() needing the same macro arguments as the if() that it matched. The CMakeLists.txt was a list of macros to apply to generate a build system.
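(Roughly what that looked like; ENABLE_FOO is just a placeholder option:)

    # Old-style CMake: endif() had to repeat the condition of its matching if(),
    # a leftover of CMakeLists.txt being expanded as macros with no real scoping.
    if(ENABLE_FOO)
      add_definitions(-DFOO)
    endif(ENABLE_FOO)

    # Modern CMake accepts a bare endif():
    if(ENABLE_FOO)
      add_compile_definitions(FOO)
    endif()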
It’s gradually evolved into a not-terrible build configuration language and if you learn modern CMake then you can avoid a lot of the warts, but there’s still a lot of the legacy embedded in the system. It also suffers a bit from the fact that the CMake language used to be far less expressive and so there are a lot of things done in hard-coded bits of logic that should be in packages (e.g. deciding to pass the flag to cl.exe that specifies C or C++, meaning that you can’t then use any other language that clang-cl.exe supports).
CMake is one of the things that I use because I dislike the alternatives more.
I’ve recently used xmake and there are a few things I really like about it (not least that it starts with Lua as the config language), but there are a lot of problems with it. It doesn’t really have a clear abstract model of dependencies and it is not extensible in places where it should be (you can’t define a new kind of target, for example). It also defaults to doing in-tree builds, which I thought we’d all established was a terrible idea 20 years ago.
Have you tried meson? I’ve been meaning to try it more.

No, but a Python dependency in my build system is a non-starter for me. Python is the cause of enough packaging and versioning problems that I can’t imagine it being the solution to any.
Yeah the language doesn’t make much sense to me and I just remember the incantations that have been relevant for me haha. The main thing is just that almost all dependencies I’ve wanted to use have had a CMakeLists.txt. And when they don’t it’s easy enough to create my own library target at the top-level and include their sources directly.
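(A sketch of what that can look like, with entirely made-up names; somedep stands in for whichever dependency lacks a CMakeLists.txt, and myapp for the top-level target:)

    # Wrap a vendored dependency that has no CMake build of its own
    # in a library target defined at the top level.
    add_library(somedep STATIC
      vendor/somedep/src/somedep.c)
    target_include_directories(somedep PUBLIC vendor/somedep/include)

    # Then link it like any other target (myapp defined elsewhere via add_executable).
    target_link_libraries(myapp PRIVATE somedep)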
Have you tried FetchContent? It can download tarballs/clone Git repositories at configure time and add them to the build. There’s also an argument to make it run find_package first and only do the clone if that fails, which can make it easier for people to use the system version of a library (if one exists).
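(A minimal sketch of that pattern; fmt is just an example dependency, and FIND_PACKAGE_ARGS needs CMake 3.24 or newer:)

    include(FetchContent)

    # Try find_package(fmt) first; only clone and build from source if that fails.
    FetchContent_Declare(fmt
      GIT_REPOSITORY https://github.com/fmtlib/fmt.git
      GIT_TAG        10.2.1
      FIND_PACKAGE_ARGS NAMES fmt)
    FetchContent_MakeAvailable(fmt)

    # Either way, the usual imported target is available (myapp defined elsewhere).
    target_link_libraries(myapp PRIVATE fmt::fmt)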
I’ve considered it; but I like that submodules are visible in GitHub and to me they feel more predictable (to the extent that submodules do…). It basically feels like vendoring but being able to upgrade the upstream version slightly more easily. Re: using the system library when it exists – I actually do want the predictability that vendoring gives me in most cases. And I like the bonus of being able to browse the source code of deps within the same tree, and “go to definition” in my editor works to jump to the source of deps. Sometimes I may have to fork a dep or just try making a temporary change for debugging (the reality of software dev…) and having the code in-tree is really nice for that.
But yeah I think there are situations where FetchContent makes sense probably – maybe if your repo tends to itself be a dep or something; or yeah when you do want to use system libraries when available. My projects tend to be applications and so vendoring all the dependencies in and having predictability has been good.
this has some setup, but at least sounds clean. Guess I can also use git submodules together with cmake submodules in order to pin versions.

Here’s an example of how I tend to do things – https://github.com/nikki93/raylib-template. The ‘run.sh’ works across macOS, Windows and Linux for me (I use WSL on Windows).
I should write one explaining how I’ve learned to love Python packaging.
I literally never had an issue with Python packaging. But then again I don’t maintain a library.
The April 1st window for this year is gone but I can do one about how I fell in love with autotools again for next year!

    python -m build .
    twine upload