“We need a more complex build system” seems to be a pretty common trap. “We had our reasons” is always offered, but some years later there’s always a monumental undertaking to replace it. There are a few projects I’m thinking of where building resembles an iterative process of building tools to build tools that generate Python scripts that create makefiles that build tools… “But that was the only way to create a debug build for Windows!”
You think you have a problem nobody else has? You open this door, you’re really going to have problems nobody else has.
I’ve seen this pattern several times, and as a junior I often blamed my own naivety and trusted more experienced people. Still, it always felt like we were doing something very complicated to achieve something that should be simple. In the end I’m always told that this is because it’s more flexible and will adapt better if the packaging needs shift. Which I find odd, since it makes the whole company slower every day for a “future” need that nobody can describe.
So exactly like you said: it seems like a good idea at first, but in the end everybody has problems with it…
I have no data to support this theory, but I feel like this mostly happens because people don’t want to learn CMake (or don’t know that they should).
Yes, people should learn CMake, if only to learn how awfully limiting it is.
And you would suggest?
For real-world projects it depends on the use case; after all these years there is, unfortunately, still no one-size-fits-all C/C++ build system. For personal projects I usually use non-recursive GNU make.
I keep hoping that something like tup (but a little less minimalist) will take the stage one day, but it hasn’t happened yet.
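For anyone unfamiliar with the non-recursive style: the idea is that a single top-level Makefile includes small per-directory fragments instead of spawning a recursive `$(MAKE) -C` per subdirectory, so one make process sees the whole dependency graph. A minimal sketch (file names and flags are made up for illustration):

```make
# Top-level Makefile: one make invocation sees every dependency.
CFLAGS  := -O2 -Wall
PROGRAM := app
SRCS    :=

# Each subdirectory contributes its sources via an included fragment
# (e.g. src/module.mk contains just: SRCS += src/main.c).
include src/module.mk
include lib/module.mk

OBJS := $(SRCS:.c=.o)

$(PROGRAM): $(OBJS)
	$(CC) $(CFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

clean:
	rm -f $(PROGRAM) $(OBJS)
```

Because there is only one make process, dependencies that cross directory boundaries are tracked correctly, which is exactly what classic recursive make gets wrong.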
On the other hand, if you never face problems that nobody else had before, you are not doing anything new.
Here’s an interesting blog post on Debian versus current development practices, from a Debian perspective.
Great read. It’s a well-reasoned opinion.
This will be my favourite quote for the next 7 years.
I’m young again! :-)
“When a distribution starts messing with your dependencies, all your QA goes out the window.”

Guix addresses this by running the package’s tests as part of the build.
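Concretely, for packages using `gnu-build-system` the upstream test suite runs in the build’s `check` phase by default, and opting out has to be spelled out. A sketch of a hypothetical package definition (name, URL, and hash are placeholders, not a real Guix package):

```scheme
;; Hypothetical package definition, for illustration only.
(define-public foo
  (package
    (name "foo")
    (version "1.0")
    (source (origin
              (method url-fetch)
              (uri "https://example.org/foo-1.0.tar.gz")
              (sha256 (base32 "0000000000000000000000000000000000000000000000000000"))))
    (build-system gnu-build-system)
    ;; The 'check' phase runs the test suite by default; skipping it
    ;; requires an explicit (arguments (list #:tests? #f)).
    (synopsis "Example package")
    (description "Example package whose test suite runs at build time.")
    (home-page "https://example.org")
    (license license:gpl3+)))
```

If the tests fail, the build fails, so a package that the distribution has patched in a way that breaks upstream behaviour never makes it into the store.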
“Developers love npm or NuGet because it’s so easy to consume – asking them to abandon those tools is a significant impediment to developer flow.”

Guix is aware of this issue too and provides import tools to address it… but it’s still not enough for application development and deployment.
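For reference, the importers generate a Guix package definition from an upstream ecosystem’s metadata; the invocation looks roughly like this (package names chosen arbitrarily, and the available importers depend on the Guix version):

```
# Emit a package definition skeleton from upstream metadata.
guix import pypi requests       # from PyPI
guix import crate serde         # from crates.io
guix import gem nokogiri        # from RubyGems
```

The output still usually needs manual review (licenses, hidden dependencies, tests), which is part of why it doesn’t yet match the frictionless feel of npm or NuGet.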
I don’t think it’s unreasonable (from an application developer’s perspective) to want to bundle an ever-increasing number of dependencies. I read an article yesterday advocating bundling an extra virtual machine as a build step, like that was a normal and sane thing to do.