I know Rust has many advantages over C++, but I’m already worried about the compile times…
Chromium has one of the longest compile times of any open source project; I'm not sure Rust would make it substantially worse.
As a Gentoo user I regularly witness hours-long chromium compile times and know the pain all too well, even when it’s running in the background. Isn’t it scary to think that we might reach new extremes in chromium compile times now?
From what I've heard, this might not be a relevant metric for the Chromium project? AFAIU most of the contributors compile "in the cloud" and don't care about downstream compilation :/
I have first-hand experience with this. My compiling machine was a jaw-droppingly beefy workstation which sat at my desk. A cached compilation cycle could take over an hour if it integrated all of Chromium OS, but there were ways to only test the units that I was working on. Maybe this is a disappointing answer – Google threw money at the problem, and we relied upon Conway’s Law to isolate our individual contributions.
Chromium's build tool Goma supports distributed remote builds. So as long as you have a server farm backing Goma, the build is actually pretty fast.
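For reference, pointing a Chromium checkout at Goma was roughly a matter of GN args. This is a hedged sketch from memory (the path is a placeholder, and Goma has since been superseded by reclient's `use_remoteexec` flag — check the current Chromium build docs):

```
# args.gn — illustrative sketch, not guaranteed current
use_goma = true
goma_dir = "/path/to/goma"  # wherever the Goma client is installed
```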
Similarly, at Mozilla they used distcc+ccache with Rust. So the compilation has always been distributed to the data center instead of running locally.
Either you own your computers or you do not. If I need a datacenter/cluster to make software compilation at least half-bearable, the problem is software complexity/compilation speed, not too little computational power. And even in the cloud it takes many minutes, even for incremental builds.
The first step to solving a problem is admitting there is one.
The reason Chrome exists is to allow a multinational advertising company to run their software on your computer more efficiently. Slow compile times are not a problem they experience and they are not a problem they will address for you.
I agree that software complexity has grown. I don't necessarily think of it as a problem, though. Chromium to me is the new OS of the web. You build apps running on top of this OS, and there are complex security, performance, and multi-tenancy concerns.
IMO, modern software has gotten complicated, but that simply follows the natural growth of technologies as they mature and progress over time. You could solve this "problem" by inventing better tools and better abstractions that help you navigate the complexity. But I don't think reducing the complexity itself is always possible.
This is where I fundamentally disagree. There are many good examples showing that complexity is often unnecessary. Over time, software tends to build up layer upon layer, often for no reason other than historical growth and cruft.
The system will not collapse, I think, but fall into disrepair. Currently, there is enough money in companies to pay armies of developers to keep this going, but now that we are deep in a recession and possibly heading into a depression, it may be that the workload to feed these behemoths exceeds the available manpower. That might then motivate companies to give more weight to simplicity, as it directly affects the bottom line and competitiveness.
Systems tend to collapse, replacing complex mechanisms with simpler equivalents. This used to be called “systems collapse theory” but apparently is now called collapsology. For example, we are seeing an ongoing migration away from C, C++, and other memory-unsafe languages; the complexity of manual memory safety is collapsing and being replaced with automatic memory management techniques.
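As an aside, in Rust's case the "automatic memory management technique" is compile-time ownership rather than a garbage collector. A minimal sketch of the idea:

```rust
fn main() {
    // `buf` owns its heap allocation; no manual free, no GC.
    let buf = vec![1u8, 2, 3];
    let sum: u32 = buf.iter().map(|&b| u32::from(b)).sum();
    assert_eq!(sum, 6);

    // Ownership transfers to `moved`; using `buf` afterwards would
    // be a compile-time error, not a runtime use-after-free.
    let moved = buf;
    drop(moved); // memory reclaimed deterministically here
}
```

The complexity of manual `malloc`/`free` bookkeeping doesn't vanish; it moves into the compiler, which is arguably the "simpler equivalent" the parent describes.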
Thanks for the pointer (pun intended!) at collapsology!
This is a bit like saying “well the browser already has so many features: versions of HTML and CSS to support, Bluetooth, etc. – could adding more make it substantially worse?”
Yes, it could – there’s no upper bound on compile times, just like there’s no upper bound on “reckless” features.
That said, I only use Chrome as a backup browser, so meh
Rust compile times are really good now. It's not as fast as C, but a lot better than C++. At this point it's a non-issue for me.
(YMMV, depends on dependencies, etc etc)
Is there any evidence to support the claim that replacing C++ with Rust code would substantially slow down compile times? As someone who writes C++ code every day and also has done some Rust projects that see daily use at work, I really don’t see much of a difference in terms of compile time. It’s slow for both languages.
In the context of Gentoo, you now need to compile both clang and rustc, which is probably a substantial increase.
It appears to be about the same, perhaps slightly worse.
AFAIR clean builds in Rust and C++ have similar performance characteristics.
Builds would be way quicker if they just ported everything from C++ to C.
Memory safety problems would be way worse, but this thread seems unconcerned with that question.
Using only leaf dependencies makes sense. They can be built and tested in isolation, which is faster to build and more convenient to work on than the entirety of Chromium as a monolith.
Mozilla managed to build CSS processing (Stylo) and GPU-accelerated rendering (WebRender) as standalone libraries, so this approach could be used for substantial chunks of the engine if they wanted to.
I don't think this allows rewriting Chromium's CSS processing in Rust. Mozilla's CSS processing in Rust is not a leaf dependency: it necessarily depends on DOM code written in C++, and there are lots of callbacks from Rust to C++, which Chromium wants to avoid.
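To illustrate what a "leaf dependency" looks like in practice, here is a minimal hypothetical sketch (the function and its behavior are illustrative, not Chromium's or Servo's actual API): a Rust function exposed over a C ABI that C++ can call into, with no control flow re-entering C++:

```rust
// A hypothetical leaf library: C++ calls in, Rust never calls back out.

/// Parse a CSS-like pixel length such as "12.5px".
/// Returns -1.0 on malformed input. Illustrative only.
#[no_mangle]
pub extern "C" fn parse_px(bytes: *const u8, len: usize) -> f64 {
    // Safety: the caller guarantees `bytes` points to `len` valid bytes.
    let slice = unsafe { std::slice::from_raw_parts(bytes, len) };
    match std::str::from_utf8(slice) {
        Ok(s) => s
            .trim()
            .strip_suffix("px")
            .and_then(|n| n.trim().parse::<f64>().ok())
            .unwrap_or(-1.0),
        Err(_) => -1.0,
    }
}
```

On the C++ side this is just an `extern "C"` declaration. Because nothing here calls back into the host, the crate can be compiled and unit-tested entirely on its own — which is exactly the property Stylo lacks and a "leaf only" policy preserves.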