1. 49
  1. 15

    Love it or hate it this is the reason Canonical moved Chromium to a snap in Ubuntu.

    They want users to have up-to-date, secure browsers, and it became too difficult to build an up-to-date Chromium on old LTS releases. I remember hearing they were having to backport components of the C toolchain from newer releases to continue building it on 14.04.

    (note I don’t really like how apt install goes and installs the snap on Ubuntu, but I can appreciate the difficult situation)

    1. 9

      The distro philosophy of having as close to one version of each library installed and making that work with everything was never going to scale. It may have been somewhat workable in the early days, but by the time you get to software like multiple modern web browsers it was bound to fail. Snaps, Flatpaks, and other such approaches are much more sustainable in the long term, no matter how much distros might resist them. The current big-ball-of-mud approach of the distros is going to fail hard.

      1. 2

        I don’t know about this. As far as I can tell it’s working fine in Arch. The most viable way for a system to work at scale just might be to keep rolling forward and fix things when they break. That’s not what you want for systems that don’t consistently have human users interacting with them, but systems that have web browsers installed will have human users.

        1. 3

          Compared to Debian, Arch is relatively new. They also give me the impression that they are perfectly fine with jettisoning software that can’t keep up. (I don’t use it, so this may be a false impression.) Most distros don’t take that stance, though, and they end up with a giant legacy ball of mud that is essentially impossible to maintain over time. So I’m talking about distros like Debian, Red Hat, Slackware, and others.

          1. 2

            I’d say the biggest difference between Debian stable and Arch isn’t age but the nature of their release cycles. Arch doesn’t do releases, and thus has nothing really comparable to Debian stable. A much better comparison would be Arch to Debian unstable, which are both effectively rolling releases.

            I don’t believe any of the issues with the browsers mentioned would apply to Debian unstable (although you would have other occasional breakage from package updates).

            1. 1

              Compared to Debian Arch is relatively new

              I love calling an almost-20-year-old distro new, which I guess compared to Debian is true!

            2. 3

              Arch only supports x86_64. Debian supports nine architecture ports. That’s a big source of bugs to fix.

        2. 15

          It is sad. However, speaking from a security perspective, I believe something as complex and as internet-connected as a browser cannot be merely “maintained”. You need active development, and that requires a ton of ongoing changes. Otherwise, your browser will become non-functional and insecure very quickly.

          I’m definitely not a Linux expert and won’t dare make any of those statements about a Linux distribution that promises longevity and stability, but I’d discourage anyone from trying to do this to a project like a browser.

          imho the most reasonable position for these distributions is to offer installation of a bundled, system-independent version (Flatpak? snap?) or none at all. In the case of free-software and build-it-yourself policies, I think that leaves you with the latter: nothing at all. :-/

          What I personally do: install Firefox Nightly in ~/opt/ and let it update itself. But I’m a Mozilla developer and probably a corner-case so take this with a grain of salt :)
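
          For anyone curious what the ~/opt approach looks like in practice: unpack the tarball from mozilla.org into ~/opt and put a symlink somewhere on your PATH. A sketch of the steps, with paths that are illustrative only. To stay runnable offline it fabricates a stand-in tarball; in real use you would download the official one (a .tar.bz2 or .tar.xz) from mozilla.org instead:

          ```shell
          # Illustrative sketch of a per-user browser install under ~/opt.
          # A stand-in tarball is fabricated here so the steps run offline;
          # normally you would fetch the real one from mozilla.org.
          set -eu
          workdir="$(mktemp -d)"
          mkdir -p "$workdir/firefox"
          printf '#!/bin/sh\necho firefox-placeholder\n' > "$workdir/firefox/firefox"
          chmod +x "$workdir/firefox/firefox"
          tar -C "$workdir" -czf "$workdir/firefox.tar.gz" firefox

          # The actual install: unpack into ~/opt, symlink into a dir on PATH.
          optdir="$workdir/opt"      # stand-in for ~/opt
          bindir="$workdir/bin"      # stand-in for ~/bin, assumed to be on PATH
          mkdir -p "$optdir" "$bindir"
          tar -C "$optdir" -xzf "$workdir/firefox.tar.gz"
          ln -sf "$optdir/firefox/firefox" "$bindir/firefox"
          "$bindir/firefox"          # the browser then keeps itself updated in place
          ```

          Since everything lives under your home directory, the browser’s built-in updater can write to its own install and the distro’s package manager never gets involved.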

          1. 4

            The problem with this view (and I don’t disagree, for the record) is that for every subset (of one or more people), some package is the most important thing to keep up to date, while 95% of the rest of the system can be stable/stale/old/whatever.

            The problem is that everyone has a different set of these special packages. I personally used GNU stow for a while for this, and then nix-packages (on Debian), or in the case of the 2 browsers the official repos.

            But it still sucks. I like stable OS packages but I’d like to pin a few to latest. (And no, apt’s pinning is mostly not what I want, as it pulls in too many OS-level dependencies).
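
            For the record, the GNU stow approach mentioned above boils down to a per-package symlink farm: each package (or version) gets its own directory under a stow dir, and stow links its files into a target like ~/.local. A minimal sketch using plain symlinks, which is essentially what stow does (plus conflict detection); the hello-2.12 package name and all paths are just illustrations:

            ```shell
            # Simulate the GNU stow layout with plain symlinks. Paths and the
            # package name are illustrative, not a real stow invocation.
            set -eu
            home="$(mktemp -d)"                  # stand-in for $HOME
            stowdir="$home/stow"                 # one directory per package/version
            target="$home/.local/bin"
            mkdir -p "$stowdir/hello-2.12/bin" "$target"
            printf '#!/bin/sh\necho hello 2.12\n' > "$stowdir/hello-2.12/bin/hello"
            chmod +x "$stowdir/hello-2.12/bin/hello"

            # Equivalent of: stow -d "$stowdir" -t "$home/.local" hello-2.12
            ln -sf "$stowdir/hello-2.12/bin/hello" "$target/hello"
            "$target/hello"

            # Upgrading one "special" package = swapping its symlinks to point at
            # a new version directory; distro-managed packages are left alone.
            ```

            The appeal is exactly the pinning problem above: you keep the stable OS packages and hand-manage only the few you need fresh, without pulling in OS-level dependencies.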

          2. 14


            Normally you would expect someone to backport patches and fixes; but web browser codebases are massive and ugly, so I suspect that’s a really hard job for volunteers. They would possibly have to invent their own fixes too, as upstream might have replaced whole systems within the codebase when fixing the bugs.

            Options I can see:

            • Suddenly summon a vast amount of manpower to backport these patches for the Debian build
            • Convince the bulldozer browser vendors that they should do this work (hah!)
            • Remove the browser packages. Then you’re left with a distro that people will complain about (trading security for social issues). This probably also breaks the “stable” idea.
            • Add a giant warning popup before the browser launches saying that it’s completely insecure and giving the users an option to abort launching it. It’s probably very wise to add a paragraph about why you are doing this (cultures of stable versus rolling browser releases, cost of man hours backporting packages) and another paragraph describing actual practical options to work around this problem (eg moving to Deb testing?).
            • Shoehorn in an isolated, updated system in a box (e.g. AppImage, etc.). It “can” work, but it can also cause a thousand other technical issues (new bugs, especially regarding video drivers and Mesa, let alone potential security ones), and it’s probably not as easy as people think. Remember that browsers are essentially a complete operating system of their own, with things like hardware-accelerated video decoding that need to cross the divide to your drivers.

            Any other options?
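
            The warning-popup option could be as small as a launcher wrapper. A terminal-only sketch (a real implementation would presumably use a GUI dialog, and the path to the real binary is a hypothetical, not how Debian actually packages anything):

            ```shell
            # Hypothetical launcher wrapper: warn that the browser is unpatched
            # and let the user abort. Written as a function for easy testing; a
            # package would install something like it as the browser entry point.
            launch_with_warning() {
                real_browser="$1"; shift
                echo 'WARNING: this browser build no longer receives upstream security fixes.' >&2
                echo 'Using it on untrusted sites is unsafe. Consider Debian testing, or a'   >&2
                echo 'vendor-provided build (flatpak, snap, or an upstream tarball).'         >&2
                printf 'Launch anyway? [y/N] ' >&2
                read -r answer || answer=n
                case "$answer" in
                    [yY]*) "$real_browser" "$@" ;;
                    *)     echo 'Aborted.' >&2; return 1 ;;
                esac
            }

            # Example: answering "n" aborts instead of starting the browser.
            printf 'n\n' | launch_with_warning /bin/true || echo 'browser not launched' >&2
            ```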

            1. 14

              When I used Debian I just used the Google Chrome deb repo. I used Debian testing, which is what Google tracks internally, so Chrome is guaranteed to work. That is, if Chrome were broken on Debian testing, it would be broken for Google developers. And the Google developer workflow heavily relies on web-based tooling. That’s as close to a “blessed” configuration you can get for web browsers on Linux as far as I know.

              1. 12

                but then you’re introducing an untrusted binary package into the system (untrusted in that it was built by a 3rd party, not from source on debian-owned servers, etc)

                1. 24

                  Yeah, but most people don’t care about that and just want their computers to work. Even as a relatively security-conscious SRE, that includes me.

                  On the list of “people likely to distribute malware-infected binaries,” Google is pretty far down. Unless Chrome falls under your personal definition of malware I suppose.

                  1. 16

                    Yeah I consider Chrome to be malware, but that’s beside the point.

                    1. 8

                      Very much so. It’s amazing how much the goalposts of “malware” have shifted.

                      Chrome is spyware. Having a EULA or opt-in was never a reason for spyware not to be flagged by AV tools in the past (at best it might get them flagged as “PUPs” instead of “spyware”). If Chrome had come out from a small company in the 2000s, it would get flagged.

                      No one dares mark Chrome as malware. You cannot upset such a large company or such a large install base without expecting users to think you are the one at fault. We are not malware, we are an industry leader, you must be mistaken, sir :)

                      It seems that you can, indirectly, buy your way out of being considered malware simply by being a big player.

                      1. 4

                        …from a small company in the 2000’s then it would get flagged.

                        I get your point, but c’mon… Stuff got flagged back then because it interrupted what the user was trying to do. If you don’t launch Chrome, you don’t see it, and it doesn’t attempt to interact with you. That’s what most users care about, that’s what most users consider to be malware, and, as far as I recall, that’s (largely) what got apps flagged as malware in the 2000s.

                        1. 2

                          Chrome is like Internet Explorer with all those nasty toolbars installed, except the toolbars are hidden by default ¯\_(ツ)_/¯.

                    2. 2

                      That’s a silly distinction. If you use Chrome, then you’re already executing tons of arbitrary code from Google. In practice, whether you get Chrome from Debian or Google, you still have no choice but trust Google.

                    3. 1

                      Same here. Even as a long-term Debian user (20+ years), this is just the only way for me, for both my private and my regular workstation.

                    4. 12

                      Remove the browser packages.

                      I’d go with that. Well, leave netsurf in so there’s at least a modicum of web browsing functionality OOTB. Motivated users can download Firefox themselves and the world won’t end; that’s what they have to do on Windows and macOS. But trying to keep up with the merry-go-round is like trying to boil the ocean. Volunteer effort can then be spent on areas where the investment will pay off.

                      1. 1

                        In previous Debian releases they had a section in the release notes about how the version of WebKit they shipped was known to be behind on security patches, and that it was only included so that you could use it to view trusted sources like your own HTML files. They were very specific about the fact that only Firefox and Chromium were safe to use with untrusted content.

                        But I only found out about it by a friend telling me about it in chat. I have my doubts that this could be communicated effectively.

                      2. 9

                        Normally you would expect someone to backport patches and fixes; but web browser codebases are massive and ugly, so I suspect that’s a really hard job for volunteers. They would possibly have to invent their own fixes too, as upstream might have replaced whole systems within the codebase when fixing the bugs.

                        The article allows us an interesting glimpse into just how hard this is, and it’s not just because of the web browsers:

                        Debian’s official web browser is Mozilla Firefox (the ESR version). The last update of Firefox ESR in Debian stable has been version 78.15.0. This version also has quite a few unpatched security issues and the 78.x ESR branch is not maintained by Mozilla anymore. They need to update to the 91.x ESR branch, which apparently causes big problems in the current stable Debian platform. In an issue, people complain about freezing browser sessions with the 91.x release, which blocks the new Firefox ESR release from being pushed to “stable-security”. Somebody in the issue claims the reason: “Firefox-ESR 91.3 doesn’t use OpenGL GLX anymore. Instead it uses EGL by default. EGL requires at least mesa version 21.x. Debian stable (bullseye) ships with mesa version 20.3.5.”

                        “So just update mesa” doesn’t sound like the kind of thing you could do over just a couple of days, seeing how many packages depend on it. Assuming that even fixes the Firefox end of things, I’m not sure I want to think about how many things could break with that update, not before I’ve had my second coffee of the day in any case. Just testing the usual “I updated mesa and now it crashes/looks funny” suspects – Gnome, Plasma, a bunch of games – takes weeks. It’s something you can do in testing but it takes a while.

                        Large commercial vendors are hitting release management problems like these, too; this is actually part of the reason why you see so many Linux gadgets unironically using tech stacks from three years ago. It’s worse for Debian because they’re trying to build a general-purpose system out of parts that are increasingly made for special-purpose systems, which you can either freeze forever (embedded devices) or overwork DevOps teams into PTSD and oblivion in order to keep running (cloud apps).

                        1. 7
                          • Realize that their current model of Debian slow and “stable” will no longer work in 2021 (and beyond) and change it

                          Not saying Debian should drop stable releases and become a rolling release, but perhaps there’s some slightly more rapid cadence they could adopt for releases? Like, is the issue highlighted in the article also a problem with openSUSE and Red Hat?

                          1. 4

                            “Stable” means different things to different distros.

                            To Debian, “Stable” means that bugs will be patched, but features and usage will not. This does not fit with Mozilla and Google’s monthly release cadence; all changes need to be checked over by skilled devs.

                            SuSE just builds whatever Mozilla hands them, as far as I can tell.

                            1. 2

                              For Firefox (and some other packages, iirc) Debian has already given up on that. They would package the latest Firefox ESR even if it introduced new features (and it would, of course). The issue is that even that is an insurmountable amount of work. The latest ESR needs much newer LLVM and Rust toolchain versions than the last one. And Debian also wants to build all packages for a given release with other packages in that release, so that means updating all of that stack too.

                              1. 2

                                This is why I don’t really see the point in LTS Linux distros. By a couple of years into their lifetime, the only thing that you’re getting from the stability is needing to install most things that you actually want from a separate repo. If ‘stable’ means ‘does not get security fixes’ then it’s worse than useless. A company like Red Hat might have the resources to do security backports for a large set of packages but even they don’t have that ability for everything in their package repos.

                                It works a bit better in the BSD world, where there’s a strict distinction between the base system and third-party packages, so the base system can provide ABI stability over a multi-year period within a single release but other things can be upgraded. The down side of this is that the stability applies only to the base system. This is great if you’re building an appliance but for anything else you’re likely to have dependencies from the packages that are outside of the base system.

                                1. 1

                                  The Debian stable approach works really well for servers. It works moderately well for desktops, with the very notable exception of web browsers – which are, without a doubt, the most used, most exposed, most insanely complicated subsystem on any desktop, so much so that Google’s ChromeOS is a tiny set of Linux vital organs supporting Chrome.

                                  Even so, Debian is working on this and within a few weeks, I think, there will be new packages for stable and oldstable and even LTS.

                                  1. 1

                                    I used to think that the “stability” was fine for servers, but in practice it meant that every couple of years I was totally screwed when I had to urgently fix a small thing, and it couldn’t be done without a major upgrade of the whole OS that upset everything. It also encourages having “snowflake” servers, which is problematic on its own.

                                    I feel like the total amount of problems and hassle is the same whether you use a rolling release or snapshots, but the snapshot approach forces you to deal with all of them at once. You can’t never upgrade, and software is going to evolve whether you like it or not, so the only choice you have is whether you deal with upgrade problems one by one or all at once.

                              2. 2

                                The Debian release cadence is about 2 years, and has been for 16 years. How much faster would work? What’s Firefox ESR’s cadence? The best I could find from Mozilla was “on average 42 weeks” but I’m not sure that’s quite the right thing. ESR 78 only came out in September this year and is already unsupported. The latest ESR has very different toolchain requirements to build. It’s a confusing picture.

                              3. 1

                                Update mesa, then update Firefox? Fighting upstream like that is a losing battle.

                                1. 1

                                  Agreed, but updating Mesa is easier said than done.

                              4. 8

                                I’m running Slackware which has the same problem to a more extreme degree: the stable version of Slackware is on Firefox ESR 68, because moving to a newer version would require updating things like Rust and gcc for the system to be able to build itself, but that’s a breaking change. So I’m typing this from a mozilla.org build and hoping that Slackware 15 arrives soon.

                                This partly happens because the stable version of Slackware is 5 years old at this point, but it seems mildly concerning to me that 5 years is beyond the length of time that modern software is capable of being supported. Many vendors are now defining “long term” as two or three years - faster than what was previously considered bleeding edge. Mozilla’s ESR page says it’s supported for “more than a year”, but when the OS requires lots of unrelated changes to upgrade, a year is not a long time at all.

                                1. 6

                                  …but it seems mildly concerning to me that 5 years is beyond the length of time that modern software is capable of being supported.

                                  Web browsers are a bit of a special case, though. As mentioned elsewhere in this discussion, they are basically operating systems on their own, with hardware acceleration, etc. They also happen to be one of the most security-sensitive applications most people have installed. This combination of factors makes it unsurprising, to me at least, that they version so fast and are so hard to maintain.

                                  1. 3

                                    they are basically operating systems on their own

                                    Right, that’s why the binaries can run so widely. The browser bundles everything and depends on the host minimally. Firefox 93 runs on Windows 7; even the precompiled Linux binaries only claim to need packages from 2014.

                                    The issue (at least for Slackware) is the moment a distribution wants to be able to distribute source and allow users to compile things for themselves, the developer dependencies need to be considered in addition to runtime dependencies. The developer dependencies for Mozilla have become bleeding edge - it needs a very recent version of Rust, for example, except there’s no guarantee that a new version of Rust is compatible with old code, so distributions can’t support Mozilla source without upgrading but can’t support their users if they do upgrade. For a while Slackware had twin sets of packages - one version of llvm for users, another (in /extra) specifically to compile Firefox.

                                    They also happen to be one of the most security-sensitive applications most people have installed.

                                    That may be true, but how modern is the toolchain needed to build openssl, openssh, apache, or the kernel? I realize others might disagree with me on this, but there seems to be a philosophical difference where the browsers are betting hard on new development tools in a way that other projects just don’t, and it’s not clear to me that it’s necessary. It sure makes it difficult for anyone wanting to contribute a patch, since the development environment is onerous and constantly changing.

                                  2. 4

                                    Mozilla’s ESR page says it’s supported for “more than a year”

                                    And only slightly more than a year at that. ESR 78 was first released on June 30, 2020, and the last release was October 5, 2021, for a 15-month span. Before that, ESR 68 was first released on July 9, 2019, and the last release was August 25, 2020, or about 13½ months.

                                  3. 4

                                    My read on this is that Debian Stable is a dangerous choice for a system that browses the internet now. (It’s probably only ever been a good choice for that in the months immediately following a release.)

                                    I wish they’d just remove the browsers from the stable repositories (maybe leave dillo or netsurf for reading local docs) if it’s not possible to keep even a patched-up ESR. Having a browser that looks capable/supported but isn’t seems like a recipe for trouble.

                                    1. 4

                                      I’ve been using the Firefox from the Debian experimental repo (currently 94.0.2) and things have been working rather smoothly for me. Doing this just requires adding deb http://deb.debian.org/debian/ experimental main to your sources.list and

                                      # Allow upgrading only firefox from the experimental archive
                                      Package: firefox
                                      Pin: release a=experimental
                                      Pin-Priority: 500

                                      to your preferences directory.

                                      1. 1

                                        What version of Deb are you on? Any weirdness with page rendering, video or audio?

                                        1. 2

                                          Firefox already can’t play audio on my system because of pulseaudio, but I’m worried about the impact of the mesa issues.

                                          edit: Apparently I switched to a manual install of Firefox a while back and forgot I had done so. I haven’t seen any graphical glitches in Firefox, but due to the bug of Firefox not supporting Alsa, I don’t watch any videos in Firefox.

                                          1. 1

                                            Hmm. I don’t use Pulse, so I also suffer audio problems in FF, but interestingly video audio and embedded audio tags still work fine. It’s only JavaScript-created audio that fails.

                                            N.B. also check out the apulse project; it’s a compat layer that sometimes works for me. It doesn’t seem to survive suspend/resume on my laptop, last I checked, sadly.

                                            1. 1

                                              Excuse my ignorance, but this can’t be just down to Pulse, can it? I am using Linux Mint, which comes with PulseAudio, and have never had any sound issues in Firefox. I imagine there is more to this story.

                                              1. 1

                                                Sure, I’d imagine there exist sound cards for which PulseAudio is capable of playing sound reliably. Unfortunately, neither of my computers is equipped with one.

                                                edit: to clarify, I removed Pulse from my system because it was so unreliable; now I use ALSA, which works perfectly with literally every program I use … except Firefox, which deleted its ALSA support a few years ago, probably on account of it being too stable or something.

                                            2. 1

                                              I use Sid and Firefox’s Wayland backend with Sway. I don’t have any audio or video rendering issues (but as always with Linux users, it might just be that I got used to them and don’t see them anymore…).

                                          2. 3

                                            This is not ideal, but just get Firefox from Mozilla and run with that. It will auto-update or can be set to make the user aware of an update. Then don’t use APT for a browser until the situation changes.

                                            1. 3


                                              1. 2

                                                Pretty much that: flatpak, appimage, or snap. If distros are moving too slowly to update the browsers and don’t have a maintenance team, something’s gotta give. And I don’t believe they’ll agree to a “let’s not ship a browser” solution.

                                                1. 2

                                                  Ok, to clarify, I think flatpak is superior to appimage or snap. AppImage doesn’t have an update mechanism or repos. Snap is just Ubuntu wanting to do something different instead of working with flatpak.

                                              2. 1

                                                I’m currently running an OS-managed Firefox 95.0 on Linux Mint. Mint doesn’t use snaps by default, so I’m not sure how they manage this, but I’m very thankful that they do.