I started using Debian stable on all my desktop machines when Jessie became stable and never looked back. I used the Firefox ESR in stretch until recently, when I manually upgraded to FF 57. Besides that, Docker, Signal, Riot and Keybase are the things on my machines which are not in the Debian repositories, but that would not really improve with testing or unstable. If I need more up-to-date stuff for development it’s mostly in containers or Python virtualenvs these days anyway, and I love that my desktops aren’t moving targets anymore.
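(For the curious, the virtualenv side of that is nothing fancy; assuming the python3-venv package is installed, a throwaway environment like the one below is usually enough to get a newer tool than stretch ships. The tool name is just an example.)

    python3 -m venv ~/venvs/dev              # isolated environment, leaves the system Python alone
    ~/venvs/dev/bin/pip install -U pip       # newer pip inside the venv only
    ~/venvs/dev/bin/pip install ansible      # e.g. a more recent version than the Debian package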
I used to pull in Nix in order to reliably install any software on my system where the version in Debian Stable was too old. But nowadays the only software I don’t get from Debian is the software that I need to build from source because I’m actively contributing to it.
The only exceptions are when I needed Docker for work and when I manually upgraded Firefox to version 57 out of curiosity about how it would break my extensions.
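(When I still did that, the Nix side was just the standard single-user setup; something like the following pulled a current package from nixpkgs into ~/.nix-profile without touching anything apt manages. The package name is only an example.)

    nix-env -iA nixpkgs.ripgrep    # install a recent ripgrep from the nixpkgs channel
    nix-env -u                     # later: upgrade the Nix-installed packages, independently of apt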
I use Ubuntu LTS for the same reasons. I do get the latest Firefox automatically.
Do you see any advantage in Debian stable over Ubuntu LTS?
For me the advantage is of a political/social nature. Ubuntu is, in the end, a product at the mercy of Canonical, just like Fedora is a product of Red Hat. If you trust them, they have the advantage of being able to pay many more people than those who are paid (by third parties) to work on Debian. But I don’t, at least not in the long run: I don’t know for how long my interests as a user will align with their business interests, and things like the ‘suggestions’ for Amazon products in Unity or the advertisements for proprietary software in the MOTD on Ubuntu servers make me sceptical. I still trust Debian’s governance much more, even after the sub-optimal handling of the systemd debates. Ah, and I am already using Debian on almost all servers, privately and at work ;)
Stability is often undervalued, especially when it comes to Desktop computers.
I think stability is overvalued. I use FreeBSD -CURRENT, LineageOS snapshots, Firefox Nightly, LibreOffice beta, Weston master… and nothing is “buggy as hell”. Seems like developers just don’t write that many bugs these days :)
In that case you must have been lucky. I remember using Arch some years ago, and then it “suddenly” gave up on me. Part of the reason was, of course, contradictions between configuration files; some of them were my own fault, but others were due to updates. And I really liked updating Arch every day, inspecting what was new, what was better. It’s good for a while, in my experience, but if you don’t know what you’re doing, are too lazy to be minimalist or just don’t have the time and energy to properly make sure everything is OK, it breaks. And it only gets worse: the more edge cases you have, the more non-standard setups you need and the more esoteric (or just unsupported/unpopular) hardware you have.
I have similar experiences with Fedora and Debian unstable/testing, albeit to a lesser degree. Debian stable, with a few packages from testing, was a real “shock” in comparison, and while it was “boring”, it was, at least for me, less stressful and removed a certain tension. I would certainly have learned less about fixing broken X11 setups or configuring init systems if I had chosen it from the beginning, but eventually it is nice to be able to relax.
I agree. My Linux desktop history from 1994 onwards was roughly Slackware -> Debian -> Ubuntu -> CentOS -> Debian -> Fedora -> Arch. I didn’t find more cutting-edge distributions such as Fedora or a rolling distribution such as Arch to be less stable than the conservative (Slackware) and stable distributions (Ubuntu LTS, CentOS, Debian).
Moreover, I found that Fedora and Arch receive fixes for bugs far more quickly. When you report bugs upstream, the fixes typically land in Fedora/Arch within a few days or weeks, while in some conservative distributions it can take months or years. Besides that, the hardware support is generally better. E.g., the amdgpu driver works much better on my AMD FirePro than the older radeon driver, but it might literally take years for amdgpu support for Southern/Sea Islands cards to land in stable distributions.
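(If anyone wants to try it on such a card in the meantime: on kernels built with the experimental SI/CIK support you can usually opt in to amdgpu via kernel parameters, roughly like the line below in /etc/default/grub followed by update-grub. Exact flags depend on the GPU generation, so treat this as a sketch rather than a recipe.)

    # Southern Islands; for Sea Islands use radeon.cik_support=0 amdgpu.cik_support=1 instead
    GRUB_CMDLINE_LINUX_DEFAULT="quiet radeon.si_support=0 amdgpu.si_support=1"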
I started using Debian Stable for my desktop after I unexpectedly had Arch fail to boot to X* (again) right as I was struggling to hit a major paper deadline.
Previously, I’d switched from Ubuntu to Arch because it let me keep up-to-date packages without the headache of Ubuntu’s dist-upgrade (and incredibly premature use of things like pulseaudio and Unity). It worked 99% of the time, but that 1% nearly fucked me over in a big way.
I’ve been running Debian Stable for two years now and have yet to ever have it fail to boot to X. During paper deadlines, this is wonderful because if I happen to need to update a library in order to make someone’s code compile, I can just do it and be confident that it won’t cost hours of time getting my system to boot up again.
(* When Arch broke, it was because I had to update a library (libigraph if memory serves), which in turn necessitated updating libc, which cascaded into updates everywhere and then lo-and-behold the system couldn’t fully boot until I tracked down a change in how systemd user units worked post-update.)
If you’re using backports on top of stable then you’re effectively using a less-popular, less-well-tested variant of testing.
In theory regular stable releases make sense for a distribution that extensively patches and integrates the software it distributes. But given that Debian’s policy and practices predictably lead to major security vulnerabilities like their SSH key vulnerability, I figure such patching and integrating is worse than useless, and prefer distributions that ship “vanilla” upstream software as far as possible. Such distributions have much less need for a slow stable release cadence like Debian’s, because there’s far less modification and integration to be doing.
> a less-popular, less-well-tested variant of testing.
Not at all. Going to testing means moving everything to testing. Moving Linux, moving gcc, moving libc. Stable + backports means almost everything is on stable except the things you explicitly move to backports. My current package distribution is:
stretch: 5323
stretch-backports: 7
The 7 packages I have from backports are: ldc, liboctave-dev, liboctave4, libphobos2-ldc-dev, libphobos2-ldc72, octave, and octave-common. Just Octave and the LDC compiler for D. You could hardly call them important system packages.
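(For reference, this is just the stock backports recipe, something like the following, with the mirror and packages only as examples; backports stays inert unless you explicitly ask for it with -t.)

    # /etc/apt/sources.list.d/backports.list
    deb http://deb.debian.org/debian stretch-backports main

    apt-get update
    apt-get -t stretch-backports install octave ldc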
It’s worth remembering that the purpose of the computer is to run user programs, not to run the OS. I’d suggest that the programs a user enables backports for are likely to be those programs the user cares most about - precisely the most important packages.
JordiGH (GNU Octave maintainer):
I am running stable because I don’t want distracting glitches alongside the things I actually care about. I have the energy to chase after D or Octave bugs (after all, it’s kind of what I do), so I do want newer versions of those. I don’t want to be chasing after Gnome or graphics driver bugs. Those system things get frozen so I can focus on the things I have the energy for.
As a maintainer you’re in a rather unusual position; you’re, in a sense, running Octave for the sake of running Octave. Whereas most people with Octave installed are probably using Octave to do something, in which case Octave bugs would be a serious issue for them, probably more so than bugs in Gnome or graphics drivers.
Could you elaborate on that? How do those policies and practices do so predictably? And what would preferable alternatives look like in your opinion?
Making changes to security-critical code without having them audited specifically from a security perspective will predictably result in security vulnerabilities. Preferable alternatives would be either to have a dedicated, qualified security team review all Debian changes to security-critical code, or to exempt security-critical code from Debian’s policy of aggressively patching upstream code to comply with Debian policy. Tools like OpenSSH do by and large receive adequate security review, but those researchers and security professionals work with the “vanilla” source from upstream; no-one qualified is reviewing the Debian-patched version of OpenSSH, and that’s still true even after one of the biggest security vulnerabilities in software history.