I have watched the videos behind this text and I’m a bit frustrated. Most of the problems they have are either hardware problems or problems that arise because they expect things to work like they do on Windows (or believe they work that way on Windows).
On the hardware side they somehow acknowledge that this is more a problem of the vendors than of Linux. Still, most of the time it sounds more like Linux is bad because their super fancy hardware doesn’t work. Yes, I know the problems behind this are complex, and as a normal user this is frustrating.
And of course they expect Windows-like behavior; they have used Windows for years. What bugs me is that they claim the Windows way is the better way without understanding what the problem is. There are two examples of this:
First, Linus broke his Pop!_OS installation while trying to install Steam. This was because the Steam package had a dependency problem which could only be resolved by removing essential packages. The GUI told him there was an error, with some suggestions as to what might cause the problem, and the output from apt hidden behind a details button. He read out loud: “Warning: you are trying to remove the following essential packages”. So he googled and found the command line to install Steam. The command prompted him with a lot of text and, at the end, the following two lines:
You are about to do something potentially harmful
To continue type in the phrase ‘Yes, do as I say!’
So he typed in “Yes, do as I say!” and his installation was broken. He claimed later: “the things that I did are not entirely ridiculous or unreasonable”. He ignored all the warnings and dictated “Yes, do as I say!” to the computer. How is this not a clear user error[0]?
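For context, apt’s guard against this looks roughly like the following (abridged from memory; the package list in the video differed, so treat the names here as illustrative):

    $ sudo apt install steam
    ...
    The following packages will be REMOVED:
      gdm3 gnome-shell pop-desktop xserver-xorg ...
    WARNING: The following essential packages will be removed.
    This should NOT be done unless you know exactly what you are doing!
      pop-desktop gdm3 (due to pop-desktop) ...
    You are about to do something potentially harmful.
    To continue type in the phrase 'Yes, do as I say!'
     ?]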
So let’s look at what would have happened with a similar issue under Windows. First, we can’t get exactly the same issue, because under Windows there is no package manager accessible to third-party software. So let’s assume there is a Windows update which removes the wrong file: on install, the update would remove the wrong file and break your system. Another example: the Steam installer manages to have a bug which removes some necessary files from your Windows installation. Is there anything in Windows that protects you from this bug[1]?
It’s late; the other stuff, about the file extension issue, I might write tomorrow.
[0] Of course, this also has something to do with some developers’ enthusiasm for creating popups/warnings/info messages, which leads users to ignore such messages.
[1] I don’t know, but a few years ago Windows installers were just executables which required running as administrator.
And of course they expect Windows-like behavior; they have used Windows for years
I think the “Windows-like behaviour” in this case is that on Windows Steam works perfectly: you don’t have to think about installing it, there’s no chance it’s going to break your OS, nor will you have to choose between installing an application you want and having a working OS.
We could imagine a hypothetical Steam bug that somehow wrecks Windows installations, but in reality those don’t exist.
I think those kinds of comparisons don’t work very well because of the range of options. For the Steam installation issue, on Windows you basically have two options: you install it and it works or it doesn’t. In Linux you have the same two options + playing around with various tweaks and installation methods.
If we were going with a typical-user Windows-like approach, he’d declare it a failure after Steam failed to install from the default source. Going further with other solutions is both a good thing, because it’s possible, and a bad thing, because newbies get into situations like a broken desktop. So once you start going past the basics, it’s really on the user to understand what they’re doing. Otherwise it’s comparable to “I messed with some Windows DLLs / registry trying to get Steam to work, despite warnings, and now it doesn’t boot” - but that’s just not something average users do.
on Windows you basically have two options: you install it and it works or it doesn’t
On Windows you install Steam and it works. Installing Steam and it not working isn’t really an experience people have with Steam on Windows.
In Linux you have the same two options + playing around with various tweaks and installation methods.
I guess? But the Linux (Pop!_OS?) equivalent of “I messed with some Windows DLLs / registry trying to get Steam to work, despite warnings, and now it doesn’t boot” is [0] kind of the only experience that was available? It seems like there was no way to install it and have it work, or even install it and have it just not work. The only way to install it broke the OS?
[0] Disclaimer: I didn’t watch the videos, so I’m going off my understanding of the comment I originally replied to
Installing Steam and it not working isn’t really an experience people have with Steam on Windows.
Not just that, but you actually do have a lot of tweaks to play around with. They’re not common knowledge because it’s incredibly rare to need them in order to get something like Steam working. You don’t really need them unless you’re developing software for Windows.
I had this “it’s a black box” impression for a long time, but 10+ years ago I worked in a Windows-only shop that did a lot of malware analysis and the like. It’s quite foreign, since it comes from a different heritage, but the array of “power tools” you have on Windows is comparable to that of Linux. The fact that typical users don’t need them as frequently is a good thing, not an evil conspiracy of closed-source vendors to make sure you don’t have control over your hardware.
Installing Steam and it not working isn’t really an experience people have with Steam on Windows.
That’s a bit hard to quantify, but sure they do. Just search for “Steam won’t start” or “steam installer fails” on Reddit or their forums. It’s also common enough for many SEO-spam sites to have listicles for that phrase that are actually steam-specific.
And my point was that this wasn’t the only experience available. The alternative was not to type “yes I’m sure I know what I’m doing” (or whatever the phrase was) when he did not. He went out of his way to break the system after the GUI installer refused to do it. I think you really should watch that fragment for the discussion context.
Of course, with a simple installer (copy all files to a directory and add an entry to the Windows registry) it’s quite hard to have a bug which breaks your OS. But a simple installer doesn’t have the features of a package management system, e.g. a central update mechanism. I don’t want to say package managers are better than the installer approach used on Windows[0]. The problem I have with this case: it’s not that he clicked some random button and then everything was broken. He read the error, ignored all the warnings, typed the prompt in char by char, and then wondered why it went wrong.
I don’t say the UI[1] is perfect. The problem I have is this “I ignore all warnings and complain if it goes wrong” mentality[2]. apt is not a program which bugs you with unnecessary questions or warnings. Installing a package only asks for confirmation if it does more than just install the requested package. The annoying confirmation question is only there if you try to remove essential packages, and it is designed to create enough friction to make the user question the command.
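To make the contrast concrete: the everyday confirmation is a single Y/n prompt, roughly like this (package names illustrative):

    $ sudo apt install somepackage
    The following additional packages will be installed:
      libsomething1
    Do you want to continue? [Y/n]

The “Yes, do as I say!” phrase only ever appears in the essential-removal case.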
[0] I think systems with a package manager are better, but that is not the point of this comment
[1] The error message in the GUI and the handling in the command line
[2] Yes, some (or most) users don’t want to understand error messages, but shouldn’t they then stop at the error and look for (professional) help? And no, copy-pasting a command from a random blog post is not help if you don’t understand the error or the blog post.
The entire point of Linus’ challenge is that desktop Linux is full of barriers and traps for new users who don’t (yet) know what they’re doing.
Explaining “well, it’s like that because you told it to florb the waggis instead of confeling rolizins, so it’s all your fault” may very well be technically correct, but it doesn’t change the fact that the OS hasn’t worked well for the user. “I want to install Steam in 5 minutes without learning about package sudoku solvers, or bricking my computer” is an entirely reasonable use-case.
The web dev community had a reckoning with this, and thinking has changed from “users are too stupid to understand my precious site” to “all my new users know only other sites, so I must meet their expectations”. If Linux wants to get new users, it needs to be prepared for users who know only Windows, macOS, or even just Android/iOS.
Explaining “well, it’s like that because you told it to florb the waggis instead of confeling rolizins, so it’s all your fault” may very well be technically correct, but it doesn’t change the fact that the OS hasn’t worked well for the user. “I want to install Steam in 5 minutes without learning about package sudoku solvers, or bricking my computer” is an entirely reasonable use-case.
That’s all well and good, but there is a perfectly good fast path for this: install Pop!_OS or Ubuntu on a day when there’s not a bug in the packaging system, which is the vast majority of all days. Yep, it sucks that there was a bug, but that’s simply not going to affect anyone going forward - so why are LTT giving advice based on it?
For every distro D there exists a problem P that is solved in a distro E.
That endless cycle of “then abandon your whole OS and install a completely new one” thing is another annoying problem “Linux desktop” has. It’s not any single distro’s fault, but it’s a pain that users need to deal with.
In my case: I want to use Elementary, but I hosed it trying to update Nvidia drivers. So I was told to switch to Pop!_OS — they do it right. But this one gets stuck seemingly forever when trying to partition my disk, presumably because of the combination of NVMe and SATA that I have. Manjaro worked with my disks, but I’ve run into bugs in its window manager, which wouldn’t be an issue in Elementary. I still use macOS.
For every distro D there exists a problem P that is solved in a distro E.
Right, I agree that in general this is a problem; we need better ways to integrate the best ideas from multiple projects.
But for the problem stated, which was “I want to install Steam in 5 minutes without learning about package sudoku solvers, or bricking my computer”, Pop!_OS or Ubuntu are the way to go. Your problem is not that; it’s “I want Pantheon and a fast Nvidia card,” and Nvidia have intentionally made that harder than it needs to be.
To be totally clear, I’m under no illusions that every user can simply pick up a free desktop and be on their way, but I think it’s pretty unhelpful to cultivate a discourse which simultaneously says “Users should have a fast path for these common use cases” and “Users should be able to get whichever window manager, packaging system, and customizations they want.” Those are both valuable goals, but the former inherently precludes the latter, especially in a world where some hardware companies, like Nvidia, are actively hostile to free desktop projects.
I switched from Windows to Mint a couple of years back for gaming, in a similar experiment to this one (only not so public). I had no issues at all: Steam was in the official applications and installed with one click. Every game that Steam claimed worked on Linux did work. There were issues with my non-standard multi-monitor setup (there were issues with this in Windows too, but they were worse in Linux*), but nothing that prevented playing the games. It was only once I enabled the Steam beta program, which sets Steam to attempt to open all games in Wine, that I had to get down in the weeds with configuring stuff, and some things didn’t work. Steam has pretty clear warnings about this when you turn it on though.
I feel like, for a tech tips site, those guys are pretty non-technical. I never really watched their stuff anyway, but now it seems like they should be calling me for help (and I am pretty noob when it comes to Linux). This is the biggest criticism for me of this whole experiment. If these guys are an authority on computer tech informing users, they should simply be better at what they do. It is almost like they are running an investment advice channel and going ‘oh no, I lost all my money investing in a random startup, guys don’t do the stock market, it’s broken’. They should be informing people interested in Linux what to do and what not to do, and if they are not qualified to do that they should say so and recommend alternative sources of advice.
*I have a suspicion most of these issues were on the application level, not the OS level. Games were probably getting the monitor list from the wrong place. Ironically, once I set my monitors up in the way that the developers on both Windows and Linux were expecting me to, the problems on Linux disappeared, but a few small issues persisted on Windows.
As a waggis / rolizins engineer, maybe I’m out of touch, but I don’t think “Doing this will cause everything to break. If you want everything to break, then type ‘Yes, please cause my computer to break!’” is quite as obscure a message as anything about florbing and confeling. This required not only a (very rare) bug in the dependency tree but also either a user that deliberately wanted to break his Linux install for YouTube content, or one that is the very embodiment of Dunning-Kruger.
Not only did the dependency tree break, but the package manager was smart enough to recognize that the dependency tree had broken, and stopped him from doing it and told him so. He then went out of his way and used another package management tool to override this and allow him to break his installation anyway. This tool then was also smart enough to recognize the dependency tree was broken, and again warned him what was about to happen. He read this message and copied the text from this warning into a confirmation prompt.
He could just as easily have typed sudo rm -rf /usr. He could just as easily have deleted system32 on Windows.
The only possible solution that could have prevented him from doing this would be to not tell him his own sudo password and to give him a babysitter to do everything requiring privilege escalation for him so he doesn’t hurt himself, but that solution has logistical issues when you try to scale it up to every desktop Linux user.
The prompt wasn’t “destroy my system”, it was “do as I say”, and the user said to install Steam.
No other operating system is stupid enough to delete itself when you tell it to add a new good application from a reputable publisher. Crappy Windows installers could damage the OS, but Steam doesn’t.
It’s normal for OSes to sound dramatic and ask for extra confirmation when installing software from unusual sources, so the alarming prompt could easily be interpreted as Linux also warning about dangers of “sideloading” a package, which can be dismissed as “I’m not installing malware, just Steam, so it’s fine”.
From the user’s perspective the screen contained “Install Steam, wall of technogibberish the user didn’t ask for nor care about, type ‘Yes, do as I say!’”. The system frequently requires typing weird commands, so requiring one more weird command wasn’t out of the ordinary.
The only possible solution… [condescending user blaming]
The real solution would be for Linux to work properly in the first place, and actually install Steam instead of making excuses. Linux is just an awful target for 3rd party applications, and even the other Linus knows this.
No other operating system is willing to give the user the ability to break the desktop environment intentionally (though I recall a lot of Windows bugs in the past that did this unintentionally). One of the fundamental problems Linux faces is that most users don’t actually want as much power as running as root gives you. They’ll say they do, but they really don’t, and their operating system choice generally reflects that.
It’s normal for OSes to sound dramatic and ask for extra confirmation when installing software from unusual sources
This is pretty obviously because it’s axiomatically impossible for the OS to actually tell if something the user does with sufficient privileges will break something (inter alia, you’d have to be able to solve the Halting Problem to do this). In this case, the package manager was obviously correct, which should be applauded. There are two obvious responses to this (maybe there are non-obvious ones I’m missing as well): restrict the user’s ability to do things to actions with a low likelihood of breaking the OS or trust the user to make a decision and accept the consequences after a warning.
Broadly, Windows, the MacOSs, and the mobile operating systems have been moving towards restricting the user’s ability to do risky things (which also includes a lot of things proficient system operators want to do). That seems to be in response to consumer demand but I don’t think that we should enshrine the desires of the bottom 60% of users (in terms of system operation competence) as the standard to which all systems should be designed. This is not related to an “it should just work” attitude towards 3rd party software as there’s generally been a significant decrease in things like OS API stability over the past two decades (e.g. this rant of Spolsky’s). Users just think that anything they want to use should “just work” while anything they don’t care about should be marginally supported to minimize cost: the problem is that many people want different things to work.
On the other hand, some users don’t want the operating system reasoning for them (at least some of the time). I don’t want an operating system “smart” enough to prevent me from doing something stupid on my project boxes or in a VM I’m playing with especially if it’s just something that looks stupid to the OS but I’m doing for some (moderately good) reason.
You’re boxing this into a dichotomy of restricting user or not, but this isn’t the issue here.
The issue is not about power, but about usability. You don’t need to block all failure paths. You need to communicate clearly and steer users towards success paths, so they never even come close to the dangerous commands by accident.
I wouldn’t say this is really about power, so much as control though I tend to be a bit of a pedant about defining “power”.
I think the communication here was reasonably good, though it could be improved. I think the real mistake Linus made was in choice of distribution. That is a real problem in the Linux community (and I think the one we should be focused upon here). I think the opportunity to improve communication here is marginal at best.
I do. I’m just saying that there’s nothing anyone could have done to prevent this except disallow even the root user from uninstalling xorg, and even then he could have just manually removed crucial files if he felt like it. OS maintainers are going to make mistakes occasionally. “Just don’t make mistakes ever” isn’t a viable strategy for avoiding things like this. What is a viable strategy is to build tools that detect and correct for errors like the one in Pop!_OS’s dependency tree. And that’s exactly what happened. He just disregarded the numerous protections from this bug that his OS afforded him.
“From the user perspective,” the screen contained a list of packages that were about to be installed, a list of packages that were about to be uninstalled, and a message saying that the packages that were about to be uninstalled were essential to the operation of his computer and he should stop now rather than electing to uninstall those packages, along with a prompt that very deliberately requires you to have read that warning in order to proceed.
The real solution would be for Linux to work properly in the first place, and actually install Steam instead of making excuses.
Linux worked properly, apt even worked properly. Pop!_OS’s dependency tree was briefly broken. The package manager then recognized there was something wrong and explicitly told him he was about to uninstall his desktop and that he shouldn’t do it. It wasn’t “destroy my system.” That was me being (generously) 5% hyperbolic. In reality it was a warning that he was about to uninstall several essential packages including his desktop and a recommendation that he shouldn’t do this unless that was what he wanted to do. He was then required to enter a very specific message which was part of that warning, verbatim.
Here’s the thing: no operating system has avoided pushing out a bad or bugged update periodically. What’s great about Linus’s example is that Pop!_OS pushed out a bad update but the error was limited to one package, and the package manager was smart enough to stop Linus from breaking his system, and told him that it had stopped him from breaking his system. Linus then decided to use another tool that would allow him to break his system. This tool too was smart enough to notice that the package system had broken, and prevented him from breaking his system. He then deliberately bypassed these safeties and uninstalled gdm and xorg.
What’s crucial to note here is that exactly nobody is making excuses for Pop!_OS — they messed up their dependency tree, yes — but also, this is a perfect example of all of these systems working exactly as intended. The package manager was smart enough to stop him from breaking his system even though the dependency tree was mangled, and he then overrode that and chose to break his system anyway. That’s more than can be said for many other operating systems. The tools he was using detected the error on Pop!_OS’s side and saved him.
It’s also worth noting that he literally didn’t brick his system, he could have fixed his machine if he’d just installed from the command line the same packages he had just uninstalled. Like, he didn’t actually break his system, he just uninstalled a few packages that were flagged as essential to stop newbies from uninstalling them because it might confuse them if they were uninstalled.
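For instance, a repair from a text console could have looked something like this (a sketch; the package names are my guess at the usual Pop!_OS/Ubuntu ones, not taken from the video):

    # switch to a TTY with Ctrl+Alt+F3, log in, then:
    sudo apt install pop-desktop gdm3 xserver-xorg
    sudo systemctl reboot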
Your assertion that nothing could be done is provably incorrect. Alpine doesn’t have this problem — by design — and it isn’t any less capable than the Debian family. It’s a matter of the design of the tools’ UI, and this part of apt is a poor design.
People don’t accidentally uninstall their OS when installing Steam on other OSes, because everywhere else “install a new user program” and “catastrophically alter the whole system” are separate commands.
Users generally don’t read walls of text. In usability circles this is accepted, and UI designers account for that, instead of wishing they had better users. Users aren’t illiterate, they just value their time, and don’t spend it on reading things that seem to have low value or relevance. The low signal-to-noise ratio of apt’s message and surprising behavior is apt’s problem, not user’s reading problem. And “this is just the way the tool works” is not a justification for the design.
The entire point of Linus’ challenge is that desktop Linux is full of barriers and traps for new users who don’t (yet) know what they’re doing.
I understand this. The problem I have with some of the complaints is that they proclaim one way or the other to be clearly better during the challenge. This is a bit more obvious in the file extension example[0]. I completely understand the experience is frustrating. But “it is frustrating for me because the system doesn’t behave like I expect from my knowledge of another system” doesn’t mean this system is bad.
Yes, systems try to adopt behavior from other systems to make it easier for users to adapt. But this has its downside, because you can’t change a bad design after users get used to it. In this example, users got used to ignoring errors and warnings and just “clicking OK”.
Explaining “well, it’s like that because you told it to florb the waggis instead of confeling rolizins, so it’s all your fault” may very well be technically correct, but it doesn’t change the fact that the OS hasn’t worked well for the user
I don’t want to imply they are dumb or just don’t want to learn the system. It is frustrating if a system doesn’t work the way you expect. I would like to see a discussion after the challenges with an expert explaining why the UI behaves differently.
When doing usability evaluations, it’s normal to discount specific solutions offered by frustrated users, but never the problems they face.
There were a few problems here:
Lack of hardware support. Sadly, that’s a broad problem, and difficult to fix.
User needed to download and run a script from GitHub. I think distros could improve here. For a random developer who wrote a script, it’s difficult to distribute the code in a better way. There’s a very high bar for getting something to be an official package, and hardly any other viable alternative. There are several different packaging formats (some tedious to build, some are controversial), a few unofficial package repositories, a few “app stores” for some distros. All this fragmentation is a lot of work and maintenance headache. It makes just dumping a script on GitHub very easy and attractive in comparison. It may not be a failing of any single person or distro, but it is a failing of “Linux desktop” in aggregate.
The browser and file manager did a crappy job by treating HTML with an .sh extension as if it were a totally normal thing a user may want to do. The fight about file extensions was lost in the ‘90s. I’ve been there, tweaking detection by magic bytes in my Directory Opus on AmigaOS, and fiddling with creator codes when copying stuff from classic MacOS. The reality is that file extensions exist, and are meaningful. No normal person stores web pages as “.sh” files.
At the risk of being that condescending Linux user (which would be pretty awful since I’m not really using Linux anymore) my main takeaway from these videos is “don’t use hipster distros”.
Or, okay, hipster distros is where innovation happens. I get it, Gentoo was a hipster distro when I started using it, too. Okay, maybe don’t recommend hipster distros to beginners?
I saw Manjaro mentioned here. I tried Manjaro. It’s not a beginners’ distro. It’s great if you’re a burned out Arch user and you like Arch but you already know the instructions for setting up a display manager by heart and if you have to do it manually again you’re going to go insane. There’s a (small!) group of people who want that, I get it. But why anyone would recommend what is effectively Arch and a rat’s nest of bash scripts held together with duct tape to people who wouldn’t know where to begin debugging a broken Arch installation is beyond me. I mean the installer is so buggy that half the time what it leaves you with is basically a broken Arch installation for heaven’s sake! Its main value proposition is in a bunch of pre-installed software, all of which can be trivially installed on Ubuntu.
I haven’t used Pop!_OS but IMHO a distribution that can’t get Steam right, Steam being one of the most popular Linux packages, is just not a good distribution. It’s particularly unsettling when it’s a distro that’s supposed to have some level of commercial backing; Steam, as one of the most popular packages, is presumably one of the packages that ought to get the most testing. Hell, even Debian has instructions that you can just copy-paste off their wiki without breaking anything. And the only reason why they’re “instructions”, not just apt install steam, is that – given their audience – the installation isn’t multilib by default.
There’s certainly a possibility that the problem here was in the proverbial space between the computer and the chair, sure. But if that’s the case again, maybe it’s just time we acknowledged that the way to get “better UX” (whatever that is this year) for Linux is not to ship Gnome with the umpteenth theme that looks like all other theme save for the colors and a few additional extensions. It’s safe to say that every combination of Gnome extensions has already been tried and that’s not where the magic usability dust is at. Until we figure it out, can we just go back to recommending Ubuntu, so that people get the same bad (I suppose?) UX, just on a distribution with more exposure (and, thus, testing) and support channels?
Also, it’s a little unsettling that the Linux community’s approach to usability hasn’t changed since the days of Mandrake, and is still stuck in the mentality of ESR’s ridiculous Aunt Tilly essay. Everyone raves about consistency and looking professional. Meanwhile, the most popular computer OS on the planet ships two control panels and looks like anime, and dragging things to the trash bin in the second most popular OS on the planet (which has also been looking like anime for a few years now) either deletes them or ejects them, which doesn’t seem to deter anyone from using them. Over here in FOSS land, the UI has been sanitized for consistency and distraction-free visuals to the point where it looks like a frickin’ anime hospital, yet installing Steam (whether through the terminal or the interface it makes no difference – you can click “Yes” just as easily as you can type “Yes”) breaks the system. Well, yeah, this is what you get if you treat usability in terms of “how it looks” and preconceived notions about “how it’s used”, rather than real-life data on how it’s used. It’s not an irredeemable state of affairs, but it will stay unredeemed as long as all the debate is going to be strictly in terms of professional-looking/consistent/beautiful/minimal/distraction-free interfaces and the Unix philosophy.
The issue about Linux distro here is that they didn’t know the differences between them, why that matters, and that Linux isn’t one thing. Without a knowledgeable person to ask what to use, this is how they ended up with these different flavours. They also didn’t know about desktop environments, or how much influence they have over their Linux experience.
It’s unfortunately a hard lens for many technical people to wrap their head around. Heck, we are starting to see people that don’t need to interact with hierarchical file systems anymore. Something natural to everyone here, but becoming a foreign concept to others.
Certainly. My response was mostly in the context of an underlying stream of “Ubuntu hate” that’s pretty prevalent in the circles of the Linux community that also have a lot of advice to give about what the best results for “best Linux distro for gaming” should be. I know I’m going to be obtuse again but if the l33t h4x0rz in the Linux community could just get over themselves and default to Ubuntu whenever someone says “I’ve never touched Linux before, how can I try it?” a lot of these problems, and several distributions that are basically just Ubuntu with a few preinstalled programs and a custom theme, would be gone.
There’s obviously a huge group of people who don’t know and are not interested in knowing what a distribution is, what their desktop environment is, and so on. As the Cheshire Cat would put it, then it doesn’t really matter which one they use, either, so they might as well use the one most people use, since (presumably) their bugs will be the shallowest.
I know this releases all sorts of krakens (BUT MINT WORKS BETTER OUT OF THE BOX AND HAS A VERY CONSISTENT INTERFACE!!!1!!) but the competition is a system whose out-of-the-box experience includes Candy Crush, forced updates, a highly comprehensive range of pre-installed productivity apps of like ten titles, featuring such amazing tools like Paint 3D and a Calculator that made the Win32 calculator one of the most downloaded programs in history, two control panels and a dark theme that features white titlebars. I’m pretty sure any distribution that doesn’t throw you to a command prompt on first boot can top that.
Oh, I totally agree, I was just clarifying that they did some googling to try and find something to use, and it’s how they ended up with this mess of difficulties.
I think you cut to the heart of the matter here. I also think the question they asked initially (what’s the “best” gaming Linux distro) wasn’t well formed for what they actually wanted: what the easiest to configure was. To forestall the “that’s a Linux problem” crowd, that’s an Internet problem, not a Linux problem. If you Google (or ddg or whatever) the wrong question, you’re going to get the wrong answer.
I think we have to resign ourselves to the fact that users generally don’t want to learn how to operate their systems and don’t want meaningful choices. Therefore, many users are not good candidates for a *nix.
Until we figure it out, can we just go back to recommending Ubuntu, so that people get the same bad (I suppose?) UX, just on a distribution with more exposure (and, thus, testing) and support channels?
I wish Ubuntu offered an easier flow for getting a distribution with the right drivers out of the gate. This is what Pop!_OS does (source):
Pop!_OS comes in two versions: Intel/AMD and NVIDIA. This allows us to include different settings and the proprietary NVIDIA driver for NVIDIA systems, ensuring the best performance and use of CUDA tools one command away. On Oryx Pro systems, you can even switch between Intel and Nvidia graphics using a toggle in the top right corner of your screen.
And you need to follow different instructions for AMD graphics.
Also if you buy a System76 laptop all the drivers for your computer come set up, no driver manager needed. With Ubuntu you can buy from Dell but not with the same variety of hardware as System76.
I agree that Ubuntu is a good option but I would like to see it improve in these aspects before I would recommend it to a random non-power user who wants to play video games.
I haven’t used Ubuntu in a while, and that page doesn’t help because the instructions look like they haven’t been updated since Ubuntu 12.04, but the way I remember it all you needed to do was go to “Additional Drivers” or whatever it was called, choose the proprietary driver, hit OK and reboot. Has that changed in the meantime? Last time I used a machine with an NVidia card I was running Arch and it was literally just pacman -S nvidia, please tell me Ubuntu didn’t make it more complicated than that!
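For what it’s worth, the command-line route on Ubuntu is, as far as I know, still about that short:

    sudo ubuntu-drivers devices      # list detected hardware and the recommended driver
    sudo ubuntu-drivers autoinstall  # install the recommended proprietary driver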
Also… is the overlap between “people who write CUDA code” and “people who can’t install the proprietary NVidia drivers” really that big? Or is this aimed at people using third-party CUDA applications, who know statistics but suck at computers in general (in which case I get the problem, I’ve taught a lot of engineers of the non-computer kind about Linux and… yeah).
Also if you buy a System76 laptop all the drivers for your computer come set up, no driver manager needed.
If you choose the “Ubuntu LTS” option when ordering, doesn’t it come with the right drivers preloaded? I mean… I get that Pop!_OS is their thing, but shipping a pre-installed but unconfigured OS is not exactly the kind of service I’d expect in that price range.
For a novice user, do you expect them to know before they download the OS whether they have an nVidia or AMD GPU?
I seem to recall that a big part of the motivation for the complex install process for the nVidia drivers was the GPL. The nVidia drivers contain two parts: a shim layer that is a derived work of the kernel (it hooks directly into kernel interfaces) and so must be released under a GPL-compatible license, and the proprietary driver itself, which is first developed on Windows and so is definitely not a derived work of the kernel and can be under any license. The proprietary drivers do not meet the conditions of the GPL, and so you cannot distribute the kernel if you bundle it with the drivers. The GPL is not an EULA, so it’s completely fine to download the drivers and link them with your kernel yourself. The GPL explicitly does not restrict use, so this is fine. But the result is something that you cannot redistribute.
FreeBSD and Solaris distributions do not have this problem and so can ship the nVidia drivers if they wish (PC-BSD and Nexenta both did). I wonder how Pop!_OS gets around this. Is it by being small and hoping no one sues them?
From what I can tell, Steam isn’t even open source. And while you assert it to be one of the most popular Linux packages, I hadn’t even heard of it until this video came up in all the non-gaming tech news sites despite having used Linux for 25+ years. Was it even a Pop!_OS package, or were they installing an Ubuntu package on an Ubuntu derivative and assuming it’d just work?
It’s proprietary, yeah, but I just feel like someone has to tell you that there are several orders of magnitude more Steam users than Linux desktop users, and it’s not only a package in Pop!_OS and Ubuntu, it’s been a package in Debian and just about every distro for the last decade.
I honestly have gotta applaud you for being productive enough a person to have never heard of Steam. If you look at the install data from popularity-contest, ignoring firmware and libraries (i.e. only looking at user-facing applications), Steam is the third most-installed non-free package on all Debian-based distros, behind unrar and rar. pkgstats.archlinux.de suggests Steam is installed on 36% of Arch Linux installations. Steam is not only an official package on Pop!_OS but one of the most installed packages on desktop Linux overall.
And while you assert it to be one of the most popular Linux packages, I hadn’t even heard of it until this video came up in all the non-gaming tech news sites despite having used Linux for 25+ years
Someone else already pointed out how popular it is but just for the record, any one of us is bound to not have heard about most of the things currently in existence, but that does not make them pop out of existence. Whether you’ve heard of it or not affects its popularity by exactly one person.
Also, lots of useful applications that people want aren’t even open source, and a big selling point of Pop!_OS is that it takes less fiddling to get those working (e.g. NVidia’s proprietary drivers). An exercise similar to this one carried out with, say, Dragora Linux, would’ve probably been a lot shorter.
Was it even a Pop!_OS package, or were they installing an Ubuntu package on an Ubuntu derivative and assuming it’d just work?
Most of Pop!_OS is Ubuntu packages on an Ubuntu derivative. Does it matter what repo it came from as long as apt was okay installing it?
Edit: to make the second point clear, Pop!_OS is basically Ubuntu with a few custom Gnome packages and a few other repackaged applications, most of the base system, and most of the user-facing applications, are otherwise identical to the Ubuntu packages (they’re probably rebuilt from deb-srcs). No idea if what they tried to install was one of the packages System76 actually repackages, or basically the same as in Ubuntu, but it came from their “official” channel. I.e. they didn’t grab the Ubuntu package off the Internet, dpkg -i it and proceed to wonder why it doesn’t work, they just did apt-get install steam, so yes, it’s a Pop!_OS package.
I mean, I have Big Opinions® on the subject, but my tl;dr is that Linux isn’t Windows, we shouldn’t give false expectations, have our own identity, etc. etc. But….
So he typed in “Yes, do as I say!” and his installation was broken. He claimed later: “the things that I did are not entirely ridiculous or unreasonable”. He ignored all the warnings and dictated “Yes, do as I say!” to the computer. How is this not a clear user error[0]?
I mean, the system should refuse to do that. Alpine’s package manager, among others, refuses to let the system enter a boned state. One of the Alpine developers rightly criticized Debian for this issue in apt, citing it as one of the reasons they stopped using Debian. The attention Linus brought to the problem, in an embarrassing light, was the push finally needed to fix it.
Knowing how Internet guides work, now all guides will say “apt --allow-solver-remove-essential <do dangerous stuff>” instead of “type ‘Yes, do as I say!’ at the prompt”.
I like Luke’s perspective that different distros should do different things. I think it’s reasonable for Arch to be a ‘power user distro’ that is willing to bork itself. But Pop!_OS is ‘an operating system for STEM and creative professionals’, so it probably should have some safeguards.
That being said, I don’t think Arch should ever be recommended to a brand-new user. Linus shouldn’t even be on Arch because 1) there should be better resources for picking a good distro for absolute beginners and 2) Pop!_OS never should have had that broken a Steam package in the first place.
That being said, I don’t think Arch should ever be recommended to a brand-new user.
I would qualify this; there are many users for whom Arch was their first distro and it went great, but the key thing is these are not your typical computer users. They are people who are technically minded (not necessarily with deep, deep knowledge of anything in particular, but they’re probably at least the person their friends ask for help), are up for and interested in learning about the system, and generally have been given some idea of what they’re getting into. That is to say, Arch is definitely for “power users,” but that set includes some users who have not actually used Linux before.
For my part, Arch was the first distro that was actually reliable enough for me to do more than play with; I spent a year or so fussing with other stuff while dual-booting Windows, and Arch is the first one that actually worked well enough for me to wipe the Windows partition and stay. This was 15 years ago and I haven’t left, though I keep eyeing NixOS these days.
I think at the time folks still had fresh memories of before Linux desktop environments were even a thing, and there was this mentality that the barrier to entry was mostly around user interfaces. People hadn’t really internalized the fact that Linux had caught up decently well even by then (this was KDE 3.x era), but the problem was stuff needed to work better out of the box, and it needed to not break whenever you upgraded to the next release of the distro.
The system had refused to do that. Then the user told the system to shut up and do as he said. You could argue that this should not be possible, but what if you are in a situation where you have fucked up your packages? The workaround should be available within the package manager, because without it you’d have to work around the package manager itself by deleting files and editing its database by hand.
First, we can’t get exactly the same issue, because under Windows there is no package manager accessible to third-party software.
Technically not true; Windows has had a package manager for a long time: the Windows Installer (MSI files). There are also APIs and supported methods for 3rd-party installers to register an install with the system. What’s historically been missing are official package repositories for installing and updating applications (à la APT, RPM, etc. repos). That’s slowly changing with the Microsoft Store, winget, and others, but this is an area where Linux has long been very far ahead.
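For example, with winget installing Steam is a one-liner (the package ID comes from winget’s community repository, if I recall it correctly):

    winget install --id Valve.Steam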
So let’s assume there is a Windows update which removes the wrong file: on install, the update would remove the wrong file and break your system.
This is incredibly rare. I won’t claim it hasn’t happened, but more common (while still very rare) is an update which causes a bluescreen on boot or exposes a bug in a critical system process. In either case, we’re talking very rare, but I’d suggest that’s true of Linux too.
Another example: the Steam installer manages to have a bug which removes some necessary files from your Windows installation. Is there anything in Windows that protects you from this bug[1]?
Yes, several things, and this is a genuine major contrast to Linux. Off the top of my head:
Windows system files cannot be modified by default, even with administrator privileges. You can’t simply run an elevated Command Prompt and run the equivalent of rm -rf C:\Windows. That’s because most operating system files are both owned and only writeable by a special account (TrustedInstaller). You can still modify or delete these files, but you have to jump through several hoops. At a minimum, you need administrator privileges (à la root), and would have to take ownership of the file(s) of interest and subsequently grant yourself the relevant privileges. There are other ways you could gain the relevant access, but the point is it’s not a thing you could do by accident. That’s similarly true for installers, which would also need to take the same approach.
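To give an idea of the hoops, from an elevated Command Prompt it’s something like this (file name purely illustrative):

    :: take ownership of a protected file, then grant yourself write access
    takeown /f C:\Windows\System32\example.dll
    icacls C:\Windows\System32\example.dll /grant Administrators:F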
Windows has long had numerous recovery options for when things go pear-shaped. Notable ones include Safe Mode and its various permutations (since forever), the ability to uninstall operating system updates (also forever), System Restore (since XP?), System Reset (Windows 10?), and a dedicated recovery partition with a minimal Windows 10 installation to serve as a recovery environment wholly independent of the main operating system. Obviously, none of these are a guarantee for recovery of an appropriately damaged system, but it’s long been the case that Microsoft has implemented numerous recovery/rollback mechanisms.
On Linux, it’s usually limited to one or more previous kernel versions, and that’s about it? Yes, there’s single-user mode, but that just drops you into a root shell, which is wholly unsuitable for non-experts to use.
Windows has had a package manager for a long time: the Windows Installer (MSI files). There are also APIs and supported methods for 3rd-party installers to register an install with the system.
I believe we use the same words for different things. When I talk about a package manager, I mean a system which provides packages and resolves dependencies. If I understand your comment correctly, an MSI file installs software and registers it. But there is no way for an MSI file to declare that it is incompatible with, say, version 3.6 of Explorer, so that on install the installer solves the dependency graph and presents what it needs to install and remove.
On Linux, it’s usually limited to one or more previous kernel versions, and that’s about it?
This depends on your system. On Debian-based OSes the packages are still in the package cache, so you can easily downgrade. There are other options which allow easy recovery from such bugs; most of the time they are not set up by default and still require some skill to solve your problem.
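Roughly like this (the file name is illustrative; whatever is actually in your cache will differ):

    # the previously installed .deb often still sits in apt's cache:
    ls /var/cache/apt/archives/
    sudo dpkg -i /var/cache/apt/archives/somepackage_1.0-1_amd64.deb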
I believe we use the same words for different things. When I talk about a package manager, I mean a system which provides packages and resolves dependencies. If I understand your comment correctly, an MSI file installs software and registers it. But there is no way for an MSI file to declare that it is incompatible with, say, version 3.6 of Explorer, so that on install the installer solves the dependency graph and presents what it needs to install and remove.
It’s true that MSI (and most competing technologies) generally will not compute and resolve a dependency graph for package installation, but it’s also worth noting this is in part because it’s far less applicable to Windows systems. As the operating system is a single unified system, versus hundreds or even thousands of discrete packages sourced from different projects and maintainers, it’s unusual for an application on Windows to have many dependencies. So in this respect the packaging tool’s functionality is very much in response to the needs of the underlying platform.
A system with the same sophistication for dependency resolution as the likes of Apt or Yum is just not as useful on Windows. Of course, that’s a separate argument from a system which provides a centralised catalogue of software à la Linux software repositories. That’s an area Windows is very much still playing catch-up on.
This depends on your system. On Debian-based OSes the packages are still in the package cache, so you can easily downgrade. There are other options which allow easy recovery from such bugs; most of the time they are not set up by default and still require some skill to solve your problem.
I think we have different definitions of easy here. Typically such an approach would minimally involve various command-line invocations to downgrade the package(s), potentially various dependency packages, and relying on cached package installers which could be removed at any time is less than ideal. Given the upstream repositories usually don’t to my knowledge maintain older package versions, once the cache is cleaned, you’re going to be in trouble. The point I’d make is that if something goes wrong with package installation that breaks your system, on most Linux distributions the facilities to provide automated or simple rollback are fairly minimal.
As the operating system is a single unified system, versus hundreds or even thousands of discrete packages sourced from different projects and maintainers
I doubt that Windows itself is not a modular system. The updater itself must also have some sort of dependency management. FreeBSD, as another unified OS, is currently working on a package management system for their base system.
A system with the same sophistication for dependency resolution as the likes of Apt or Yum is just not as useful on Windows
Why not? Currently all software ships its dependencies on its own, and an updater has to be implemented in every application. Maybe not with one big graph for all software, but with a graph for each installed program and with duplicate elimination.
I doubt that Windows itself is not a modular system. The updater itself must also have some sort of dependency management. FreeBSD, as another unified OS, is currently working on a package management system for their base system.
You’re right, Windows itself is very modular these days, but the system used for managing those modules and their updates is independent of other installers (inc. MSI). There’s some logic to this, given the updates are distributed as a single cumulative bundle, and MS clearly wanted to design something that met Windows needs, not necessarily broader generalised package dependency handling requirements. The granularity is also probably wrong for a more general solution (it’s excessively granular).
On my system, there’s around ~14,600 discrete components, going off the WinSxS directory.
Why not? Currently all software ships its dependencies on its own, and an updater has to be implemented in every application. Maybe not with one big graph for all software, but with a graph for each installed program and with duplicate elimination.
Several reasons. One is that most Windows software relies predominantly on Windows APIs which are already present, so there’s no need to install a multitude of libraries to provide required APIs, as is often the case on Linux. They’re already there.
Where there are 3rd-party dependencies, they’re usually a small minority of the application size, and the fact that software on Windows is much more likely to be closed source means it’s harder to standardise on a given version of a library. So if you were to try and unbundle 3rd-party dependencies and have them installed by package manager from a central repository, you’d also need to handle multiple shared library versions in many cases.
That’s a soluble problem, but it’s complex, and it’s unclear if the extra complexity is worth it relative to the problem being solved. I suspect the actual space savings would be minimal for the vast majority of systems.
I’m not saying it’s a bad idea, just that it’s solving a problem I’d argue is far less significant than in *nix land. Again, all of this is independent of centralised package repositories, as we’re starting to see with winget, scoop, choco, etc …
By default Snapper and Btrfs on SUSE Linux Enterprise Server are set up to serve as an “undo tool” for system changes made with YaST and zypper. Before and after running a YaST module or zypper, a snapshot is created.
Excellent. Like ECC RAM, those who are already expert enough to devise ways to do the task are given the tools.
This doesn’t happen on mainstream user-friendliness-oriented distros.
I do wonder about a Nix-based highly usable distribution. All the tools are there to implement these protections, lacking only a general user interface.
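As I understand the snapper workflow from the quote above, undoing a bad package operation is roughly:

    sudo snapper list              # find the pre/post snapshot pair for the change
    sudo snapper undochange 42..43 # revert what happened between them (numbers illustrative)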
That’s pretty cool. I hope it becomes more widely accessible on end-user distributions (I expect SLES is a pretty tiny minority of the overall desktop/laptop Linux userbase).
It was a good old package conflict, wasn’t it? The normal way this happens is if you try to install a package from a foreign distro.
Different distros have different versions of packages, so unless the foreign package’s dependencies happen to line up with every installed package, the only way to install the foreign package is going to be to uninstall everything that depends on a conflicting version of something, which can be quite a lot.
If so, I wouldn’t call it a “bug”, since that’s a term used for software – the package manager itself, not its input. For user expectations, this means that bugs are fixable, whereas package conflicts (at least of the self-inflicted kind) are not. The software can only heuristically refuse to do stupid things.
For those confused by the title, “Linus Gabriel Sebastian is a Canadian YouTube personality. He is the creator and host of the YouTube channel Linus Tech Tips.”[1] Hence LTT on the front page of the linked site.
Scrolling with the mouse wheel in KDE’s volume mixer applet scrolls through both devices and levels of devices.
I wonder what’s the good way to solve that one on the UI level. I really like that KDE allows you to scroll both on the volume bars and on the volume icon itself. (Which other systems don’t do)
I guess at least “don’t change volume until position scrolling has finished” check could work… But that may feel as clunky as the typing-timeout-based touchpad disabling.
I like the list and the challenge. I’m sure it will cause more pressure on some long-standing papercut issues. Apart from some really frustrating parts where Linus knows just enough to overcomplicate things and self-sabotage (like the GitHub part), it’s a fun UX study. I’ve had to start using Windows recently for the first time since w2k and I’m making my own list of WTF issues.
I need to go back and double check what his issue was, but maybe a modifier key (ctrl perhaps) or a middle mouse click to switch between the two options?
It’s basically: if you scroll the list of volumes and your mouse passes over a volume slider, you start scroll-adjusting that volume slider instead. You quickly learn to scroll with the mouse at the side of the window.
The application names don’t represent the jobs of those applications (e.g. “Kate”).
I generally don’t like to nitpick, but this is, frankly, some bullshit. There are tons of Windows applications whose names have nothing at all to do with their function. Hell, half of the software Linus shills on the daily - not that this is a bad thing, sponsorship is fine - is basically word soup. It’s especially annoying because the .desktop data for these programs includes their generic descriptions, so if you type “text editor”, you get Kate as a suggestion!
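For the record, the relevant part of Kate’s .desktop entry looks something like this (abridged; exact strings vary by version):

    [Desktop Entry]
    Name=Kate
    GenericName=Advanced Text Editor
    Comment=KDE Advanced Text Editor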
There are some real issues brought up by the LTT folks (although, I’d argue, nothing the free desktop community doesn’t already know), but this and several other points really come off to me as “the status quo is the only viable option until the alternatives are perfect in every way,” which is a classic deflection of any suggestion of change. It’s especially frustrating here because LTT is telling people not to try this as an option, despite these issues being ones that have not hampered any of the various people I’ve given free desktop based machines to over the years.
I wonder if people realise that writing a diatribe in which you “cleverly” come up with “proof” that every bad experience somebody has is “akshoolee really the user’s fault”… doesn’t improve your software?
Generally no. But we are bitten too often by dumb users and problems that need education to fix. And, ultimately, at some point, you need to trust the user.
It’s hard when you need to make it easier for a user, and often comes at the expense of power or elegance; the twin gods of application design don’t like being slighted.
[…] mak[ing] it easier for a user […] often comes at the expense of power or elegance
I, too, had assumed that; but through watching this saga, I am quickly learning to question that assumption. In my opinion, good design hides rare or dangerous functionality, while keeping it available and discoverable for those who truly need it.
On Windows you install Steam and it works. Installing Steam and it not working isn’t really an experience people have with Steam on Windows.
I guess? But the Linux (Pop!_OS?) equivalent of “I messed with some windows DLLs / registry trying to get Steam to work, despite warnings and now it doesn’t boot” is [0] kind of the only experience that was available? It seems like there was no way to install it and have it work, or even install it and have it just not work. The only way to install it broke the OS?
[0] Disclaimer: I didn’t watch the videos, so I’m going off my understanding of the comment I originally replied to
Not just that, but you actually do have a lot of tweaks to play around with. They’re not common knowledge because it’s incredibly rare to need them in order to get something like Steam working. You don’t really need them unless you’re developing software for Windows.
I had this “it’s a black box” impression for a long time, but 10+ years ago I worked in a Windows-only shop that did a lot of malware analysis and the like. It’s quite foreign, since it comes from a different heritage, but the array of “power tools” you have on Windows is comparable to that of Linux. The fact that typical users don’t need them as frequently is a good thing, not an evil conspiracy of closed-source vendors to make sure you don’t have control over your hardware.
That’s a bit hard to quantify, but sure they do. Just search for “Steam won’t start” or “steam installer fails” on Reddit or their forums. It’s also common enough for many SEO-spam sites to have listicles for that phrase that are actually steam-specific.
And my point was that this wasn’t the only experience available. The alternative was not to type “yes I’m sure I know what I’m doing” (or whatever the phrase was) when he did not. He went out of his way to break the system after the GUI installer refused to do it. I think you really should watch that fragment for the discussion context.
Presumably because it’s the primary platform they test for.
Of course with a simple installer (copy all files to a directory and add an entry to the Windows registry) it’s quite hard to have a bug which breaks your OS. But a simple installer doesn’t have the features of a package management system, e.g. a central update mechanism. I don’t want to say package managers are better than the installer approach used on Windows[0]. The problem I have with this case: it’s not that he clicked some random button and then everything was broken. He read the error, ignored all warnings, typed the prompt char by char, and then wondered why it went wrong.
I don’t say the UI[1] is perfect. The problem I have is this “I ignore all warnings and complain if it goes wrong” mentality[2]. apt is not a program which bugs you with unnecessary questions or warnings. Installing a package only asks for confirmation if it does more than just install the requested package. The annoying confirmation question appears only if you try to remove essential packages, and it is designed to create enough hassle to make the user question the command.
[0] I think systems with a package manager are better, but that is not the point of this comment
[1] The error message in the GUI and the handling in the command line
[2] Yes, some (or most) users don’t want to understand error messages, but shouldn’t they then stop at the error and look for (professional) help? And no, copy-pasting a command from a random blog post is not help if you don’t understand the error or the blog post.
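One more apt note on the above: apt can show what a command would do before doing anything, so a guide (or a cautious user) could have previewed the transaction first. A minimal sketch on a Debian/Ubuntu-family system:

    # -s / --simulate prints the would-be transaction without touching the system
    apt-get -s install steam
    # The output lists every package that would be installed or removed, so an
    # essential-package removal is visible before anything actually happens.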
The entire point of Linus’ challenge is that desktop Linux is full of barriers and traps for new users who don’t (yet) know what they’re doing.
Explaining “well, it’s like that because you told it to florb the waggis instead of confeling rolizins, so it’s all your fault” may very well be technically correct, but it doesn’t change the fact that the OS hasn’t worked well for the user. “I want to install Steam in 5 minutes without learning about package sudoku solvers, or bricking my computer” is an entirely reasonable use-case.
The web dev community had a reckoning with this, and thinking has changed from “users are too stupid to understand my precious site” to “all my new users know only other sites, so I must meet their expectations”. If Linux wants to get new users, it needs to be prepared for users who know only Windows, macOS, or even just Android/iOS.
That’s well and good, but there is a perfectly good fast path for this; install Pop!_OS or Ubuntu on a day where there’s not a bug in the packaging system, which is the vast majority of all days. Yep, it sucks that there was a bug, but that’s simply not going to affect anyone going forward - so why are LTT giving advice based on it?
For every distro D there exists a problem P that is solved in a distro E.
That endless cycle of “then abandon your whole OS and install a completely new one” thing is another annoying problem “Linux desktop” has. It’s not any single distro’s fault, but it’s a pain that users need to deal with.
In my case: I want to use Elementary, but I hosed it trying to update Nvidia drivers. So I was told to switch to Pop!_OS — they do it right. But this one gets stuck seemingly forever when trying to partition my disk, presumably because of the combination of NVMe and SATA that I have. Manjaro worked with my disks, but I’ve run into bugs in its window manager, which wouldn’t be an issue in Elementary. I still use macOS.
Right, I agree that in general this is a problem; we need better ways to integrate the best ideas from multiple projects.
But for the problem stated, which was “I want to install Steam in 5 minutes without learning about package sudoku solvers, or bricking my computer”, Pop!_OS or Ubuntu are the way to go. Your problem is not that; it’s “I want Pantheon and a fast Nvidia card,” and Nvidia have intentionally made that harder than it needs to be.
To be totally clear, I’m under no illusions that every user can simply pick up a free desktop and be on their way, but I think it’s pretty unhelpful to cultivate a discourse which simultaneously says “Users should have a fast path for these common use cases” and “Users should be able to get whichever window manager, packaging system, and customizations they want.” Those are both valuable goals, but the former inherently precludes the latter, especially in a world where some hardware companies, like Nvidia, are actively hostile to free desktop projects.
I switched from windows to Mint a couple of years back for gaming, in a similar experiment to this one (only not so public). I had no issues at all: steam was in the official applications and installed with one click. Every game that steam claimed worked on linux did work. There were issues with my non-standard multi-monitor setup (there were issues with this in windows too, but they were worse in linux*), but nothing that prevented playing the games. It was only once I enabled the steam beta program, which sets steam to attempt to open all games in wine, that I had to get down in the weeds with configuring stuff, and some things didn’t work. Steam has pretty clear warnings about this when you turn it on though.
I feel like for a tech tips site those guys are pretty non-technical. I never really watched their stuff anyway, but now it seems like they should be calling me for help (and I am pretty much a noob when it comes to linux). This is the biggest criticism for me of this whole experiment. If these guys are an authority on computer tech informing users, they should simply be better at what they do. It is almost like they are running an investment advice channel and going ‘oh no, I lost all my money investing in a random startup, guys don’t do the stock market, it’s broken’. They should be informing people interested in linux what to do and what not to do, and if they are not qualified to do that they should state that and recommend alternative sources of advice.
*I have a suspicion most of these issues were on the application level, not the OS level. Games were probably getting the monitors list from the wrong place. Ironically, once I set my monitors up in the way that the developers on both windows and linux were expecting me to, the problems on linux disappeared, but a few small issues persisted on windows.
As a waggis / rolizins engineer, maybe I’m out of touch, but I don’t think “Doing this will cause everything to break. If you want everything to break, then type ‘Yes, please cause my computer to break!’” is quite as obscure a message as anything about florbing and confeling. This required not only a (very rare) bug in the dependency tree but also either a user that deliberately wanted to break his Linux install for YouTube content, or one that is the very embodiment of Dunning-Kruger.
Not only did the dependency tree break, but the package manager was smart enough to recognize that the dependency tree had broken, and stopped him from doing it and told him so. He then went out of his way and used another package management tool to override this and allow him to break his installation anyway. This tool then was also smart enough to recognize the dependency tree was broken, and again warned him what was about to happen. He read this message and copied the text from this warning into a confirmation prompt.
He could just as easily have typed sudo rm -rf /usr. He could just as easily have deleted system32 on Windows. The only possible solution that could have prevented him from doing this would be to not tell him his own sudo password and to give him a babysitter to do everything requiring privilege escalation for him so he doesn’t hurt himself, but that solution has logistical issues when you try to scale it up to every desktop Linux user.
You need to have more empathy for the user.
The prompt wasn’t “destroy my system”, it was “do as I say”, and the user had said to install Steam.
No other operating system is stupid enough to delete itself when you tell it to add a new good application from a reputable publisher. Crappy Windows installers could damage the OS, but Steam doesn’t.
It’s normal for OSes to sound dramatic and ask for extra confirmation when installing software from unusual sources, so the alarming prompt could easily be interpreted as Linux also warning about dangers of “sideloading” a package, which can be dismissed as “I’m not installing malware, just Steam, so it’s fine”.
From user perspective the screen contained “Install Steam, wall of technogibberish user didn’t ask nor care for, type ‘Yes, do as I say!’”. The system frequently requires to type weird commands, so it requiring to type one more weird command wasn’t out of ordinary.
The real solution would be for Linux to work properly in the first place, and actually install Steam instead of making excuses. Linux is just an awful target for 3rd party applications, and even the other Linus knows this.
No other operating system is willing to give the user the ability to break the desktop environment intentionally (though I recall a lot of Windows bugs in the past that did this unintentionally). One of the fundamental problems Linux faces is that most users don’t actually want as much power as running as root gives you. They’ll say they do, but they really don’t, and their operating system choice generally reflects that.
This is pretty obviously because it’s axiomatically impossible for the OS to actually tell if something the user does with sufficient privileges will break something (inter alia, you’d have to be able to solve the Halting Problem to do this). In this case, the package manager was obviously correct, which should be applauded. There are two obvious responses to this (maybe there are non-obvious ones I’m missing as well): restrict the user’s ability to do things to actions with a low likelihood of breaking the OS or trust the user to make a decision and accept the consequences after a warning.
Broadly, Windows, the MacOSs, and the mobile operating systems have been moving towards restricting the user’s ability to do risky things (which also includes a lot of things proficient system operators want to do). That seems to be in response to consumer demand but I don’t think that we should enshrine the desires of the bottom 60% of users (in terms of system operation competence) as the standard to which all systems should be designed. This is not related to an “it should just work” attitude towards 3rd party software as there’s generally been a significant decrease in things like OS API stability over the past two decades (e.g. this rant of Spolsky’s). Users just think that anything they want to use should “just work” while anything they don’t care about should be marginally supported to minimize cost: the problem is that many people want different things to work.
On the other hand, some users don’t want the operating system reasoning for them (at least some of the time). I don’t want an operating system “smart” enough to prevent me from doing something stupid on my project boxes or in a VM I’m playing with especially if it’s just something that looks stupid to the OS but I’m doing for some (moderately good) reason.
You’re boxing this into a dichotomy of restricting user or not, but this isn’t the issue here.
The issue is not about power, but about usability. You don’t need to block all failure paths. You need to communicate clearly and steer users towards success paths, so they never even come close to the dangerous commands by accident.
I wouldn’t say this is really about power, so much as control though I tend to be a bit of a pedant about defining “power”.
I think the communication here was reasonably good, though it could be improved. I think the real mistake Linus made was in choice of distribution. That is a real problem in the Linux community (and I think the one we should be focused upon here). I think the opportunity to improve communication here is marginal at best.
I do. I’m just saying that there’s nothing anyone could have done to prevent this except disallow even the root user from uninstalling xorg, and even then he could have just manually removed crucial files if he felt like it. OS maintainers are going to make mistakes occasionally. “Just don’t make mistakes ever” isn’t a viable strategy for avoiding things like this. What is a viable strategy is to build tools that detect and correct for errors like the one in Pop!_OS’s dependency tree. And that’s exactly what happened. He just disregarded the numerous protections from this bug that his OS afforded him.
“From the user perspective,” the screen contained a list of packages that were about to be installed, a list of packages that were about to be uninstalled, and a message saying that the packages that were about to be uninstalled were essential to the operation of his computer and he should stop now rather than electing to uninstall those packages, along with a prompt that very deliberately requires you to have read that warning in order to proceed.
Linux worked properly, apt even worked properly. Pop!_OS’s dependency tree was briefly broken. The package manager then recognized there was something wrong and explicitly told him he was about to uninstall his desktop and that he shouldn’t do it. It wasn’t “destroy my system.” That was me being (generously) 5% hyperbolic. In reality it was a warning that he was about to uninstall several essential packages including his desktop and a recommendation that he shouldn’t do this unless that was what he wanted to do. He was then required to enter a very specific message which was part of that warning, verbatim.
Here’s the thing: no operating system has avoided pushing out a bad or bugged update periodically. What’s great about Linus’s example is that Pop!_OS pushed out a bad update but the error was limited to one package, and the package manager was smart enough to stop Linus from breaking his system, and told him that it had stopped him from breaking his system. Linus then decided to use another tool that would allow him to break his system. This tool too was smart enough to notice that the package system had broken, and prevented him from breaking his system. He then deliberately bypassed these safeties and uninstalled gdm and xorg.
What’s crucial to note here is that exactly nobody is making excuses for Pop!_OS — they messed up their dependency tree, yes — but also, this is a perfect example of all of these systems working exactly as intended. The package manager was smart enough to stop him from breaking his system even though the dependency tree was mangled, and he then overrode that and chose to break his system anyway. That’s more than can be said for many other operating systems. The tools he was using detected the error on Pop!_OS’s side and saved him.
It’s also worth noting that he literally didn’t brick his system, he could have fixed his machine if he’d just installed from the command line the same packages he had just uninstalled. Like, he didn’t actually break his system, he just uninstalled a few packages that were flagged as essential to stop newbies from uninstalling them because it might confuse them if they were uninstalled.
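To make that concrete, here is a rough sketch of the recovery that was available (pop-desktop is, to my knowledge, the metapackage that pulls the Pop!_OS desktop back in; treat the name as an assumption):

    # From a text console (e.g. Ctrl+Alt+F3) after the GUI is gone:
    sudo apt update
    sudo apt install pop-desktop   # assumed metapackage; pulls the session, gdm, xorg back in
    sudo reboot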
Your assertion that nothing could be done is provably incorrect. Alpine doesn’t have this problem — by design — and it isn’t any less capable than the Debian family. It’s a matter of the design of the tools’ UI, and this part of apt is poor design.
People don’t accidentally uninstall their OS when installing Steam on other OSes, because everywhere else “install a new user program” and “catastrophically alter the whole system” are separate commands.
Users generally don’t read walls of text. In usability circles this is accepted, and UI designers account for it instead of wishing they had better users. Users aren’t illiterate, they just value their time, and don’t spend it on reading things that seem to have low value or relevance. The low signal-to-noise ratio of apt’s message, and its surprising behavior, are apt’s problem, not the user’s reading problem. And “this is just the way the tool works” is not a justification for the design.
I do understand this. The problem I have with some of the complaints is that, during the challenge, they proclaim one way or the other to be clearly better. This is a bit more obvious in the file extension example[0]. I completely understand the experience is frustrating. But “it is frustrating for me because the system doesn’t behave like I expect based on my knowledge of another system” doesn’t mean this system is bad.
Yes, systems try to adopt behavior from other systems to make it easier for users to adopt them. But this has its downside, because you can’t change a bad design after the user gets used to it. In this example users got used to ignoring errors and warnings and just “clicking OK”.
I don’t want to imply they are dumb or just don’t want to learn the system. It is frustrating if a system doesn’t work the way you expect. I would like to see a discussion after the challenges with an expert explaining why the UI behaves differently.
[0] Which I won’t write about today, it’s late again
When doing usability evaluations, it’s normal to discount specific solutions offered by frustrated users, but never the problems they face.
There were a few problems here:
Lack of hardware support. Sadly, that’s a broad problem, and difficult to fix.
User needed to download and run a script from GitHub. I think distros could improve here. For a random developer who wrote a script, it’s difficult to distribute the code in a better way. There’s a very high bar for getting something to be an official package, and hardly any other viable alternative. There are several different packaging formats (some tedious to build, some are controversial), a few unofficial package repositories, a few “app stores” for some distros. All this fragmentation is a lot of work and maintenance headache. It makes just dumping a script on GitHub very easy and attractive in comparison. It may not be a failing of any single person or distro, but it is a failing of “Linux desktop” in aggregate.
Browser and file manager did a crappy job by treating HTML with a .sh extension as if it was a totally normal thing a user may want. The fight about file extensions was lost in the ’90s. I’ve been there, tweaking detection by magic bytes in my Directory Opus on AmigaOS, and fiddling with creator codes when copying stuff from classic MacOS. The reality is that file extensions exist, and are meaningful. No normal person stores web pages as “.sh” files.
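As a quick illustration of content versus name (driver.sh here is a hypothetical file like the one in the video, an HTML page saved under a script name):

    file --mime-type driver.sh
    # driver.sh: text/html    <- the content is a web page, whatever the extension says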
At the risk of being that condescending Linux user (which would be pretty awful since I’m not really using Linux anymore) my main takeaway from these videos is “don’t use hipster distros”.
Or, okay, hipster distros is where innovation happens. I get it, Gentoo was a hipster distro when I started using it, too. Okay, maybe don’t recommend hipster distros to beginners?
I saw Manjaro mentioned here. I tried Manjaro. It’s not a beginners’ distro. It’s great if you’re a burned out Arch user and you like Arch but you already know the instructions for setting up a display manager by heart and if you have to do it manually again you’re going to go insane. There’s a (small!) group of people who want that, I get it. But why anyone would recommend what is effectively Arch and a rat’s nest of bash scripts held together with duct tape to people who wouldn’t know where to begin debugging a broken Arch installation is beyond me. I mean the installer is so buggy that half the time what it leaves you with is basically a broken Arch installation for heaven’s sake! Its main value proposition is in a bunch of pre-installed software, all of which can be trivially installed on Ubuntu.
I haven’t used Pop!_OS, but IMHO a distribution that can’t get Steam right, Steam being one of the most popular Linux packages, is just not a good distribution. It’s particularly unsettling when it’s a distro that’s supposed to have some level of commercial backing; Steam, as one of the most popular packages, is presumably one of the packages that ought to get the most testing. Hell, even Debian has instructions that you can just copy-paste off their wiki without breaking anything. And the only reason why they’re “instructions”, not just apt install steam, is that – given their audience – the installation isn’t multilib by default (sketched below).
There’s certainly a possibility that the problem here was in the proverbial space between the computer and the chair, sure. But if that’s the case again, maybe it’s just time we acknowledged that the way to get “better UX” (whatever that is this year) for Linux is not to ship Gnome with the umpteenth theme that looks like all other themes save for the colors and a few additional extensions. It’s safe to say that every combination of Gnome extensions has already been tried, and that’s not where the magic usability dust is at. Until we figure it out, can we just go back to recommending Ubuntu, so that people get the same bad (I suppose?) UX, just on a distribution with more exposure (and, thus, testing) and support channels?
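Coming back to the Debian wiki point: those instructions amount to roughly the following (from memory, so treat it as a sketch; the wiki also has you enable the non-free section in sources.list first):

    sudo dpkg --add-architecture i386   # Steam needs 32-bit (multilib) packages
    sudo apt update
    sudo apt install steam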
Also, it’s a little unsettling that the Linux community’s approach to usability hasn’t changed since the days of Mandrake, and is still stuck in the mentality of ESR’s ridiculous Aunt Tillie essay. Everyone raves about consistency and looking professional. Meanwhile, the most popular computer OS on the planet ships two control panels and looks like anime, and dragging things to the trash bin in the second most popular OS on the planet (which has also been looking like anime for a few years now) either deletes them or ejects them, which doesn’t seem to deter anyone from using them. Over here in FOSS land, the UI has been sanitized for consistency and distraction-free visuals to the point where it looks like a frickin’ anime hospital, yet installing Steam (whether through the terminal or the interface, it makes no difference – you can click “Yes” just as easily as you can type “Yes”) breaks the system. Well, yeah, this is what you get if you treat usability in terms of “how it looks” and preconceived notions about “how it’s used”, rather than real-life data on how it’s used. It’s not an irredeemable state of affairs, but it will stay unredeemed as long as all the debate is going to be strictly in terms of professional-looking/consistent/beautiful/minimal/distraction-free interfaces and the Unix philosophy.
The issue with Linux distros here is that they didn’t know the differences between them, why that matters, or that Linux isn’t one thing. Without a knowledgeable person to ask what to use, this is how they ended up with these different flavours. They also didn’t know about desktop environments, or how much influence those have over their Linux experience.
It’s unfortunately a hard lens for many technical people to wrap their head around. Heck, we are starting to see people that don’t need to interact with hierarchical file systems anymore. Something natural to everyone here, but becoming a foreign concept to others.
Certainly. My response was mostly in the context of an underlying stream of “Ubuntu hate” that’s pretty prevalent in the circles of the Linux community that also have a lot of advice to give about what the best results for “best Linux distro for gaming” should be. I know I’m going to be obtuse again but if the l33t h4x0rz in the Linux community could just get over themselves and default to Ubuntu whenever someone says “I’ve never touched Linux before, how can I try it?” a lot of these problems, and several distributions that are basically just Ubuntu with a few preinstalled programs and a custom theme, would be gone.
There’s obviously a huge group of people who don’t know and are not interested in knowing what a distribution is, what their desktop environment is, and so on. As the Cheshire Cat would put it, then it doesn’t really matter which one they use, either, so they might as well use the one most people use, since (presumably) their bugs will be the shallowest.
I know this releases all sorts of krakens (BUT MINT WORKS BETTER OUT OF THE BOX AND HAS A VERY CONSISTENT INTERFACE!!!1!!) but the competition is a system whose out-of-the-box experience includes Candy Crush, forced updates, a highly comprehensive range of pre-installed productivity apps of like ten titles, featuring such amazing tools like Paint 3D and a Calculator that made the Win32 calculator one of the most downloaded programs in history, two control panels and a dark theme that features white titlebars. I’m pretty sure any distribution that doesn’t throw you to a command prompt on first boot can top that.
Oh, I totally agree, I was just clarifying that they did some googling to try and find something to use, and it’s how they ended up with this mess of difficulties.
I think you cut to the heart of the matter here. I also think the question they asked initially (what’s the “best” gaming Linux distro) wasn’t well formed for what they actually wanted: what the easiest to configure was. To forestall the “that’s a Linux problem” crowd, that’s an Internet problem, not a Linux problem. If you Google (or ddg or whatever) the wrong question, you’re going to get the wrong answer.
I think we have to resign ourselves to the fact that users generally don’t want to learn how to operate their systems and don’t want meaningful choices. Therefore, many users are not good candidates for a *nix.
I wish Ubuntu offered an easier flow for getting a distribution with the right drivers out of the gate. This is what Pop!_OS does: it offers a separate download image with the NVIDIA proprietary driver already included (source).
IMO this is superior to Ubuntu, where you need to follow complex instructions to get the NVIDIA proprietary drivers: https://help.ubuntu.com/community/BinaryDriverHowto/Nvidia
And you need to follow different instructions for AMD graphics.
Also if you buy a System76 laptop all the drivers for your computer come set up, no driver manager needed. With Ubuntu you can buy from Dell but not with the same variety of hardware as System76.
I agree that Ubuntu is a good option but I would like to see it improve in these aspects before I would recommend it to a random non-power user who wants to play video games.
I haven’t used Ubuntu in a while, and that page doesn’t help because the instructions look like they haven’t been updated since Ubuntu 12.04, but the way I remember it, all you needed to do was go to “Additional Drivers” or whatever it was called, choose the proprietary driver, hit OK and reboot. Has that changed in the meantime? Last time I used a machine with an NVidia card I was running Arch and it was literally just pacman -S nvidia, please tell me Ubuntu didn’t make it more complicated than that!
Also… is the overlap between “people who write CUDA code” and “people who can’t install the proprietary NVidia drivers” really that big? Or is this aimed at people using third-party CUDA applications, who know statistics but suck at computers in general (in which case I get the problem, I’ve taught a lot of engineers of the non-computer kind about Linux and… yeah).
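To answer the “has that changed” question above: if memory serves, not much; recent Ubuntu releases also ship a command-line helper that does what “Additional Drivers” does. A sketch (the tool exists, though whether it’s preinstalled on every flavour I can’t say):

    ubuntu-drivers devices           # list detected hardware and the recommended driver
    sudo ubuntu-drivers autoinstall  # install the recommended (proprietary) driver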
If you choose the “Ubuntu LTS” option when ordering, doesn’t it come with the right drivers preloaded? I mean… I get that Pop!_OS is their thing, but shipping a pre-installed but unconfigured OS is not exactly the kind of service I’d expect in that price range.
For a novice user, do you expect them to know before they download the OS whether they have an nVidia or AMD GPU?
I seem to recall that a big part of the motivation for the complex install process for the nVidia drivers was the GPL. The nVidia drivers contain a shim layer that is a derived work of the kernel (it hooks directly into the interfaces of the kernel and of the proprietary driver) and so must be released under a GPL-compatible license, and the proprietary driver itself, which is first developed on Windows and so is definitely not a derived work of the kernel and can be under any license. The proprietary drivers do not meet the conditions of the GPL, and so you cannot distribute the kernel if you bundle it with the drivers. The GPL is not an EULA, and so it’s completely fine to download the drivers and link them with your kernel; the GPL explicitly does not restrict use, so this is fine. But the result is something that you cannot redistribute.
FreeBSD and Solaris distributions do not have this problem and so can ship the nVidia drivers if they wish (PC-BSD and Nexenta both did). I wonder how Pop!_OS gets around this. Is it by being small and hoping no one sues them?
From what I can tell, Steam isn’t even open source. And while you assert it to be one of the most popular Linux packages, I hadn’t even heard of it until this video came up in all the non-gaming tech news sites, despite having used Linux for 25+ years. Was it even a Pop!_OS package, or were they installing an Ubuntu package on an Ubuntu derivative and assuming it’d just work?
it’s proprietary, yeah, but i just feel like someone has to tell you that there are several orders of magnitude more Steam users than Linux desktop users, and it’s not only a package in Pop!_OS and Ubuntu, it’s a package in Debian and just about every distro for the last decade.
i honestly have gotta applaud you for being productive enough a person to have never heard of Steam. if you look at the install data from popularity-contest, ignoring firmware and libraries (i.e. only looking at user-facing applications), Steam is the third most-installed non-free package on all Debian-based distros, behind unrar and rar. pkgstats.archlinux.de suggests Steam is installed on 36% of Arch Linux installations. Steam is not only an official package on Pop!_OS but one of the most installed packages on desktop Linux overall.
Someone else already pointed out how popular it is, but just for the record: any one of us is bound to not have heard about most of the things currently in existence, but that does not make them pop out of existence. Whether you’ve heard of it or not affects its popularity by exactly one person.
Also, lots of useful applications that people want aren’t even open source, and a big selling point of Pop!_OS is that it takes less fiddling to get those working (e.g. NVidia’s proprietary drivers). An exercise similar to this one carried out with, say, Dragora Linux, would’ve probably been a lot shorter.
Most of Pop!_OS is Ubuntu packages on an Ubuntu derivative. Does it matter what repo it came from as long as apt was okay installing it?
Edit: to make the second point clear, Pop!_OS is basically Ubuntu with a few custom Gnome packages and a few other repackaged applications; most of the base system, and most of the user-facing applications, are otherwise identical to the Ubuntu packages (they’re probably rebuilt from deb-srcs). No idea if what they tried to install was one of the packages System76 actually repackages, or basically the same as in Ubuntu, but it came from their “official” channel. I.e. they didn’t grab the Ubuntu package off the Internet, dpkg -i it and proceed to wonder why it doesn’t work, they just did apt-get install steam, so yes, it’s a Pop!_OS package.
I mean, I have Big Opinions® on the subject, but my tl;dr is that Linux isn’t Windows, we shouldn’t give false expectations, have our own identity, etc. etc. But….
I mean, the system should refuse to do that. Alpine’s package manager and others refuse to allow the system to enter a boned state. One of the Alpine developers was rightly criticizing Debian for this issue in apt, citing it as one of the reasons why they stopped using Debian. The attention Linus brought to the problem, by putting it in an embarrassing light, was the push finally needed to fix it.
Knowing how Internet guides work, now all guides will say “apt --allow-solver-remove-essential <do dangerous stuff>” instead of “Type Yes, do as I say at the prompt”.
I like luke’s perspective that some distros should do different things. I think it’s reasonable for Arch to be a ‘power user distro’ that is willing to bork itself. But Pop!_OS is ‘an operating system for STEM and creative professionals’, so it probably should have some safeguards.
That being said, I don’t think Arch should ever be recommended to a brand new user. Linus shouldn’t even be on Arch because 1) there should be better resources for picking a good distro for absolute beginners and 2) Pop!_OS never should have had that broken a Steam package in the first place.
I would qualify this; there are many users for whom Arch was their first distro and it went great, but the key thing is these are not your typical computer users; they are people who are technically minded (not necessarily with deep, deep knowledge of anything in particular, but they’re probably at least the person their friends ask for help), are up for and interested in learning about the system, and generally have been given some idea of what they’re getting into. That is to say, Arch is definitely for “power users,” but that set includes some users who have not actually used Linux before.
For my part, Arch was the first distro that was actually reliable enough for me to do more than play with; I spent a year or so fussing with other stuff while dual booting windows, and Arch is the first one that actually worked well enough for me to wipe the windows partition and stay. This was 15 years ago and I haven’t left, though I keep eyeing NixOS these days.
I think at the time folks still had fresh memories of before Linux desktop environments were even a thing, and there was this mentality that the barrier to entry was mostly around user interfaces. People hadn’t really internalized the fact that Linux had caught up decently well even by then (this was KDE 3.x era), but the problem was stuff needed to work better out of the box, and it needed to not break whenever you upgraded to the next release of the distro.
The system had refused to do that. Then the user told the system to shut up and do as he said. You could argue that this should not be possible, but what if you are in a situation where you have fucked up your packages? The way around should be present within the package manager, because without it you’d have to work around the package manager itself, deleting files and changing its database by hand.
To answer some of your questions:
Technically not true; there is a Windows package manager and has been for a long time, and that’s the Windows Installer (MSI files). There are also APIs and supported methods for 3rd-party installers to register an install with the system. What’s historically been missing are official package repositories for installing and updating applications (a la APT, RPM, etc. repos). That’s slowly changing with the Microsoft Store, winget, and others (see the sketch below), but this is an area Linux has long been very far ahead in.
This is incredibly rare. I won’t claim it hasn’t happened, but more common (while still very rare) is an update which causes a bluescreen on boot or exposes a bug in a critical system process. In either case, we’re talking very rare, but I’d suggest that’s true of Linux too.
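On the repositories point, winget’s flow already looks a lot like a Linux package manager. A small sketch (the exact package ID is my assumption and may differ):

    winget search steam          # query the central catalogue
    winget install Valve.Steam   # install by package ID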
Yes, several things, and this is a genuine major contrast to Linux. Off the top of my head:
Windows system files cannot be modified by default, even with administrator privileges. You can’t simply run an elevated Command Prompt and run the equivalent of rm -rf C:\Windows. That’s because most operating system files are both owned and only writeable by a special account (TrustedInstaller). You can still modify or delete these files, but you have to jump through several hoops. At a minimum, you need administrator privileges (a la root), and would have to take ownership of the file(s) of interest and subsequently grant yourself the relevant privileges. There are other ways you could gain the relevant access, but the point is it’s not a thing you could do by accident. That’s similarly true for installers, which would also need to take the same approach.
Windows has long had numerous recovery options for when things go pear-shaped. Notable ones include Safe Mode and its various permutations (since forever), the ability to uninstall operating system updates (also forever), System Restore (since XP?), System Reset (Windows 10?), and a dedicated recovery partition with a minimal Windows 10 installation to serve as a recovery environment wholly independent of the main operating system. Obviously, none of these is a guarantee of recovery for an appropriately damaged system, but it’s long been the case that Microsoft has implemented numerous recovery/rollback mechanisms.
On Linux, it’s usually limited to one or more previous kernel versions, and that’s about it? Yes, there’s single-user mode, but that just drops you into a root shell, which is wholly unsuitable for non-experts to use.
I believe we use the same words for different things. When I talk about a package manager I mean a system which provides packages and resolves dependencies. If I understand your comment correctly, an MSI file installs software and registers the software. But there is no way for an MSI file to claim it is incompatible with version 3.6 of Explorer, such that on install the installer resolves the dependency graph and presents what it needs to install and remove.
This depends on your system. On Debian-based OSes the packages are still in the package cache, so you can easily downgrade. There are other options which allow easy recovery from such bugs; they are most of the time not set up by default and still require some skill to solve your problem.
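A minimal sketch of such a downgrade, with a hypothetical package foo:

    ls /var/cache/apt/archives/    # previously fetched .debs live here
    sudo apt install foo=1.2.3-1   # pin back to a known-good version, or:
    sudo dpkg -i /var/cache/apt/archives/foo_1.2.3-1_amd64.deb   # install the cached file directly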
It’s true that MSI (and most competing technologies) generally will not compute and resolve a dependency graph for package installation, but it’s also worth noting this is in part because it’s far less applicable to Windows systems. As the operating system is a single unified system, versus hundreds or even thousands of discrete packages sourced from different projects and maintainers, it’s unusual for an application on Windows to have many dependencies. So in this respect the packaging tool’s functionality is very much a response to the needs of the underlying platform.
A system with the same sophistication for dependency resolution as the likes of Apt or Yum is simply not as useful on Windows. Of course, that’s a separate argument from a system which provides a centralised catalogue of software a la Linux software repositories. That’s an area Windows is very much still playing catch-up on.
I think we have different definitions of easy here. Typically such an approach would minimally involve various command-line invocations to downgrade the package(s), and potentially various dependency packages, and relying on cached package installers which could be removed at any time is less than ideal. Given the upstream repositories don’t, to my knowledge, maintain older package versions, once the cache is cleaned you’re going to be in trouble. The point I’d make is that if something goes wrong with package installation that breaks your system, on most Linux distributions the facilities to provide automated or simple rollback are fairly minimal.
I would doubt that Windows itself has no modular system; the updater itself must have some sort of dependency management. FreeBSD, as another unified OS, is currently working on a package management system for its base system.
Why not? Currently every piece of software ships its dependencies on its own, and an updater has to be implemented in each application. Maybe not with one big graph for all software, but with a graph for each installed program and with duplicate elimination.
You’re right, Windows itself is very modular these days, but the system used for managing those modules and their updates is independent of other installers (inc. MSI). There’s some logic to this, given the updates are distributed as a single cumulative bundle, and MS clearly wanted to design something that met Windows needs, not necessarily broader generalised package dependency handling requirements. The granularity is also probably wrong for a more general solution (it’s excessively granular).
On my system, there are around 14,600 discrete components, going off the WinSxS directory.
Several reasons. One is that most Windows software predominantly relies on Windows APIs which are already present, so there’s no need to install a multitude of libraries to provide required APIs, as is often the case on Linux. They’re already there.
Where there are 3rd-party dependencies, they’re usually a small minority of the application size, and the fact that software on Windows is much more likely to be closed source means it’s harder to standardise on a given version of a library. So if you were to try and unbundle 3rd-party dependencies and have them installed by package manager from a central repository, you’d also need to handle multiple shared library versions in many cases.
That’s a soluble problem, but it’s complex, and it’s unclear if the extra complexity is worth it relative to the problem being solved. I suspect the actual space savings would be minimal for the vast majority of systems.
I’m not saying it’s a bad idea, just that it’s solving a problem I’d argue is far less significant than in *nix land. Again, all of this is independent of centralised package repositories, as we’re starting to see with winget, scoop, choco, etc.
https://documentation.suse.com/sles/11-SP4/html/SLES-all/cha-snapper.html
Excellent. Like ECC RAM, those who are already expert enough to devise ways to do the task are given the tools.
This doesn’t happen on mainstream user-friendliness-oriented distros.
I do wonder about a Nix-based highly usable distribution. All the tools are there to implement these protections, lacking only a general user interface.
I think that’s an unfair summary. Implementing this properly takes time, and few distros have even started to default to filesystems where this is possible. It’s coming to desktops too: https://fedoraproject.org/wiki/Changes/BtrfsWithFullSystemSnapshots
Of course it’s coming.
I still think the criticisms are valid and help drive the arrival of these technologies for the common user.
That’s pretty cool. I hope it becomes more widely accessible on end-user distributions (I expect SLES is a pretty tiny minority of the overall desktop/laptop Linux userbase).
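For the curious, snapper’s workflow is roughly the following (a sketch, assuming a btrfs root with snapper already configured; the snapshot numbers are made up):

    sudo snapper create --description "before steam install"
    sudo snapper list                 # shows numbered snapshots
    sudo snapper undochange 41..42    # revert filesystem changes made between two snapshots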
It was a good old package conflict, wasn’t it? The normal way this happens is if you try to install a package from a foreign distro.
Different distros have different versions of packages, so unless the foreign package’s dependencies happen to line up with every installed package, the only way to install the foreign package is going to be to uninstall everything that depends on a conflicting version of something, which can be quite a lot.
If so, I wouldn’t call it a “bug”, since that’s a term for the software – the package manager itself – not its input. For user expectations, this means that bugs are fixable, whereas package conflicts (at least of the self-inflicted kind) are not. The software can only heuristically refuse to do stupid things.
For those confused by the title, “Linus Gabriel Sebastian is a Canadian YouTube personality. He is the creator and host of the YouTube channel Linus Tech Tips.”[1] Hence LTT on the front page of the linked site.
[1] https://en.wikipedia.org/wiki/Linus_Sebastian
Which one? I don’t drop my electronics.
Scrolling with the mouse wheel in KDE’s volume mixer applet scrolls through both devices and levels of devices.
I wonder what’s the good way to solve that one on the UI level. I really like that KDE allows you to scroll both on the volume bars and on the volume icon itself. (Which other systems don’t do.)
I guess at least “don’t change volume until position scrolling has finished” check could work… But that may feel as clunky as the typing-timeout-based touchpad disabling.
I like the list and the challenge. I’m sure it will cause more pressure on some long-standing papercut issues. Apart from some really frustrating parts where Linus knows just enough to overcomplicate things and self-sabotage (like the GitHub part), it’s a fun UX study. I’ve had to start using Windows recently for the first time since w2k and I’m making my own list of WTF issues.
I need to go back and double check what his issue was, but maybe a modifier key (ctrl perhaps) or a middle mouse click to switch between the two options?
It’s basically: if you scroll the list of volume controls and your mouse goes over a volume slider, you’ll start scroll-adjusting that volume slider instead. You learn to scroll with the mouse on the side of the window quickly.
https://bugs.kde.org/show_bug.cgi?id=385270
yes, this is something that I really miss on Windows. to whoever thought of adding this feature to KDE – thank you!
The application names don’t represent the jobs of those applications (e.g. “Kate”).
I generally don’t like to nitpick, but this is, frankly, some bullshit. There are tons of Windows applications whose names have nothing at all to do with their function. Hell, half of the software Linus shills on the daily - not that this is a bad thing, sponsorship is fine - is basically word soup. It’s especially annoying because the .desktop data for these programs includes their generic descriptions, so if you type “text editor”, you get Kate as a suggestion!
There are some real issues brought up by the LTT folks (although, I’d argue, nothing the free desktop community doesn’t already know), but this and several other points really come off to me as “the status quo is the only viable option until the alternatives are perfect in every way,” which is a classic deflection of any suggestion of change. It’s especially frustrating here because LTT is telling people not to try this as an option, despite these issues being ones that have not hampered any of the various people I’ve given free desktop based machines to over the years.
I wonder if people realise that writing a diatribe in which you “cleverly” come up with “proof” that every bad experience somebody has is “akshoolee really the user’s fault”… doesn’t improve your software?
Generally no. But we are bitten too often by dumb users and problems that need education to fix. And, ultimately, at some point, you need to trust the user.
It’s hard when you need to make it easier for a user, and often comes at the expense of power or elegance; the twin gods of application design don’t like being slighted.
I, too, had assumed that; but through watching this saga, I am quickly learning to question that assumption. In my opinion, good design hides rare or dangerous functionality, while keeping it available and discoverable for those who truly need it.
What is this about? There’s no link to anything?
See https://old.reddit.com/r/linux/comments/r9gow0/a_list_of_issues_linus_and_luke_experienced/hnbzhi1/
can y’all do me a favor and specify which Linus you’re talking about?