1. 21
  1.  

  2. 7

    I don’t really mind personally, but I could see someone with an extremely low internet quota, on cellular, or on holiday wanting at least to choose when an update is downloaded. It would annoy me if I went to another country with my laptop, got a new mobile connection, and blew through it in the first hour because something wanted to update.

    Windows does it as well, and that actually does annoy me. I don’t mind it downloading the update on my wired connection and even installing as much as possible, but don’t show me a dialog that says “Reboot now, in 10 mins, 1 hour, or 4 hours.” I often delay it once but am not there to delay it again. It’s my computer, let me control it; why isn’t there an “Apply updates during my next reboot” option? Why can’t that just be the default behaviour so we can get rid of the dialog? Windows could even wait and see whether I reboot in the next 3 days, say, and then suggest that it’s time to reboot; it could even show me the uptime and shame me into rebooting my poor, poor consumer hardware!

    1. 6

      I’ve also had this problem, so I have this mess of running around trying to turn off everything that auto-updates without asking, while also making sure I don’t forget to update those things occasionally. On phones themselves the problem is solved well enough: both iOS and Android can time their own updates, of both the OS and apps, to download over wifi and not blow through your data quota. But laptops still assume that all data connections are equal and unmetered, and don’t give you a way to mark some wifi connections as unsuitable for unnecessary data transfers (a rough sketch of what such a check could look like is below).

      I could imagine Apple doing something about this in the future within their own ecosystem. For people who use both a Mac laptop and an iPhone, they’ve been moving to smarter handling of the connection via their “Continuity” features (e.g. phone-call hand-off between devices), so they could conceivably also apply a more unified data policy.
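
      For what it’s worth, the missing laptop feature doesn’t have to be complicated. Here’s a rough sketch (my own illustration, not any OS’s actual mechanism) of an update agent that checks a user-maintained list of metered networks before pulling anything big; the config file path is made up, and the SSID lookup assumes a Linux machine with NetworkManager’s nmcli:

      ```python
      # Hypothetical sketch: skip large downloads on networks the user has
      # flagged as metered. Not a real OS API; the config file and the
      # nmcli-based SSID lookup are assumptions for illustration.
      import json
      import subprocess
      from pathlib import Path

      METERED_SSIDS_FILE = Path.home() / ".config" / "metered-ssids.json"

      def current_ssid():
          """Best-effort SSID lookup via NetworkManager's nmcli."""
          try:
              out = subprocess.run(
                  ["nmcli", "-t", "-f", "active,ssid", "dev", "wifi"],
                  capture_output=True, text=True, check=True,
              ).stdout
          except (OSError, subprocess.CalledProcessError):
              return None
          for line in out.splitlines():
              active, _, ssid = line.partition(":")
              if active == "yes":
                  return ssid
          return None

      def may_download_large_update():
          """Allow big downloads unless the current wifi network is flagged as metered."""
          try:
              metered = set(json.loads(METERED_SSIDS_FILE.read_text()))
          except (OSError, ValueError):
              metered = set()
          ssid = current_ssid()
          # Wired or unknown connections are treated as unmetered here.
          return ssid is None or ssid not in metered
      ```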

      1. 5

        +1 for Macs knowing when they’re tethered and not downloading software updates over 3G data. This recently happened to me. Now I turn off tethering as soon as I’ve finished, but that doesn’t stop the machine from using extra bandwidth while I’m on the connection.

        1. 3

          Windows lets you mark a connection as metered as well, though it doesn’t automatically know when it’s tethered to an iPhone.

        2. 6

          Chrome in particular uses efficient delta encoding to shrink updates to the tens of kilobytes. Meanwhile, “it takes 87 requests and 7MB of data transfer to read The Verge’s 1600-word article on why the mobile web sucks.” And Chrome’s autoupdates are silent, never presenting the user with an annoying Windows nag dialog telling them to reboot.

          So neither of these complaints really apply to Chrome. Indeed, Chrome is really the model of how to ensure that users are protected against security flaws from obsolete software: keep the software up-to-date without getting in their way.
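
          To make the delta-encoding point concrete: the updater ships only the bytes that changed and tells the client to copy the rest from the binary it already has. Here’s a toy illustration of that idea; Chrome’s real updater uses bsdiff/Courgette-style binary diffing rather than Python’s difflib, so treat this as the shape of the technique, not its implementation:

          ```python
          # Toy delta-encoded update: describe the new binary as "copy these
          # ranges from the old one" plus a few literal byte runs. Only an
          # illustration of the idea, not Chrome's actual patch format.
          from difflib import SequenceMatcher

          def make_delta(old: bytes, new: bytes):
              ops = []
              for tag, i1, i2, j1, j2 in SequenceMatcher(a=old, b=new, autojunk=False).get_opcodes():
                  if tag == "equal":
                      ops.append(("copy", i1, i2))        # client already has these bytes
                  elif j2 > j1:
                      ops.append(("literal", new[j1:j2])) # only the changed bytes go over the wire
              return ops

          def apply_delta(old: bytes, ops):
              out = bytearray()
              for op in ops:
                  out += old[op[1]:op[2]] if op[0] == "copy" else op[1]
              return bytes(out)

          old = b"lots of code that did not change between releases " * 50 + b"version=44"
          new = old.replace(b"version=44", b"version=45")
          delta = make_delta(old, new)
          assert apply_delta(old, delta) == new
          # The patch is tiny compared with shipping `new` in full:
          print(sum(len(op[1]) for op in delta if op[0] == "literal"), "literal bytes vs", len(new))
          ```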

        3. 7

          This is also why I usually hate web apps. Developers just update them whether I like it or not. I wish web apps had versions like normal apps do.

          But now it seems even normal apps like Chrome don’t give you that option anymore. Fuck the user, right?

          Maybe this is some weird attempt to make sure you know that you have no control over your machine. Give up petty human!

          1. 5

            Well, look at it this way…

            The compact, kinda, was “Developers will fix bugs and not write terrible software” on the one side, and “Users will promptly update when given free updates” on the other.

            Funny thing–developers released bad software, and users pathologically failed to update things.

            Developers have gotten a little better about software, and have removed the option for users to screw up and not update.

            Users–especially the casual users we all cultivate online–have shown themselves unwilling to do the right thing, and it’s cost many man-years of development time. They’re getting better than they deserve.

            EDIT:

            To wit, look how far back the entire web development world has been held because of legacy users running IE6 or something, unwilling to update because “it might break something”.

            The zen of software is to update frequently when you need to fix bugs or maintain compatibility. Not doing this is a Bad Idea.

            1. 5

              “Developers have gotten a little better about software”

              Eh, part of the problem is that I doubt this is true. Developers really do break things when they update, so not having control over when to update is very problematic, especially with professional software. If developers never broke things when updating, gave suitable advance warning before deprecating features, etc., it wouldn’t be a problem, but that’s not how things work. Even SaaS stuff I’ve paid money for is typically not very good, and not very consistent. With versioned software that has oddities, you can at least develop workarounds for the common issues, and when working in a team you can all settle on the same workflow and get on with real work. But with webapps that change every 2 weeks you can’t even develop usable workarounds!

              Android apps are also Very Bad Quality on average, and the constant updating makes them even worse, because it often actually breaks things that previously used to work, and when it doesn’t break things it gratuitously changes them. But fortunately I don’t currently use any Android apps for “real” work.

              I would posit: Developers, especially the casual developers cultivated in the app/webapp space, have shown themselves unwilling to do the right thing, and it’s cost many man-years of user time.

              1. 12

                Users are certainly not guilty of anything wrong. If they’ve done anything, it’s learn that the software we give them is outrageously fragile, and that once they get it into a state where they can use it, they should never touch it again or something will break and they’ll be unable to use it. Except now even that’s not enough, and systems break themselves without user interaction, so the only solution is “fuck you, we developers got ours”, I guess?

                1. 3

                  You said it better than I did! Thanks!

            2. 2

              hyperboot is a library that aims to provide versioning, and user control over those versions, for single-page apps. Doing the same for server-rendered applications is potentially much harder (and more expensive).
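
              I haven’t dug into hyperboot’s internals, so this isn’t its actual API, but the general shape of “versioned single-page apps with user control” is roughly: publish every bundle under its version, remember which version the user pinned, and only move them forward when they choose to. A toy sketch of that resolution step, with made-up version numbers and URLs:

              ```python
              # Toy sketch of user-controlled app versioning (NOT hyperboot's API).
              # The manifest is published by the developer; `pinned` is whatever
              # version the user chose to stay on (cookie, localStorage, config...).
              MANIFEST = {
                  "latest": "1.3.0",
                  "versions": {
                      "1.2.0": "https://example.com/app/1.2.0/bundle.js",
                      "1.3.0": "https://example.com/app/1.3.0/bundle.js",
                  },
              }

              def resolve_bundle(manifest, pinned=None):
                  """Serve the user's pinned version if it still exists, else the latest."""
                  versions = manifest["versions"]
                  if pinned in versions:
                      return versions[pinned]
                  return versions[manifest["latest"]]

              print(resolve_bundle(MANIFEST, "1.2.0"))  # returning user stays on the version they trust
              print(resolve_bundle(MANIFEST))           # new users get the latest
              ```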

            3. 4

              Personally, I agree with Google’s approach. I’m not saying they’ve never made mistakes, or even that I like the direction Chrome is going (I usually use Firefox, specifically because it’s more resource- and battery-friendly), but most users simply will not keep their software up to date unless the program does it automatically. And outdated programs are much more dangerous in the long run than the occasional bug pushed out in an update. I think offering different update tracks is probably a happy medium: letting users opt in to slightly delayed updates, or set their software to check less frequently (a rough sketch of that kind of policy is below). VirtualBox comes to mind in that regard.

              I think that the author’s frustrations are understandable from a user perspective, but if we’re working to promote a safer and more secure internet, I think we can all understand the need to deal with the occasional bug in order to enjoy a more up-to-date user base.
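
              The “delayed track” idea is also cheap to express as policy. A minimal sketch with made-up channel names and delays (the “check less frequently” knob would just be a separate check-interval setting):

              ```python
              # Hedged sketch of a per-channel update policy: not any vendor's real
              # implementation, just the "slightly delayed stable track" idea.
              from datetime import datetime, timedelta, timezone

              CHANNEL_DELAY = {
                  "beta": timedelta(days=0),    # beta users get releases immediately
                  "stable": timedelta(days=7),  # stable users get them a week later
              }

              def should_offer(release_date, channel, now=None):
                  """Offer a release to a channel only once that channel's delay has elapsed."""
                  now = now or datetime.now(timezone.utc)
                  return now - release_date >= CHANNEL_DELAY.get(channel, timedelta(0))

              release = datetime(2015, 7, 20, tzinfo=timezone.utc)
              check = datetime(2015, 7, 22, tzinfo=timezone.utc)
              print(should_offer(release, "beta", now=check))    # True: beta sees it right away
              print(should_offer(release, "stable", now=check))  # False: stable waits out the delay
              ```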

              1. 2

                I also stopped using Chrome as my daily browser a while back, but not because of the auto-updating. I found it pretty painless and didn’t run into the same issues he did.

                That said, Chrome was starting to feel sluggish and bloated, taking up more and more memory even with no extensions. Right now I use Chrome for debugging client-side stuff; otherwise I use Firefox. The only extension I use in Firefox is uBlock, and it’s quick, stable, and an overall good user experience.

                1. 7

                  “Right now I use Chrome for debugging client-side stuff”

                  Can you point to anything in particular that you’re missing from Firefox’s tools? We’ve put a ton of effort into them over the last year, and it helps to know what gaps are sending people back to Chrome’s tools.

                  1. 1

                    The debugging tools work well enough for me, the big improvement I’ve noticed recently is that scrolling on Mac got much better in the last update. Still a long way from Safari but it’s not as jarring as it used to be.

                    1. 1

                      I’m going to try using them tomorrow while I work and I’ll let you know any particulars I can come up with. I can’t think of any at the moment.

                  2. 1

                    I think it’s truly ridiculous that Google Chrome apparently has internal backdoors such that, even after you follow the official instructions and turn off autoupdates for good, the software still somehow gets updated later anyway.

                    I’ve encountered the same problem (although with a slightly later release of Chrome); I, too, tried everything to stop it from autoupdating itself; and I, too, can confirm that after several months of success, it suddenly started updating itself all over again.