I rail against this frequently.
In the interest of fostering discussion^W^Wcomplaining with an audience (but maybe some discussion will result!), here are a few trends, not mentioned in the article, that are profoundly user-unfriendly and which I would very much like to see die:
Interface mutability. My partner uses an iPhone. She was not happy to start using her iPhone, because she had to take time to learn how to use it that she could have spent doing literally anything else, most of which she would have found more productive. (The jump from a landline touch-tone phone to a clamshell cell phone is like climbing a curb compared to the Everest of figuring out a smartphone interface.) But okay, now she’s figured out how to use it, all is well, right? Well, obviously not, because I’m here complaining about it. Some years later, Apple pushed iOS 7 to her phone, and it rearranged, redesigned and shuffled everything. Now she has to relearn how to use her phone to no discernible benefit, because Apple decided the previous interface was insufficiently shiny and/or confusing. What the fuck. How many millions or billions of dollars of damage did Apple do to the world’s economy with that change? Because I got to experience secondhand at least a few hours of wasted time and frustration due to it.
And of course, it doesn’t end there: she recently had to update her laptop (from OS X Lion to Sierra) because, as a medical professional, operating systems without security support obviously won’t fly. And so now she has to relearn how to use her computer. While the learning curve for new Mac OS versions is shallower than the iOS <7→7 curve, Sierra performs terribly on her (nominally supported) laptop. I’m hoping an upgrade to an SSD will resolve that for at least a few more years, but if not (or eventually regardless), she’ll have to buy a new laptop not because she needs new features or the old one is wearing out but essentially because Apple mandated it. Great.
While my computer interface (bash, wmii/i3, vim) has been essentially stable for the better part of a decade, the barriers to entry to such an interface are formidable indeed, and the capabilities aren’t sufficient for everyone; my partner, for instance, needs to run proprietary medical record programs, which provide only Windows and Mac OS versions.
(I don’t mean to single Apple out here, by the way; it’s just the example I’ve most recently had significant exposure to. Nearly every interface vendor is guilty. I go to some lengths to insulate myself from popular computing for precisely these sorts of reasons.)
Inconsistency. The article touches on this in the realm of the web (“is this a button? A link? A static label?”), but it’s a cancer that has spread to native interfaces as well. Is a given element a button? A link? A static label? Who the hell knows? I can’t figure it out without clicking on it, and who knows what happens when I do that. The webapp-ification of native interfaces is partly to blame here (way back when the web was actually a web of static pages linked by hypertext, there was a good reason to present web content differently from application interfaces; now, unfortunately, those conventions have leaked between environments), but I seem to recall that once upon a time major interface vendors published HIGs that were either enforced or at least broadly adhered to (and offenders like Winamp were rare and the butt of frequent jokes). That seems to have fallen by the wayside. Google and Apple seem to be trying to bring it back in their mobile interfaces, but they’re doing a bad job of enforcement (even though they’ve given themselves the technical ability to do so!) and their mobile HIGs are bad anyway.
System fragility. You know how many people are terrified of changing their system’s settings? We taught them to feel this way by, in the 90s, presenting them with a multitude of knobs that could destroy their system, requiring them to shell out money to a probably-insufferable technician who would almost certainly make fun of them behind their backs to unfuck things and then quite possibly shell out more money or time to recreate work they lost. Well now we’re well into the 2010s, we’ve learned our lesson, and systems are resilient, present informative warnings at an appropriate frequency and generally enable fearless user operation! Lol, no, of course not. Systems are less likely to fuck themselves now, but regular users are still justifiably afraid of them because they’re still unjustifiably likely to present dangerous options with only jargon to warn you off.
Now, back in the 90s, some of that fragility was just because consumer-level computers weren’t a mature product yet. They had to expose some of the rougher edges of the underlying hardware interfaces, because there wasn’t enough headroom to paper over them effectively. (Not all of them, of course. In no universe should it take me two clicks to erase a disk.) But there’s really no remaining excuse now.
On HIGs: While Macs have had good consistency even from third parties for a long time, on Windows it’s been a total mess, with nothing looking or feeling consistent. The last push for HIG consistency was with Windows 95; UWP might improve this, though. At least the X11 desktops are consistent with themselves. (I try to make apps that are good citizens on Windows.)
On browsers: Please take me back to the days of static pages, when browsers were document viewers, not app runtimes.
Adherence to the macOS HIG has eroded noticeably in recent years, even in first-party apps. I agree it’s nowhere near as much of a mess as Windows, but I don’t use it as a point of comparison anymore.
Apple seems to be getting less interested in pushing (or even enabling) third parties to conform to any kind of consistent HIG as well. One of the traditional strengths of the Mac platform for developers was its thorough and consistent documentation, which explained what everything did, why it did it, how pieces fit together, and generally what the Right Way To Do Things was. Now the documentation is all over the map, and my recent experience with it has not been very good. Large parts look like basically auto-generated Doxygen-style stuff giving you bare-bones class documentation and not much else.
Without any change, there is no progress. I am also amazed at some people I have encountered who actually seemed to simply refuse to learn anything new.
That said, change for change’s sake (novelty chasing) is indeed a serious problem in the industry. I wholeheartedly agree.
Maybe a different type of progress?
I think it was The Ultimate C64 Talk where the speaker made the point that hardware moves so fast nowadays, people don’t explore its limits (paraphrased from memory).
There’s often a kind of CADT-style impatience among people, which leads to exasperated comments from my fiancé like “They made Spotify shit again”. I don’t use Spotify, so I’m not sure, but I’ve understood that after the shock of change (“now it’s shit!”) there’s often a meh (“it didn’t get better or worse, just different.”).
So how can the end users know if a change was in any way objectively better when things move faster (and break) than we can explore the limits?
When I have a useful thing, I don’t want it to progress; I want it to keep being the same useful thing.
So you still have a flip phone and a horse?
I find comments like this one frustrating. Some people bloody well do still have flip phones and horses, because they want to and that’s actually just fine. Those things don’t work less well than they used to, and if the owner is happy with it and it isn’t dangerous, I can’t imagine why they should change to something else.
The difference with software updates in newer products like iPhones is that you often need to take the update, or your necessarily connected device will be rife with security holes. But the major updates often screw around with where the buttons are, or how things work. There isn’t really a (safe) choice to just keep the horse or the old flip phone, because in a very real sense they don’t build things that way anymore.
Until a few months ago I had a near-invincible candybar phone. Why not a smart phone?
Sometimes old tech that correctly solves the problem domain simply and reliably is preferable to some damn fool new fancy solution. That’s why the AK family and the Mauser action have been around as long as they have, why Usenet and IRC are still in widespread use after over 20 years, and so forth.
Pull someone from 1967 to today (that’s a leap of 50 years). They know how to drive, they can still drive a 2017 car without much problem since that interface hasn’t changed much. The car radio however? That will take some time (along with the climate controls).
Another thing—from time to time I’ll find some neat feature on Google Maps. At one time, you could select multiple towns and it would highlight each town in light red. I used that feature. Then they removed it. Then they added it back, but you can only do one town at a time. It’s gotten to the point where I don’t want to learn new features for fear that they’ll be arbitrarily removed because their constant A/B testing showed that not many people used it, or were confused by it, or they just felt like they didn’t want to support it any more.