1. 48

  2. 43

    This is unfair to 90s Apple, who spent a lot of time in that “wilderness” period trying to push UI in all sorts of new ways. Dylan. OpenDoc parts. CyberDog. MPW worksheets were the ancestors of the notebook interfaces that are lauded here (but why can I not get Commando in a modern shell?). The Newton. There is a long list of inventive approaches. It’s not that they didn’t try. It’s not that they thought users were stupid. It’s that none of it resonated with humans like the Mac-style WIMP did.

    It is unfair to application makers, who opened up vistas with their “demo-tier” efforts. PageMaker and QuarkXpress revolutionised the worlds of print and journalism. Photoshop birthed an industry, as did Director. Avid, Final Cut Pro, AutoCAD, hell, PowerPoint and Excel. Mathematica. HyperCard. iTunes. The web browser. All of them “fragile, inflexible”. None of them composable or pipable. All of them opinionated. But powerful tools all the same. Some literally changed the world.

    It is unfair to us, the users. It’s not Apple who treat us as stupid, it is the post, acting as though if only we weren’t dumb we would reject these interfaces in favour of something “better”. But better at what? The WIMP did just enough to be a platform for other things and got out of the way. If we ended up with JavaScript when we could have had Lisp, well, there are reasons that pattern repeats.

    And finally, it’s just outdated. No, we don’t really have “the document” anymore. We barely have WIMP these days. The primary computing paradigm doesn’t have windows, a mouse or pointers. For better or worse we have touch interfaces. And there are a whole different set of rants to be written about those. But we don’t find solutions to their problems by “going back to the Alto”.

    1. 11

      I criticized the Macintosh project specifically – for failing to introduce new ideas & for failing to reproduce interesting and useful parts of the thing they were copying. I do not extend this criticism to the Newton team, or to the NeXT project (which involved substantially the same set of people as the Macintosh did).

      Lots of interesting UI ideas appeared in the late 70s and early 80s. It’s a shame to me that the only one that has heavily influenced modern design is the one that’s least novel & most obviously flawed.

      The worst parts of the Macintosh interpretation of Alto ideas infest the web & touch interfaces.

      The problem is that the interface doesn’t “get out of the way” unless you submit to someone else’s methods. The basic flexibility of the computer as a general purpose machine is broken by the application model, in which each application is even more opinionated than any hand tool can be. Users don’t, as a rule, drop into a flow state from comfort with their applications: they struggle with forcing applications (whose behavior they aren’t allowed to directly change) to produce some semblance of the desired behavior, and then blame themselves for ‘not being good with computers’ when they are defeated by the developer’s lack of foresight.

      Ultimately, market success is not a good proxy for design quality. There are too many confounding factors. Sufficiently good marketing will allow a bad design to survive indefinitely (and sometimes even to succeed) while bad marketing will sink a good design; injections of cash from investors will allow even unprofitable companies to survive indefinitely, and frequent cash injections by people whose preferences have little to do with the market or the product are extremely common in this industry; a bad design by a big or successful company can be forced through & held afloat by other, more profitable products. These phenomena (familiar from recent situations with Twitter, Google, & Facebook) aren’t new: the Amiga was a victim of these circumstances too, famously.

      1. 4

        Having programmed the Amiga in the 90s [1], I can say there isn’t that much of a difference between the Amiga and Mac OS. They both had menus across the top of the screen [2] and they both had controls like scroll bars and buttons [3]. They both had a desktop metaphor. Had the Amiga survived the mid-90s, you might well have ranted against its GUI as much as against the Mac’s.

        Why did the Mac survive and the Amiga die? Many reasons. One, Commodore couldn’t sell ice to Africans. Two, while the hardware was impressive for 1985, the GUI was too tied to it, and thus by the mid-90s the hardware wasn’t special and in some ways lagged behind other systems. And its “killer app” was too niche to make it profitable (video production). The Mac was able to evolve with the times (more color, higher resolution) and, coupled with its “killer apps” (desktop publishing, graphics imaging), it was able to survive (getting Jobs back at Apple certainly helped [4]).

        Your article might have been better with examples of a better UI that you think should be possible. As it stands, the article kind of reads as “it’s all crap! Start over!” but without any guidance. We can’t read your mind.

        [1] It was a joy to program for. From the 68000 to the programmable hardware to the OS it was a complete joy to work with.

        [2] The Amiga menu would only show up when you pressed the right mouse button, and you could include images as part of the menu, but those are the only real differences between the two.

        [3] The Amiga had three primitive “gadget” types—the boolean gadget, the proportional gadget and the text gadget—out of which all other controls (like a scroll bar) could be made. But that’s the issue—all you had were the “atoms”—the programmer was responsible for building up a scroll bar or a drop-down combo box.

        [4] And with him now gone, Apple seems to have lost its way again.

        1. 2

          When I compare the Amiga to the Macintosh, I’m generally comparing the Amiga 1000 to the first-generation Macintosh & the Macintosh Plus. The two machines existed at the same time, but the Amiga had high-resolution color graphics while the Macintosh had monochrome (not even greyscale) & the Amiga had double the horsepower at half the price. It’s not so surprising that Amigas weren’t selling great in the early 90s, after decades of mismanagement; it’s more surprising that the Amiga 1000 didn’t blow the Macintosh out of the water & totally murder the entire Apple brand in 1985.

          Your article might have been better with examples of a better UI that you think should be possible. As it stands, the article kind of reads as “it’s all crap! Start over!” but without any guidance. We can’t read your mind.

          I didn’t really expect such a big, general audience for this. I just pieced together bits of text I had already posted on Mastodon & SSB in order to create a companion piece for all the other stuff I’d written on the subject. Some of those cover historical systems that I think are underappreciated (though I’m planning to write a lot more about that), while others cover rules & principles that I think would produce better interfaces if followed.

          I literally put this together, showed it to the folks on mastodon who have been arguing with me about UI design for two years already, and went to bed. When I woke up it had four thousand views. So, I’ve been spending the day trying to reintroduce context for the folks who came to it without reading the previous material.

        2. 1

          I didn’t mention market success, which is flawed as a metric just as you describe. I said that of all the attempts at interesting UIs (which by no means stopped in the early 80s), only the Macintosh-esque WIMP (and later the iOS-esque touch interfaces) resonated with users. That is, people were drawn to them because they enabled them to get done the things they wanted to do with computers. Some businesses transformed that into market success, but that was arguably a by-product.

          People, it turns out, are willing to “submit to someone else’s methods” in exchange for getting their jobs done. You seriously beg the question when you argue they don’t drop into a flow state — having watched professionals use Excel, Quark, Photoshop, and 3d Studio Max, I’d argue that’s precisely what they do.

          Are users giving up some ideological, hypothetical, flexibility by submitting to the tyranny of the application model? Possibly. Should they care? Until there’s some compelling example of what new thing they could achieve by resisting the application model, no.

          1. 6

            For every user I’ve seen drop into a flow state when using Excel, I’ve seen ten equally experienced users spend the entire time frustrated with it. And for every time I’ve seen a user drop into a flow state using Excel, I’ve seen ten instances of users spending hours trying to re-articulate a problem into a convenient tabular model.

            As programmers, we all know how nice it is to be able to completely rebuild our environment to suit the problem we’re trying to solve. This is most of what programmers do: make abstractions that narrow the gap between our mental model of a problem and the underlying infrastructure. And, when we try to do the opposite & solve a problem with a set of mental tools that don’t fit, we end up tired and miserable, banging on a buggy and unmaintainable piece of crap.

            There isn’t really a gulf between non-technical user & programmer – there’s a continuum. Most people are perfectly capable of stepping further into the space between user & programmer than they are currently allowed to by the walled-garden structure of applications. For instance, inter-process piping is a lot easier to conceptualize than the kinds of hacks that self-identified non-programmers regularly invent to solve common problems using spreadsheets; inter-process message-passing by gesture (like with the Alto) is conceptually similar to how musicians chain effects pedals, & yet the only place it’s commonly used is in music programs.
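
            To make the piping point concrete, here’s a minimal sketch in Python. The choice of sort and uniq is arbitrary (any pair of stdin/stdout-oriented programs on a Unix-ish system would do); the point is just that a pipe is nothing more than wiring one program’s output into another’s input:

            ```python
            # Minimal sketch of inter-process piping: the shell's `sort | uniq`,
            # spelled out by hand. Nothing here is specific to these two programs.
            import subprocess

            sort = subprocess.Popen(["sort"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
            uniq = subprocess.Popen(["uniq"], stdin=sort.stdout, stdout=subprocess.PIPE, text=True)

            sort.stdin.write("pear\napple\napple\nbanana\n")
            sort.stdin.close()      # end of input, so sort can flush its output
            sort.stdout.close()     # hand the read end of the pipe over to uniq

            print(uniq.communicate()[0])   # apple, banana, pear (duplicate collapsed)
            ```

            The whole mental model is data flowing from one box into the next, which is arguably simpler than most of the spreadsheet workarounds mentioned above.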

        3. 8

          There’s a longer piece waiting to be written about the disappearance of WIMP. It’s a pretty ground-shaking change in how we use computers.

        4. 14

          Lots of lukewarm to negative feedback here. I actually rather enjoyed the article and think it points out a number of truths.

          You can quibble about there no longer being a W or M in WIMP, but most modern touch interfaces are just WIMP interfaces modulo windows and mouse pointers.

          At the birth of any industry there is an openness and lack of structure that breeds a kind of innovation that is INCREDIBLY hard to attain once the “archetypes” have been set in stone.

          1. 9

            So, uh, what’s better?

            1. 15
              • composable GUIs (like the alto & modern smalltalk environments)
              • notebook interfaces (like jupyter & mathematica)
              • literate programming interfaces (like swyft / the canon cat)
              • plan9
              • menuet
              • NeWS
              • language-based systems like interim
              • modern unix shells like zsh
              • borderline-obscure stuff like zigzag, sometimes

              And, of course, I’ve been working on a prototype of the system I pitched last year.

              The thing about interfaces is, if you put what you’re already accustomed to out of your mind, you start to think of alternatives pretty quickly – and many of them are better for certain tasks than what you already use.

              For my next book I’m planning to write a survey of alternatives to the WIMP paradigm that can be easily run or emulated by readers. Unfortunately, I’m going to need to do plenty of research if I’m to seriously discuss the internals of these systems, since a lot of them are language-based systems built around languages I don’t know very well (like lisp or forth) or at all (like holy c or oberon).

              1. 5

                I’m interested in your research. Is there any place where I can keep track of it?

                1. 4

                  I’ve barely started researching for the book in question, so I’m not sure to what extent chapters & other content will be made available before it’s finished.

                  The last book was mostly compiled from stuff I had already published on Medium. If you follow me there, you’ll probably get at least some of the material intended for the next one – maybe even rough drafts for chapters. Also, a lot of the chapters from the last book were inspired by or adapted from discussions I’ve had on mastodon or SSB, & this will probably be true of the next one: if you follow me on mastodon, no doubt you’ll get a preview of some of the ideas I’m playing with.

                  If there’s enough interest, I might make a point of posting about the systems I’m researching on a more regular basis. Those posts will probably end up on Medium too.

                  1. 2

                    Also, I’m going to be posting resources here as I find them during my research.

                    1. 2

                      Thank you for this. I’m going to follow your work on it.

                2. 7

                  I’d assume the biggest problem is overlapping windows. I’ve been using tiling window managers since 2012 and I would not go back. If you look at all the newer mobile operating systems (iOS, Android, that failed Windows 8/10 UI thing), they’re all either single app at a time or, at most, split screen.

                  I guess a second thing is steering people away from mouse dependence. Hotkeys should be easily discoverable and encouraged. A higher learning curve at first can mean faster operation later on. Great example: AutoZone. Watch staff look up a part today. They do a lot of clicking and switching back and forth. The old setup was all terminal based and had the same searching/information. I think the new GUI still has a lot of keyboard shortcuts, but very few people I’ve watched use them.

                  1. 5

                    Overlapping windows are pretty pointless when they can’t be made to work together in interesting ways. Drag & drop between unrelated applications is the minimum interoperability to justify even supporting overlapping windows in my eyes (and support for that is pretty rare), but I’d be all about overlapping windows if freeform composition with gestures was a standard part of the toolkit. Even then, tiling is preferable on large displays once we’ve settled on an arrangement & interconnections.

                    Support for tiling and pseudo-tiling (and quick-switching mechanisms) is something I don’t have a problem with in modern desktops, though. Even Microsoft and Apple have been pretty quick to jump on that bandwagon.

                    1. 5

                      Tiling windows seems like a funny point since we’re complaining about UIs treating users as stupid. The first version of Windows was tiling only because users were too stupid to handle overlap and might lose track of a hidden window, but eventually it was decided that users could be trusted with the responsibility of arranging things on their own. Round the circle we go.

                      1. 1

                        The first version of Windows was tiling not out of contempt for users, but to avoid a lawsuit from Apple (who did get overlapping windows working because they thought the Alto had it when it didn’t really). Also, a tiling window system is easier to write than an overlapping one.

                        1. 1

                          Alas, I’m not sure of the reference, but they apparently had the feature, tested it, users were confused, and it was pulled.

                      2. 4

                        I think we’re slowly moving away from the mouse-oriented approach, for better or worse. I’d personally wish for a keyboard-oriented desktop UI, but judging by how much Microsoft and Apple are striving to unite all their systems software-wise (e.g. Windows Phone and Xbox One both running Windows variants, or audioOS etc. being iOS-based), we might expect a move towards touchscreen-oriented UI on desktops instead. (Although I guess that goes back as far as GNOME 3.) On the other hand, there exists a minority of mouse-oriented UI advocates, such as Rob Pike and his followers. He argues that the mouse is more intuitive and faster, and that the problem lies in bad UI design instead.

                        1. 6

                          On the other hand, there exists a minority of mouse-oriented UI advocates, such as Rob Pike and his followers. He argues that the mouse is more intuitive and faster, and that the problem lies in bad UI design instead.

                          i still think that acme’s mouse interface for common editing operations is better than keyboard shortcuts (see http://acme.cat-v.org/mouse). the way the mouse is used in most other systems is bad though. the windows way is nearly unusable with the popup-menu thing, and X11 is only saved by the primary selection insert with the middle mouse button ;)

                      3. 4

                        Presumably some form of programming-is-first-class system like the Alto, where everything you can click is also explorable (“view source” is built in) and extendable via Smalltalk. On the one hand, I’m a bit sceptical and think not many users will really make use of this; on the other hand, if you see how far some regular (i.e. non-programmer) users take, say, Excel and VBA scripting, having this programmability available pervasively by default in every application would definitely empower users much more than “closed” systems like the original Mac do.

                        I have no idea how many people use AppleScript, which ostensibly brings pervasive programmability to the Mac. It wasn’t part of the original Mac OS and is about programming scripts “on the outside” onto or “against” existing applications rather than full-fledged inspection and modification of internals “inside” those same applications.

                        1. 3

                          The only “modern” OS I know that makes use of hypertext self-documentation is… TempleOS. It also blurs the line between “using” and “programming”, like the Alto and LispMs. It’s not entirely user-friendly, but I guess it fits the bill.

                        2. 8

                          Ask not the polemic for what is better; it is merely the shallow well in which the author steeps their discontent.

                        3. 4

                          “The applications cannot be combined together or used in tandem, because the user wouldn’t be able to conceptualize the idea of two things working together anyhow.”

                          Wasn’t the ’84 Macintosh notable in part because it allowed pictures from the painting program to be directly copied and pasted into the word processor program? I think that was pretty revolutionary at the time.

                          1. 4

                            The Alto provided a gesture-based system for combining independently developed, currently running applications, popping up a configuration panel for handling message passing between them & setting up triggers for those messages. From that panel, the source code of the applications in question could be edited.

                          2. 5

                            So, the author just shits on basically what helped computers and other computing devices become usable even by kids and the elderly, would like to “pretend it never happened”, simplifies all the UI experiments as “that Alto demo” clones (or as WIMP), but provides absolutely zero suggestions for anything that could be better.

                            I actually enjoyed a previous article more, on how most, if not all, of our OSes are still very strongly tied to old office concepts (desktop, folders, even terminals/consoles). That was a constructive shift in how a designer could see “modern” UIs and perhaps be inspired to try something new.

                            This one was just negative, out to diss Apple, Jobs and capitalism (???).

                            1. 3

                              I was almost expecting the author to add that ssh is definitely better than remote desktop interfaces.

                              1. 3

                                Author here. When I am using the command line anyhow, I certainly prefer ssh to rdesktop because I get less lag & can fit the content to the actual size & shape of my display. I only rarely need to control an X application remotely, thank goodness – X forwarding is even laggier over a public network than rdesktop.

                                I’m not sure what any of that has to do with the subject at hand – the importance of focusing on the space of possible user interfaces that might usefully apply to a particular problem, rather than getting mired in path-dependence and opting for awkward use of familiar systems – though.

                              2. 3

                                “Don’t make me think” is almost exactly what I want when using a tool, whether it’s a guitar, a synth, a car or a computer. Show me a tool that can make me think less about it and more about what I want to achieve, and I’ll give it a try.

                                1. 4

                                  The user should only need to think about the things they came to the interface to think about. In the case of a musician, they’re focusing on melody, or improvisation, or some higher level structure & don’t want to be distracted by the keys sticking.

                                  A musician does years of rote practice to get enough experience with an inherently awkward interface to be able to put that interface out of their mind. That’s exactly the kind of thing we don’t want to require of computer users, since computers are general purpose machines: they can do anything & look like anything, so being awkward enough to require years of training and limited enough to only do a handful of things is stupid.

                                  In practice, “don’t make me think” doesn’t actually allow users to get into the groove in computer work. Instead, the developer’s imagined version of the ideal way through the user’s task is neither natural nor obvious to the user, and too limited to work for the entire domain of the user’s tool use. The user therefore needs to think like a hacker by default and create a set of awkward workarounds for performing important tasks in applications that aren’t intended to perform them (or to duplicate functionality that’s buried in an even more awkward set of metaphorical leaps & is therefore totally undiscoverable).

                                  Formal training & instruction manuals can solve the problem of discoverability (but, of course, requiring or even providing instruction manuals violates “don’t make me think”). Nevertheless, the problem of flexibility remains.

                                  1. 1

                                    I think the fact that a lot of paradigms in OSX haven’t changed since I first touched the Aqua beta in 2000 has given me 18 years of not really worrying too much about where things have moved. Unlike Win10, which, in spite of my using Windows since 3.1, I find to be a mixture of searching for a setting and searching for where that setting used to be, over and over, usually without the benefit of a search bar that works (i.e. one that doesn’t search the internet instead of my computer).

                                    I think flexibility in interfaces shows up in power use, but I’m not looking to recreate my computing environment generally. If I’m coding, it’s for very specific purposes, not for composing things in a graphical way. Nevertheless, the ability of windowed computing environments to contain other windowed computing environments means I can run Linux, Windows, etc. inside my Mac and not really suffer too badly from lack of flexibility.

                                    I don’t mean to say that I hope all innovation stops with the WIMP interfaces we have, but I don’t think that being locked into a system that has worked consistently for almost two decades is a problem for me. Jumping into the latest KDE, GNOME or whatever is always the same story as Windows above: some number of things are out of place and broken when it comes to muscle memory.

                                    As much as I love new controllers for music, I usually gravitate towards ones that mimic the controls I’m used to. Take, for instance, the LinnStrument, which is ‘tuned’ like a bass guitar and really easy to understand intuitively. Besides the command line, I don’t have a computer user interface that I’ve used for over a decade, other than, oddly, the UI this post is angry at. I like being comfortable w/ what I know being where I left it.

                                    1. 1

                                      I’m not in favor of systems being experimentally changed by third parties & imposed upon users. User interface design for personal computing (as opposed to institutional use) should be under the control of single end users. This means keeping interfaces the same when they work properly, but providing the tools necessary to fix poor problem-fit when desirable (and making those tools accessible & discoverable).

                                      Composition is an example of a mechanism that can be added to a WIMP system without changing behaviors users are familiar with (although by definition it requires a full rewrite of the applications & underlying GUI system) while solving a number of common problems that are normally solved without the aid of automation through repetitive & error-prone work. Users frequently use sequences of programs in tandem to perform stages of transformation when one program isn’t capable of performing a transformation but another is – using photoshop to crop an image before inserting it into a word document, for instance. Composition makes it possible to transclude the working copy of something in one program into another program without the knowledge or permission of the authors of the original program – for instance, linking the image editor to the embedded image such that modifications appear immediately. Because it’s fairly concrete & not too dissimilar to existing mechanisms, it’s worth using as an example: it’s low-hanging fruit.
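
                                      To make that concrete, here’s a deliberately toy sketch in Python. None of these classes or method names correspond to a real toolkit; they’re invented for illustration. The point is only that the embedding document holds a link to the editor’s live working copy, so an edit made in one tool shows up in the document immediately, with no manual export-and-reinsert step:

                                      ```python
                                      # Hypothetical sketch of composition-by-transclusion. The editor owns a
                                      # working copy; any number of documents can transclude it and are redrawn
                                      # whenever the working copy changes.

                                      class ImageEditor:
                                          def __init__(self, image):
                                              self.image = image
                                              self.embedders = []          # views transcluding the working copy

                                          def transclude_into(self, view):
                                              self.embedders.append(view)
                                              view.render(self.image)      # initial draw

                                          def apply_edit(self, edit):
                                              self.image = edit(self.image)
                                              for view in self.embedders:  # push the change to every embedder
                                                  view.render(self.image)

                                      class DocumentView:
                                          def __init__(self, name):
                                              self.name = name

                                          def render(self, image):
                                              print(f"{self.name}: now showing {image!r}")

                                      editor = ImageEditor("photo-v1")
                                      report = DocumentView("quarterly report")
                                      editor.transclude_into(report)                     # link the working copy, don't copy it
                                      editor.apply_edit(lambda img: img + " (cropped)")  # the change appears in the report at once
                                      ```

                                      In a real system the render step would be the GUI redrawing the embedded region; the important part is that the link points at the live working copy rather than at an exported snapshot.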

                                      In situations where an improved UI is obvious and the domain has heavy intersection with developers, we actually do see novel UI designs integrated. Notebook interfaces are pretty common in systems intended for use by mathematicians, for instance. But, I think it’s important that everybody be able to scratch their own itches.

                                      There are a couple of interacting factors I’m criticizing, & it’s hard to cover them all in one essay while doing them justice.

                                      One factor is a developer and designer monoculture: few developers have enough UI ideas in their toolkits to easily imagine appropriate UI designs for particular problems, particularly when they are also trying to imagine the problems from an outsider perspective. As a result, they fall back on familiar patterns even when those patterns produce pathological results. This is the problem I have with WIMP – not that it is flawed (it is, but so are all the alternatives) but that because so few designers are aware of or can imagine alternatives to it, its flaws become totally inescapable, even when escaping them should be trivial.

                                      Another problem is an artificial & politically-enforced developer-user divide: users are not expected to modify the applications they run to their own liking, and the tools we use don’t make it straightforward for non-technical users to gradually ease into modifying their own applications to their liking. Instead, users are expected to “become developers” if they want to modify the applications they use, read lots of documentation, maybe take a class, and eventually work up to being able to control simple graphical primitives in awkward ways. Then they are expected to spend lots of time in deep study of the existing implementation, propose & test a change, and offer it up to the maintainers. If the maintainers don’t like the change, the user is going to have a hard time continuing to use it, because software is expected to scale & therefore even applications run on personal computers are mass-produced, with periodic new versions not designed to support heavy user-side modifications.

                                      The developer-user divide has economic causes & economic effects, but the clearest argument that it can be eliminated is that we, as developers, live our entire time as users mostly on the other side of it: we know how to modify our own applications, and we do it; we maintain our own versions when upstream doesn’t like them, and we merge changes as we like. Our ability to live in a computing environment that suits us comes out of having already passed the handful of trials that gave us the knowledge to do this: we learned how to use autotools, and we learned c and c++, and we learned the GTK and QT and Tk APIs enough to be dangerous. We still sometimes decide that making a change to an application is more trouble than it’s worth. The difference between us and a non-technical end user is only that fixing something, for them, is always too much trouble because they haven’t cleared the prerequisites. Lowering unnecessary barriers helps us just as much as it helps them.

                                  2. 4

                                    A musician friend of mine had a good argument for the Kaossilator; there’s no real reason to use a centuries-old UI for synthesizing music.

                                    Maybe you can’t construct a grand piano with that UI, but it is easier to make something Just Sound Good with a Kaossilator than with a traditional keyboard synth.

                                    1. 1

                                      Oddly, the theremin is about a century old and in some ways (if you stretch it) is more like a Kaossilator (x/y for pitch/volume, but done as ‘r’, distance from the antennas, instead). But yeah, I certainly don’t want innovation to end; I just really enjoy the comfort of things I know working the way they always have, then customizing as needed.

                                    2. 2

                                      But the problem is training. I don’t notice the editor I use now—it’s completely invisible to me for the most part. But when I was first learning it? Oh I hated it. It worked differently from the editor I was used to [1] and it took a few years for it to become invisible to me (like the previous editor I used). And that’s the point—if it’s easy to use from the outset, it’s not powerful enough to do everything I want.

                                      A car is conceptually easy to operate, but we still have to be trained to drive. A program is a tool—you have to learn how to use it before it becomes second nature.

                                      [1] Which only ran under MS-DOS. It’s limited to 8.3 filenames. It doesn’t understand directories. And lines are limited to 255 characters. It’s still my favorite editor. Sigh.

                                    3. 4

                                      Not listed here: http://www.lord-enki.net/medium-backup/

                                      And that’s too bad, because while @enkiv2 has some writing worth reading, the cognitive dissonance of reading it on a site like Medium really sullies the experience for me.

                                      1. 4
                                        1. 1

                                          medium

                                          I also don’t get what’s so all-fired bad about Medium. Sure, the ecosystem would be healthier if everyone self-hosted their own blogs, but:

                                          A) Not everyone wants to. B) Medium has become successful because they’ve created an ecosystem that actually adds a lot of value for end users. I still host my own blog, but I really rather enjoy Medium and read a lot of good content there. Clearly I should be tarred and feathered :)

                                          1. 10

                                            The ratio of words to bytes for a Medium post is kinda bad.

                                            1. 2

                                              That’s a valid criticism. They’re particularly style-heavy.

                                              Honestly asking, because I’m hardly a web tech expert: does that actually impact the average user experience negatively in any meaningful way?

                                              1. 6

                                                Well, whenever I go to visit Medium without a proxy, like an average user, I’m greeted by a pop-up saying “hello visitor we’re tracking. You seem to come here a lot. Wouldn’t you like to sign in?” That’s kind of negative.

                                                Images are lazy-loaded. I’m not sure if that’s a blessing or a curse, given the propensity to litter articles with meaningless memes, but it means that if I open a link in a new tab, then decide to read it later without a net connection, there are no graphs, etc. unless I manually page through the whole thing to make sure it loads. Do average users create reading lists for airplanes and subways?

                                                So there are two more random gripes.

                                                1. 3

                                                  I appreciate your gripes as a Medium user (although, as a registered user, I never see any pop-ups at all).

                                                  The reason I still use Medium is that it’s the only major service that adopts the idea of users making direct micropayments to each other as a monetization model for long-form text & an alternative to advertisement. Eliminating the ad-tech ecosystem & its pressures on content is something I care about even more than I care about avoiding unnecessary bloat.

                                                  I feel like the UX team at Medium hasn’t gotten the memo that they aren’t doing ads & therefore certain things (like promoting already-popular content with algorithmically sorted feeds) simply don’t make as much sense. They ought to act more like Netflix or Spotify & throw people into the deep end of available content while making sure they see just enough of the stuff they’re already expected to like to keep them subscribed – and the easiest way to do this is to bring back the reverse-chronological feed of followed publications & users as the primary interface for registered users. All their revenue comes from paying users, and so trying to drive more traffic from those folks is actually losing them money (and clearly they’re not doing a great job of getting new users to register or non-paying users to subscribe, either).

                                                  1. 1

                                                    Thanks for the thorough explanation. My own aversion to Medium is only in part due to the bloat, the nagging, and the vast quantity of rubbish. Underlying that is my basic opposition to centralized publishing per se. There’s simply no need: we can all own a printing press nowadays, you don’t even need to be all that tech savvy.

                                                    I predict that micropayments will fail Medium just like they’ve failed everywhere else. If you want ad-free monetization, use Patreon or something. Or make it easy to buy your books. I would throw you some bones via either channel, but there’s no way this side of hell I’m giving Medium my email address.

                                                    1. 2

                                                      I’m investing a bit of effort into looking for alternatives, but most semi-decentralized systems seem to be either impression-based or heavily centrally integrated. (I also use patreon – though not for my essays – & I consider it just as centralized as Medium, on top of having the general problem that I would get paid for everything rather than having the payments be linked to fine-grained feedback about quality.) I considered implementing a transcopyright system, but I’d need too much buy-in to start one myself.

                                                      1. 2

                                                        I consider [Patreon] just as centralized as Medium

                                                        Also PayPal, Venmo, Square, VISA, and… uh, the Federal Reserve? I don’t object to centrally managed payment systems nearly as much as centrally managed publishing. I’m a little surprised that this distinction doesn’t seem sharper to you. Anyway, it’s a mere quibble. Keep up the good work!

                                            2. 6

                                              Medium is just a bad reading experience. There are popups asking for your email. There are position: fixed headers and footers and sidebars and buttons asking you to get the app. The text is way too big (which is a common issue; websites think they should increase the font size to make the text easier to read, which punishes people on systems where the default font size is their preferred font size. iOS also doesn’t let you easily decrease the font size like ctrl+minus on a desktop would.)

                                              I know there’s reader mode, but a) that shouldn’t be necessary, and b) I don’t trust reader modes a lot after way too many experiences of feeling that an article is a bit short or kind of empty and discovering that reader mode excluded all code blocks or certain pictures or certain paragraphs.

                                              1. 1

                                                See, this is exactly the kind of critique that amuses the crap out of me.

                                                Are you a web developer? Or do you at least do web work generally?

                                                Because your comments make me think you do.

                                                Paraphrasing:

                                                “The text is too big”

                                                REALLY? I’m partially blind, and I can’t tell you how many times I’ve been totally, utterly unable to use a site or other app/technology because the fonts are too darn small and I can’t enlarge them! So this one just DRIPS with irony for me :)

                                                That said though, given the above, I am incredibly unsympathetic to any site / layout that keeps you from adjusting the style of your web browsing experience to exactly the way you want it.

                                                Here’s where my ignorance shows: I’m not a web dev. I write UGLY UGLY web pages.

                                                Is it even possible to deliver the kind of experience most non-programmer end users want while still providing this kind of total customizability?

                                                1. 1

                                                  I’m mainly a C++ programmer, but have done a lot of web programming, some professionally and some not.

                                                  The ideal way to make sure the text is readable, in my opinion, is to stick with default font faces and text sizes for body text. On my personal website, all body text is the default size, which means it’s the user’s web browser’s responsibility to make sure the text is readable. The web browser knows a lot more about the user’s situation (screen pixel density, screen size, potentially the user’s accessibility needs, etc.) than I do, and users with impaired vision can customize the default text size.

                                                  Say you increase the default font size, such that web pages which use the default font sizes are legible. If a web page then independently tries to make text readable for people with poor vision by doubling the size of the text, you might yourself find the text to be annoyingly large; you’re comfortable reading the default font size, because you’ve adjusted it to be something you’re comfortable reading, and then a website tries to “help” by making the text twice as big as what you’re comfortable with.

                                                  I completely agree that web pages with too small text is an issue, but the problem also goes the other way. iOS specifically makes the issue apparent because iOS already defaults to pretty large text. An iPad in landscape showing text with its default text size is at the upper end of what I find comfortable, so websites which significantly increase the font size from the default size becomes almost unreadable for me. Also on iOS, there is no way to zoom out of a web page, so the only fix is to enter the system accessibility settings and decrease the font size there, just to read a single web page.

                                                  Here’s my personal web page, for reference: https://mort.coffee/home/ - I’m not a web designer, and don’t know a ton about design, but it should have enough contrast for the most part, and the text should be readable if you can comfortably read your browser’s default font. It would be interesting to get your opinion on it, and on whether my ideas around this works out in practice.

                                          2. 2

                                          I mean, questioning Apple design guidelines goes back decades: https://www.nngroup.com/articles/anti-mac-interface/

                                          I really love this kind of work. For me the issue is that the UI innovations we see on the web and mobile never came back to the desktop. On the web, UI and application logic really have been decoupled for many services. On mobile, with the Android and iOS APIs, there is an amazing amount of interoperability.

                                          But why hasn’t the desktop gotten this stuff? Why are applications there so siloed?

                                            1. 1

                                              Neither of those things are true.

                                              Android and iOS apps are extremely siloed. On Windows or Linux, the end user can unconditionally access all persistent data through the file browser. On the new mobile platforms, everything is sandboxed, and the only way to get data between apps is with explicitly-added “intent” routing that’s keyed by mimetype to guarantee that it’ll never do anything that the developer didn’t intend.

                                            Worse, you claim that there’s been no movement of ideas from mobile to desktop, both completely ignoring Windows 8 (which tried to merge them entirely) and ignoring the fact that the new versions (Windows 10, macOS Sierra) have adopted features like app stores, pervasive sandboxing, and a global notification mailbox.

                                              1. 1

                                              I haven’t used Windows in decades, so I had no idea what was going on there. While you might view the intents as a restriction, I think they are a richer interaction medium than files. The sandboxing is a good feature and defines clear ways for applications to gather resources and communicate. This is much better than ad-hoc protocols between programs that are closed and proprietary.

                                                1. 2

                                                I own an iPhone. I have pictures I want to get off the device. Ideally, I would hook the phone up to my Linux box, have it show up as a storage device, and use cp to move the files to Linux (and then rm to remove them from the iPhone). But no, I can’t do that. I have to hook the iPhone up to the Mac, run Photos, import them into Photos [1], then export the “full” images [2].

                                                  Yes, I am not a typical user. But Apple has made it hard to use the flow I’ve used for twenty years.

                                                  I hate hate hate this siloed, intent based system Apple and Google are forcing upon us.

                                                [1] I don’t even want to use Photos. I have my own system for storing photographs that I developed in the late 90s.

                                                [2] The keyboard shortcut used to export the full data. Now it helpfully strips EXIF data, so I have to use the menu to explicitly select “full”, for which there is no keyboard shortcut.

                                                  1. 1

                                                  That does suck. What I’m surprised about is: why not use an application that exposes the API you want? On Android, I make use of AirDroid quite a bit.

                                                    I know everyone thinks I’m wrongheaded in my top remark, but really all I was trying to say is that efforts in mobile have tried to move away from WIMP. I don’t think we have a great solution yet. Even programmer-friendly interfaces like Emacs have serious discoverability problems.

                                                    1. 1

                                                    The “applications” I want to use are cp and rm, both of which are standard “applications” under Unix. I can script them so it happens pretty much automatically (by also using the mount application to make the “files” on the camera visible to Linux).

                                                    But nooooooo! There’s some new, media-specific USB protocol used to suck images down from smartphones (and tablets). My older digital cameras work the old way (showing up as a storage device, files and all).

                                            2. 2

                                            For those of you who had the privilege of using LispM, Symbolics, or Smalltalk systems, where is a good resource to learn more about them? If you could share your experiences, I would be glad to read them.

                                              1. 3

                                                Smalltalk systems

                                                Having your own experience would bring you more value. Try http://pharo.org/

                                                1. 1

                                                  Thank you for this. It looks really interesting.

                                                2. 1

                                                Squeak ships with an interesting desktop interface (whose name I keep forgetting) based on some really deep thinking about the possible descendants of the Alto’s interface we never got. (It also ships with a technically-interesting 3d collaborative environment called Croquet, but aside from certain things like the use of buttons to control the user’s camera & align/center to an object, Croquet is largely not applicable to 2d interfaces.) I find those elements more interesting than the smalltalk language itself, personally.

                                                  Regarding lisp machines, there’s a modern scheme OS called interim that you might want to try out. I don’t know how it stacks up to symbolics machines, but it’s an interesting project.

                                                  1. 1

                                                    an interesting desktop interface (whose name I keep forgetting)

                                                  It’s called Morphic. It was ported from Self, a Smalltalk descendant with many interesting and somewhat influential ideas, including JIT compilation. Dan Ingalls ported most of it to JavaScript and called it the Lively Kernel.

                                                    1. 1

                                                      Thanks! I didn’t realize Morphic had originally been in Self. The last time I played with Squeak was more than ten years ago, & that would be the last time I read the Morphic documentation as well.

                                                    2. 1

                                                    I read about interim a while ago (right here on lobste.rs, no less) and liked the idea a lot, but haven’t had the time to actually use it; I might fire up a virtual machine and give it a try. Thank you for the other suggestions.

                                                  2. 2

                                                    Stopped reading at the casual hatred of capitalism. Can we get a philosophy article without throwing out the dog-whistle of capitalism being bad?

                                                    1. 8

                                                      I’ve covered the special relationship between capitalism & software extensively elsewhere. I didn’t elaborate in this article because I didn’t expect it to become popular outside my regular readership (who will already be familiar with those arguments).

                                                      In addition to the stuff covered above, there’s the obvious precedent of cybersyn. Of course, eliminating capitalism doesn’t require eliminating markets (as cybersyn does), & despite the various problems with markets, it’s unclear whether or not doing so would even be desirable in capitalism’s absence. After all, markets can be pretty good for solving certain kinds of information problems so long as the prerequisites for market efficiency are fulfilled. On the other hand, almost all economic activity on earth occurs within corporations or families (both of which are siloed planned economies) & attempts to bring markets into corporate silos have largely been disastrous, so it’s worth considering cybersyn’s progeny seriously.

                                                      1. 2

                                                        Next time just link the phrase to your previous article that explains it best, so that it doesn’t appear to be a random comment.

                                                        1. 3

                                                          I’ve got an awful lot of other writing related to every subject I cover here. You’ll have to excuse me if I don’t link every word to a different article when tossing off a low-effort rant I expected to get at most ten readers. Criticism of capitalism is among the least controversial subjects I cover in this.

                                                      2. 9

                                                        What is it about the author’s dislike of capitalism that invalidates their opinions about UX design?

                                                        1. 10

                                                          The casual injection into a post that I was reading to find out about his opinions on UX design.

                                                          1. 6

                                                      It’s their article, not yours. If they think it’s important, they can write whatever they want :)

                                                            1. 1

                                                              It may be their article but we are allowed to critique it. Nobody’s imprisoning the author for the way he writes, but by the same token, no one is obligated to read what he wrote if the style drives them away.

                                                              1. 3

                                                          Then why not just hide and ignore? How are these replies to the OP not tantamount to complaining rather than serious discussion? What do you intend to accomplish with this reply?

                                                                EDIT: also, it’s funny I got downvoted as “incorrect” at the same time as your reply…

                                                              2. -6

                                                          It might have made some sense with context. As it stands, it did not.

                                                                A big part of capitalism is providing the supply for a demand. If something won, there’s a market demand for it, right?

                                                                It might be suboptimal, and change can be hard to enact, but would it be better if every computer was an autistic LISP machine, utterly unapproachable for a layman?

                                                                1. 4

                                                                  Can you seriously not use “autistic” as an insult?

                                                                  1. 3

                                                                    ….what?

                                                                    1. 2

                                                                      This post includes, in five sentences, a severe misunderstanding of markets under conditions of near monopoly, some pretty extreme ableism, and the straw man fallacy.

                                                                      Please, reconsider.

                                                                  2. 3

                                                                    I’m with zdsmith and the others here. It’s OK to have this as a pet peeve, but really, just put that aside and evaluate the ideas being presented for what they are. That’s my suggestion.

                                                                2. 14

                                                                  It sounds to me like the OP is responding to an insufficiently-filled market need to shit on capitalism, and I commend them for responding so quickly to the invisible hand.

                                                                  1. 5

                                                                    Also, I don’t think you’re using the phrase “dog-whistling” correctly.

                                                                    1. 2

                                                                      No. Calls for the death of capitalism and the adoption of fully automated luxury space communism are all the rage these days.

                                                                And while I started writing this as snark, the truth is it’s a reality, especially in certain quarters like Mastodon where thousands of kids who’ve likely never experienced actual hardship seem to predominate.

                                                                      1. 7

                                                                  And while I started writing this as snark, the truth is it’s a reality, especially in certain quarters like Mastodon where thousands of kids who’ve likely never experienced actual hardship seem to predominate.

                                                                        Except that the ‘kids’ you talk of have actually experienced far more hardship than any previous generation that still lives. Growing up in a massive recession, living in a world where they have no privacy and many have no expectation of privacy, where they’re allowed to own mobile phones as children despite it being objectively proven that this is incredibly harmful to their psychological development, living in a world where all collectivism in society has been snuffed out by the unstoppable march of neoliberalism.

                                                                        If you can, imagine having your once almost guaranteed job replaced by outsourcing to Asia so the very rich who were already far too rich can make even more money. Imagine having your previously completely free tertiary education replaced with unbelievably expensive tertiary education but of far worse quality with universities filled with foreign students that waste tutor and lecturer time by being virtually unable to communicate in English. Imagine having your Government’s public works department privatised and its job of building sufficient housing to keep house prices at a reasonable level completely abandoned, leading to some of the most expensive housing in the world in a low population density first world country with more than enough land.

                                                                        If you were in those shoes, I imagine you’d consider yourself to be subject to some level of bloody hardship, thank you very much.

                                                                        1. 3

                                                                          Except that the ‘kids’ you talk of have actually experienced far more hardship than any previous generation that still lives.

                                                                          Do you realise that there are still survivors of the Second World War alive? Survivors of the Holocaust? Survivors of the Cultural Revolution?

                                                                          No, millennials haven’t ‘experienced far more hardship than any previous generation that still lives,’ not even close. Not even a little bit.

                                                                          1. 6

                                                                            Um… Can we please not have sweeping generalizations about the life experiences of entire generations here? Or pissing contests about hardship? The hyperbole to which you are responding is severely oversaturated, but the basic point is sound. Genuine hardships exist at every level in the mythical Maslow hierarchy. Studies have shown that grad students suffer the same stress levels (measured by both Likert scale and cortisol levels) as combat soldiers. People who live through major natural disasters and other forms of severe crisis generally report feelings of peace and social communion. People adapt; it’s how our nervous systems work. Exercise some compassion!

                                                                            1. 3

                                                                              I guess you’ve never met any from other parts of the world who aren’t from the United States or other affluent and unravaged countries.

                                                                              1. 1

                                                                                Did you mean to reply to milesrout? Every one of my examples of people who’ve experienced far more hardship than millennials have was from outside of the United States.

                                                                                Or do you mean that ‘millennials’ is a term usefully applied to non-Western cohorts? I think that would be a rare usage. Still, while there’s some pretty horrific stuff going on in the world today, I don’t think it compares to the Cultural Revolution or the Holocaust.

                                                                                1. 1

                                                                                  I was replying to you; you brought up non-Western comparisons, and I’m pointing out that your attempt to minimize current ills is unsound.

                                                                                  It’s undeniable that Millennials, and all other post-Boomer cohorts in the United States, have had declining opportunities and quality of life, due to structural issues related to unregulated and sociopathic economic policy and behavior (see https://www.scientificamerican.com/article/the-american-economy-is-rigged/). So what’s your deal? Why are you trying to gaslight us?

                                                                                  1. 1

                                                                                    It’s undeniable that Millennials, and all other post-Boomer cohorts in the United States, have had declining opportunities and quality of life

                                                                                    I’m not arguing against that statement: I’m arguing against the statement ‘the ‘kids’ you talk of have actually experienced far more hardship than any previous generation that still lives.’ That statement is false, because there are generations still living which have experienced far worse hardship than the Millennials. Whatever hardship they face pales in comparison to mass slaughter, mass murder, mass starvation, mass conscription &c. &c. &c.

                                                                                    That’s not gaslighting: it’s a simple fact.

                                                                                    1. 0

                                                                                      I believe it is incorrect to exclude the rest of the world from the Millennial cohort; the problems they face are global. Especially since you keep bringing up non-Western-world examples like the Cultural Revolution. And Millennials and younger are now facing down the barrels of a bunch of giant ecological cannons, and the world is turning into an authoritarian hellscape, so again, your insistence on minimizing how rightfully pissed they should be is literally incredible.

                                                                          2. 3

                                                                            And this is bad because?

                                                                            1. 4

                                                                              It’s absolutely not “bad” - did I say that?

                                                                              No, what I said is that I see a lot of people yearning for a particular bit of societal change, and sometimes I question whether or not they appreciate the fullness of what they’re asking for.

                                                                              1. 3

                                                                                In particular, I am selfishly worried that, given that kind of massive, wholesale seismic shift in the way we structure our lives, basic infrastructure would fall away for a time.

                                                                                I’m dependent on a couple of key medications that aren’t all that common to continue existing on the prime material plane, so despite the fact that I LOVE the idea, I’m a bit cautious around what it would ACTUALLY mean to march into our glorious future with my comrades, possibly dying of dehydration along the way. (The drug I need is vasopressin replacement. Without it I dehydrate and die. Full stop.)

                                                                                1. 3

                                                                                  Countries with socialised medicine do far better at providing people with medicine than those without. I struggle to see why it’d be reasonable to expect socialism to do poorly at providing medicine.

                                                                                  1. 3

                                                                                    Countries that have socialized medicine where a person with disorder like GP has survives are capitalist.

                                                                                    1. 2

                                                                                      What does that have to do with what I said?

                                                                                      1. 1

                                                                                        This whole branch discusses the “calls for death of capitalism”, and you mention socialized medicine as a counter-argument. Now, why do you make me explain your own post to you?

                                                                                      2. 1

                                                                                        Countries that have socialized medicine where a person with disorder like GP has survives are capitalist.

                                                                                        I can’t even parse this. What are you saying? Socialized medicine is socialism. Western countries are a mix of socialized services (education, roads, trains, military) and private, market-based systems. The mix has historically shifted back and forth, and right now, we’re at an extremely capitalistic phase, and it’s too much.

                                                                                        Capital is useful, like fire. Demanding that we worship it and asserting that capitalism is the Only Way is like demanding that firefighting be outlawed, because fire is good.

                                                                                        1. 1

                                                                                          Socialized medicine in socialist countries is atrociously bad. The GP would not have survived there with the kind of disorder they have. I am saying that because I lived in a poster-boy socialist country with such a healthcare system.

                                                                                          All Western countries are decidedly capitalist; their economies are based on the proceeds of the capitalist mode of production. Back in my history class in the USSR we had that political map of the world, and they were marked there as such.

                                                                                          I hope you aren’t suggesting that the USA is the only capitalist Western country, since all others have socialized healthcare of some sort.

                                                                                          1. 1

                                                                                            Really, you’re saying the medicine in the Netherlands, and Australia, and Canada, and Sweden, etc. is atrociously bad? Because I know for a fact that the systems there are better than in the United States.

                                                                                            Again, socialized medicine, like socialized military or education, is socialism. All the Western democracies are a mix of socialism and capitalism.

                                                                                            The United States is more capitalistic than the other ones; I am saying it needs to be less capitalistic than it currently is.

                                                                                            1. 0

                                                                                              Really, you’re saying the medicine in the Netherlands, and Australia, and Canada, and Sweden, etc. is atrociously bad?

                                                                                              I am saying that medicine in Marxist societies was (and is) bad. There is a world of difference between the socialized aspects of Sweden and Soviet socialism. They have nothing in common, nada, zilch. If you think the USSR was like Sweden but just poorer and with more socialized services, no, it was nothing like it at all. In fact, from that perspective, Sweden is indistinguishable from the USA. I know because I’m familiar with both, and a former Prime Minister of Sweden agrees.

                                                                                              1. 1

                                                                                                No one was talking about Soviet-style Marxist Communism, which we all agree was a nightmare. The argument was, “Too much has been subject to capitalism,” (which originally sprang from the OP’s note that we still have capitalism, meaning, there is still scarcity and inequality), or, “There should be more socialism,” which has nothing to do with the dysfunction in the USSR.

                                                                                                1. 1

                                                                                                  Fair enough. I was going off “calls for the death of capitalism” upthread, have nothing against socialized healthcare per se.

                                                                                                  1. 1

                                                                                                    But why does a call for moving beyond capitalism automatically invoke, “I guess you want to try something terrible, like a USSR or DPRK style nightmare?”

                                                                                                    Capitalism, like controlled fire, is a human tool meant to bring about humane ends. When fire rages out of control and people get hurt, we put it out. When capital rages out of control and people get hurt, for some reason a lot of people get mad when you say, “Maybe common and critical needs shouldn’t be subject to market dynamics,” and I just don’t understand that reaction.

                                                                                                    1. 1

                                                                                                      Because you hardly hear that call from anyone other than communists, and the USSR and DPRK were the outcome of people giving their best to build communism.

                                                                                                      1. 1

                                                                                                        Fully Automated Luxury Gay Space Anarcho-Communist Syndicalism does not suffer from the flaws of the attempts from 100 years ago; we will have automated labor this time ;)

                                                                                                        Also, if you have an ounce of awareness and you live in the San Francisco area, it’s impossible not to be confronted with the catastrophic failure of capitalism as a total system (meaning, attempts to provide all needs via markets).

                                                                                                        1. 1

                                                                                                          All kinds of societies can thrive once you remove the human factor!

                                                                                      3. 1

                                                                                        Did you read what I wrote in full?

                                                                                        In particular, I am selfishly worried that, given that kind of massive, wholesale seismic shift in the way we structure our lives, basic infrastructure would fall away for a time.

                                                                                        Of course once a fully marxist / communist society was attained, medicine would be a non-problem for most people; my issue is the transition. Do you actually think we could just pivot from full-on rape-and-pillage capitalism to such a society without massive upheaval, bloodshed, and interruption of all but the simplest infrastructure services (like the manufacture and delivery of specialized medication, for instance)?

                                                                                        1. 1

                                                                                          Of course once a fully marxist / communist society was attained, medicine would be a non-problem for most people

                                                                                          Don’t count on it. We had root canals treated without anaesthetics in the 1980s USSR.

                                                                                          1. 3

                                                                                            This is exactly the point I was trying to make. I’m seeing a LOT of people extolling the virtues of Marxist / communist societies without thinking through how hard they are to actually implement in ways that benefit the common citizen.

                                                                                            For a really great trove of data on how this can go totally awry, read the book Red Plenty.

                                                                                            Many then cite successful implementations of universal healthcare in socialist countries, failing to acknowledge the fact that many (all?) of these are fueled by thriving capitalist economies.

                                                                                            I acknowledge that I am a cis white male working in technology and currently enjoying a lifestyle practically dripping with privilege, but this has not always been so, and I also feel that just because I have never known hardship (especially not the kind of hardship experienced by millennials, apparently) doesn’t mean I can’t talk about the flaws in people’s thinking.

                                                                                  2. 3

                                                                                    If you’re going to go that far you should go all the way: fully automated luxury gay space communism.

                                                                                    1. 4

                                                                                      Sure why not? With flying cars, even! :)

                                                                                    (In all seriousness, Iain Banks’ Culture novels represent pretty much the ONLY far-future society I’d actually WANT to be a citizen of :)