> Still, my main point is that software sucks and so do the people who write it (myself included) and I think that’s true on all OSes.

> The good news is that distributed programming knowledge, including cache design, is diffusing through the landscape. We’ll see more tricks for preventing these failures, both built into our languages and pounded into our brains. And eventually even in our job descriptions.

This bit really tickles my imagination. Has the software development bell curve progressed to the right or regressed to the left over the past few decades? Does software suck more or less?

For me, at least, the average program running on my computer today seems considerably further from optimal than the average program of twenty years ago. (Not to beat a dead horse, but I’m looking at you especially, Electron-based gadgets.) My attempt at an explanation? Software development has long been a rich discipline requiring knowledge and experience across disparate domains, everything from basic mathematics to, of course, cache design. As time has gone on, the field has only grown deeper and, simultaneously, far more popular. There are now loads of people involved in software development, and a much smaller fraction of them than before are interested in treating the discipline as anything more than a means to an end. (This is anecdotally confirmed by the way many of my coworkers treat software development in the context of a neuroscience lab.) Thoughts?