  2. 14

    The world needs more people willing to do research and summarize their findings.

    1. 9

      I’ve been fascinated by the history of computing for a long time. This quote from the article sums up part of why, and also challenges me to source my information more carefully:

      I’ve said this before, and I’ll say it again: whatever we think about ourselves as programmers and these towering logic-engines we’ve erected, we’re a lot more superstitious than we realize, and by telling and retelling these unsourced, inaccurate just-so stories without ever doing the work of finding the real truth, we’re betraying ourselves, our history and our future.

      1. 6

        I did a Deconstruct talk on this! It’s not up yet, but the theme is that everything in tech has a historical story that’s often removed from our current understanding. My main example was linked-list interview questions: why do we ask them? You can make a very strong case that they were originally a shibboleth for whether you knew C programming, and then got recontextualized as being “about algorithms” once people stopped using C for everything.
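        For anyone who hasn’t written one lately, here’s the kind of pointer juggling those questions were really probing. This is my own minimal sketch in C, not anything from the talk:

        ```c
        /* Classic singly linked list reversal: the exercise is mostly a test of
         * comfort with pointers, which is why it worked as a C shibboleth. */
        #include <stddef.h>

        struct node {
            int value;
            struct node *next;
        };

        /* Reverse the list in place and return the new head. */
        struct node *reverse(struct node *head) {
            struct node *prev = NULL;
            while (head != NULL) {
                struct node *next = head->next; /* remember the rest of the list */
                head->next = prev;              /* point this node backwards */
                prev = head;
                head = next;
            }
            return prev;
        }
        ```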

        1. 4

          Really looking forward to seeing your talk! But… it’s not just tech. By and large, this is how the world works.

      2. 8

        People still find it shocking to learn that your tty behaves under the assumption there’s a typewriter attached instead of a screen. This is a great post.
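        If you want to see that assumption for yourself, the kernel still exposes the typewriter-era flags through termios. A minimal sketch, assuming a POSIX system (the ONLCR flag comes from termios, not from the article):

        ```c
        /* Print whether the tty driver is translating newline into
         * carriage-return + line-feed: two separate motions that only make
         * sense for a physical typewriter carriage. */
        #include <stdio.h>
        #include <termios.h>
        #include <unistd.h>

        int main(void) {
            struct termios t;
            if (tcgetattr(STDOUT_FILENO, &t) != 0) {
                perror("tcgetattr"); /* stdout isn't a tty, e.g. it's redirected */
                return 1;
            }
            printf("ONLCR (map NL to CR-NL on output): %s\n",
                   (t.c_oflag & ONLCR) ? "on" : "off");
            return 0;
        }
        ```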

        1. 7

          It’s not entirely clear if this is a deliberate exercise in coordinated crank-wank or just years of accumulated flotsam from the usual debate-club dead-enders hanging off the starboard side of the Overton window. There’s plenty of idiots out there that aren’t quite useful enough to work the k-cups at the Heritage Institute, and I guess they’re doing something with their time, but the whole thing has a certain sinister elegance to it that the Randroid crowd can’t usually muster. I’ve got my doubts either way, and I honestly don’t care to dive deep enough into that sewer to settle them.

          You don’t need a conspiracy here! I’ve run into the same problem with a lot of articles on circa-1970s CS. The problem? Carl Hewitt, the guy who invented the actor model. Any page connected to PLANNER or the actor model has at some point been vandalised by him. The wiki admins had to ban his official account twice, along with something like 15 of his sock puppets.

          Over time his edits got a bit more subtle: not to the point of being undetectable, but enough to slip under the “not worth the effort” threshold. It’s totally feasible for one obsessive person to trash up a swath of Wikipedia. And there’s a lot more than one obsessive goldbug out there!

          You can get a rough sense of how trustworthy a wiki page is going to be from its talk page: the more angry flamewars, the less you should trust it. Flamewars only flare up over the most egregious stuff, so a lot of them suggests even more has slipped under the radar.

          1. 1

            [citation needed]

          2. 1

            I don’t have anything substantial to add right now, so here are my favorite quotes for posterity:

            Fascinatingly, the early versions of the ECMA-48 standard specify that this standard isn’t solely meant for displays, specifying that “examples of devices conforming to this concept are: an alpha-numeric display device, a printer or a microfilm output device.”

            A microfilm output device! This exercise dates to a time when microfilm output was a design constraint! I did not anticipate that cold-war spy-novel flavor while I was dredging this out, but it’s there and it’s magnificent.

            As a personal aside, my two great frustrations with doing any kind of historical CS research remain the incalculable damage that academic paywalls have done to the historical record, and the relentless insistence this industry has on justifying rather than interrogating the status quo. This is how you end up on Stack Overflow spouting unresearched nonsense about how “4 pixel wide fonts are untidy-looking”. I’ve said this before, and I’ll say it again: whatever we think about ourselves as programmers and these towering logic-engines we’ve erected, we’re a lot more superstitious than we realize, and by telling and retelling these unsourced, inaccurate just-so stories without ever doing the work of finding the real truth, we’re betraying ourselves, our history and our future.