1. 20
  1. 4

    Very interesting. I had a brief foray into mainframes which involved maintaining a port of OpenLDAP to z/OS; I only lasted about a year at that. If you should care to venture into their spaces, the IBM mainframe folks are very loud about how the machines are still relevant and much better than any “PC” server and how you could have a very remunerative career working with them. Their spaces seem to be pretty out of the way, though; you could also have a very remunerative career without ever hearing or learning anything about them.

    The IBM folks also always talked as if “mainframe” was synonymous with IBM, although I knew there had to have been competitors. I always wondered about that, and if any of those other mainframe platforms were still extant. Thanks to this, now I know!

    1. 5

      Great article. I had something like this on my to-do pile for several years; now, someone else has done it, and I’m both glad and a little frustrated. :-)

      I may use it as a jumping-off point for my own, though.

      I think, for me, what I find intriguing about mainframe OSes in the 21st century is this:

      On the one hand, there have been so many great OSes and languages and interfaces and ideas in tech history, and most are forgotten. Mainframes were and are expensive. Very, very expensive. Minicomputers were cheaper – that’s why they thrived, briefly, and are now totally extinct – and microcomputers were very cheap.

      All modern computers are microcomputers. Maybe evolved to look like minis and mainframes, like ostriches and emus and cassowaries evolved to look a bit like theropod dinosaurs, but they aren’t. They’re still birds. No teeth, no claws on their arms/wings, no live young. Still birds.

      One of the defining characteristics of micros is that they are cheap, built down to a price, and there’s very little R&D money.

      But mainframes aren’t. They cost a lot, and rental and licensing costs a lot, and running them costs a lot… everything costs a lot. Meaning you don’t use them if you care about costs that much. You have other reasons. What those are doesn’t matter so much.

      Which means that even serving a market of just hundreds of customers can be lucrative, and be enough to keep stuff in support and in development.

      Result: in a deeply homogeneous modern computing landscape, where everything is influenced by pervasive technologies and their cultures – Unix, C, the overall DEC mini legacy that pervades DOS and Windows and OS/2 and WinNT and UNIX, that deep shared “DNA” – mainframes are other.

      There used to be lots of deeply different systems. Classic Macs, Amigas, Acorn ARM boxes with RISC OS, Lisp Machines, Apollo DomainOS boxes, and so many more were profoundly unlike the DEC/xNix model. They were, by modern standards, strange and alien.

      But they’re all dead and gone. A handful persist in emulation or as curiosities, but they have no chance of being relevant to the industry as a whole ever again. Some are sort of little embedded parasites, living in cysts, inside a protective wall of scar tissue, persisting inside an alien organism. Emacs and its weird Lispiness. Smalltalk. Little entire virtual computers running inside very very different computers.

      Meantime, mainframes tick along, ignored by the industry as a whole, unmoved and largely uninfluenced by all the tech trends that have come and gone.

      They have their own deeply weird storage architectures, networking systems, weird I/O controllers, often weird programming languages and memory models… and yes, because they have to, they occasionally sully themselves and bend down to talk to the mainstream kit. They can network with it; if they need to talk to each other, they’ll tunnel their own strange protocols over TCP/IP or whatever.

      But because they are the only boxes that know where all the money is and who has which money where, and who gets the tax and the pensions, and where all the aeroplanes are in the sky and who’s on them, and a few specialised but incredibly important tasks like that, they keep moving on, serene and untroubled, like brontosauri placidly pacing along while a tide of tiny squeaky hairy things scuttles around their feet. Occasionally a little hairy beast jumps aboard and sucks some blood, or hitches a ride… A mainframe runs some Java apps, or it spawns a VM that contains a few thousand Linux instances – and the little hairy beasts think they’ve won. But the giant plods slowly along, utterly untroubled. Maybe something bit one ankle, but it didn’t matter.

      Result: the industry ignores them, and they ignore the industry.

      But, in principle, we could have had, oh, say, multi-processor BeOS machines in the late 1990s, or smoothly-multitasking 386-based OS/2 PCs in the late 1980s, or smoothly multitasking 680x0 Sinclair clones instead of Macs, or any one of hundreds of other tech trends that didn’t work out… They were all microcomputer-based, so the R&D money wasn’t there.

      Instead, we got lowest-common-denominator systems. Not what was best, merely what was cheapest, easiest, and just barely good enough – the “minimum viable product” that an industry of shysters and con-men thinks is a good thing.

      And a handful of survivors who keep doing their thing.

      What is funny about this, of course, is that it’s cyclic. All human culture is like this, and software is culture. The ideas of late-20th-century software, things that are now assumptions, are just what was cheap and just barely good enough. They’ve now been replaced and there’s a new layer on top, which is even cheaper and even nastier.

      And if we don’t go back to the abacus and tally sticks in a couple of generations, this junk – which those who don’t know anything else believe is “software engineering”, not merely fossilised accidents of exigency – will be the next generation’s embedded, expensive, emulated junk.

      What sort of embedded assumptions? Well, the lower level is currently this (quote marks to indicate mere exigencies, with no real profound meaning or importance):

      “Low-level languages” which you “compile” to “native binaries”. Use these to build OSes, and a hierarchy of virtualisation to scale up something not very reliable and not very scalable.

      Then on top of this, a second-level ecosystem built around web tech, of “dynamic languages” which are “JITted” in “cross-platform” “runtimes” so they run on anything, and can be partitioned up into microservices, connected by “standard protocols”, so they can be run in the “cloud” at “web scale”.

      A handful of grumpy old gits know that if you pick the right languages, and the right tools, you can build something to replace this 2nd-level system with the same types of tools as the first-level system, and that you don’t need all the fancy scaling infrastructure: one modern box can support a million concurrent users no problem, and a few such boxes can support hundreds of millions of them, all in something in the corner of one room, with an uptime of decades and no need for any cloud.
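
      Just to make that concrete, here’s the sort of thing I mean – a toy TCP echo server in Haskell (assuming the usual network package; the port number and the echo behaviour are arbitrary choices, purely for illustration), where each connection costs one lightweight green thread rather than a process, a container, or an orchestration layer:

      ```haskell
      -- Minimal sketch: one compiled binary, one box, very many connections.
      -- Each client gets a lightweight (green) thread via forkIO; these cost
      -- about a kilobyte each, so hundreds of thousands of concurrent
      -- connections fit comfortably in a single native process.
      import Control.Concurrent (forkIO)
      import Control.Monad (forever, void)
      import qualified Data.ByteString as BS
      import Network.Socket
      import Network.Socket.ByteString (recv, sendAll)

      main :: IO ()
      main = do
        sock <- socket AF_INET Stream defaultProtocol
        setSocketOption sock ReuseAddr 1
        bind sock (SockAddrInet 9000 0)   -- illustrative port; 0 = INADDR_ANY
        listen sock 1024
        forever $ do
          (conn, _peer) <- accept sock
          void (forkIO (echoLoop conn))   -- one cheap thread per client

      echoLoop :: Socket -> IO ()
      echoLoop conn = do
        msg <- recv conn 4096
        if BS.null msg
          then close conn                 -- client hung up
          else sendAll conn msg >> echoLoop conn
      ```

      The point isn’t the echoing; it’s the shape of the thing: no runtime-of-runtimes, no orchestration layer, just the compiler’s scheduler multiplexing cheap threads over the OS’s event interface inside one native binary.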

      But it’s hard to do it that way, and it’s much easier to slap it together in a few interpreted languages and ginormous frameworks.

      And ’twas ever thus.

      1. 3

        I randomly talked to a guy, a couple of years ago, who worked on OS 2200 at Unisys (at that time, not like, decades ago). He implied that most of the users were classified government stuff :). Of course I have no idea if that’s true.

        1. 4

          Far more intriguing than it has any right to be.

          1. 1

            Do you mean: “why is this obsolete mainframe stuff interesting?”

            I am curious as to why you said this, before I try to reply. :-)

            1. 1

              In retrospect, it’s more like “how did all this stuff exist and I’ve never even heard of it?” I’ve read stuff about IBM and Burroughs systems, but nobody seems to talk about Unisys.

              1. 3

                Note that Burroughs is also Unisys; Unisys inherited both the Burroughs MCP lineup (now called Unisys Clearpath Libra) and the Univac OS 2200 lineup (now called Unisys Clearpath Dorado). Both are still actively maintained. There’s a sidenote page on Let’s Play OS 2200 about the other surviving mainframe vendors; there are several, with Fujitsu probably having the largest customer base of the non-IBM gang. NEC, Fujitsu, and IBM still do custom mainframe CPUs, while the others rely on emulation (except Hitachi, which rebadges IBM hardware with Hitachi software).

                1. 1

                  Dammit, Chrome ate my reply. >_<

                  Unisys is all but forgotten now. It grew from Burroughs and Sperry, both more famous in their time, both also forgotten.

                  And yet…

                  Burroughs’ Large Systems were vastly influential.

                  Their design influenced the researchers at Xerox PARC. It influenced Smalltalk. Smalltalk led directly to the design of the Lisa, the original Mac, and MS Windows. And of course Windows inspired OS/2 and the UNIX desktop – Motif was built around designs licensed from MS.

                  All modern computer UI designs contain elements traceable back to influential researchers who admired the Burroughs kit and software design.

                  Burroughs’ OS, MCP, was implemented in ESPOL, an extended dialect of ALGOL – the first ever OS written in a high-level language.

                  That idea – the notion that you could build an OS in a high-level compiled language – led directly to DEC’s VMS, and that led to NT… and to another, third-party, DEC OS: UNIX.

                  ALGOL is the granddaddy of all modern programming languages. ALGOL is a direct linear ancestor of C, Python, Java, JavaScript, Bash, Ruby, PHP, Perl, you name it.

                  It’s much quicker to list the languages not derived from ALGOL: Lisp, Smalltalk, Forth, APL, and that is about it.

                  None of these are major elements of any modern OS, but the UI of all modern OSes owes a lot to Smalltalk, which was influenced by Burroughs.

                  (OK, I am omitting all non-imperative languages here. But then, pure FP languages are also not very influential and not widely used in industry or any modern software.)

                  The other branch… ENIAC has a claim to be the first programmable digital electronic computer. (It wasn’t, as originally designed; it earned that description only years later, after a lot of re-design and modification.)

                  ENIAC led to EDVAC, and EDVAC led to UNIVAC, which ran the EXEC I and EXEC 2 OSes.

                  Elements of both led to EXEC 8.

                  And today, you can go and buy brand new hardware which runs the latest, modern versions of EXEC 8 (now called OS 2200) or ClearPath MCP 20 (which is Burroughs MCP).

                  Sixty-year-old OSes, still around, still – amazingly – on sale, in maintenance, and in active use.

                  1. 1

                    “But then, pure FP languages are also not very influential and not widely used in industry or any modern software.”

                    Minor correction… most of the Good Shit that programming languages have made common since 2000 or so – immutability, garbage collection, type inference, reasoning about generics, pattern matching, tuple types, channels for IPC(?), etc. – were done first in functional programming languages.
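
                    To make that list concrete, here’s a tiny Haskell sketch of my own (the Shape type and the function names are invented for this comment, purely as illustration) showing immutability, type inference, pattern matching, and tuple types in one place:

                    ```haskell
                    -- An algebraic data type; its values are immutable once built.
                    data Shape
                      = Circle Double        -- radius
                      | Rect Double Double   -- width, height

                    -- Pattern matching on the constructors.
                    area :: Shape -> Double
                    area (Circle r) = pi * r * r
                    area (Rect w h) = w * h

                    -- Type inference plus a tuple type: with no annotation at all,
                    -- GHC infers stats :: [Shape] -> (Int, Double).
                    stats shapes = (length shapes, sum (map area shapes))

                    main :: IO ()
                    main = print (stats [Circle 1.0, Rect 2.0 3.0])  -- prints (2,9.14...)
                    ```

                    None of that is exotic any more – which is rather the point: it all turned up in mainstream ALGOL-lineage languages decades after the FP world had it.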

                    1. 1

                      Sure, I will give you that. (TBH, I am not a developer and it’s a bit academic to me, but I’ve heard about it.)

                      But it’s been bolted onto languages that started out very much in the ALGOL lineage.

                      I mean, you saw this, right? http://steve-yegge.blogspot.com/2010/12/haskell-researchers-announce-discovery.html

                      About the most widely significant thing I reckon I’ve ever read about Haskell is that the first working prototype of Perl 6 – Pugs – was put together in Haskell. And AFAIK, Raku doesn’t use it any longer.

                      I am aware of one P2P filesharing app that was written in OCaml – MLDonkey, I think?

                      Otherwise, I find it hard to think of much penetration. Lisp is probably more widespread, especially thanks to Emacs, but Lisp is not a pure FP language.

                      They are 100% out there and people use them, but perhaps sadly, they are a parallel stream of software development, rather than part of either the commercial or FOSS mainline world.

                      Or am I profoundly misinformed?