I think this post somewhat undermines its credibility by casting too wide a net. Everything is shit? Literally nothing good has happened in 40 years?
If I imagine Lewis Black reading this and shouting and sputtering, it’s pretty entertaining. But less informative. It would be great to ask why, but there’s not enough information here to begin making such an assessment.
I am reminded of the old rant that Windows isn’t a real 32-bit OS. It’s actually based on a 16-bit OS. Which ran on the 8088, which is really an 8-bit processor. And that’s just a fancy 4004, which is 4-bit. See? I have just proven that Windows is a 4-bit OS.
Now the words “just” and “really” and “actually” are doing a lot of work in that paragraph. I’m not sure if there’s a better term, but I call it “false equivocation.”
(For the record, I used Lisp machines and Smalltalk for production software before I used Unix or C.)
The answers are in economics and culture. Pascal and C were far more economical on affordable personal hardware in the 1980s. Apple made reasonable product engineering decisions with the classic Mac. Microsoft did what they needed to follow suit.
Unix was widely available and understood at an economical price point for personal business workstations. NeXT continued the evolution toward the Smalltalk vision in that market. As PC hardware performed better, it was culturally and economically natural to move toward Unix-like systems at home.
Windows had a huge installed base, Apple less so. Both had their share of growing pains on the way toward Unix. Meanwhile, engineering the great Lisp and Smalltalk systems of the 1970s and 1980s cost a lot of time from really good, focused engineering teams. But generally smart developers coming out of universities knew Unix and C better, and there were a lot more of them available. It’s easier to get a decent Unix system into production, unless a fair bit of lead time is spent on things the marketing department would not understand culturally and most engineers would not even know to fight for.
Then the rush to web… and here we are.
I fully agree that we’ve lost or cast off a lot of things. For example, Plan 9 and Inferno are strictly better and more visionary than any Unix-derivative OS we have nowadays.
At the same time, there’s this weird tendency for engineers and developers to build beautiful Fabergé eggs: fragile and articulate and pretty, and utterly stunted when pressed into service. The little ugly scurrying mammals and cockroaches outlived everything else, and sure they’re garbage, but they won by default.
The one question that I think the author gets correct is “Why doesn’t anyone remember?”. I am perpetually amazed at how little of our sector’s and our tools' history most developers know.
The Unix vendors had a common enemy in Microsoft, but they also each wanted the remainder of the market for themselves. There was enough incentive to present a unified front, but not enough to actually give up their proprietary distinctions. That held until the Linux and x86 wave made too much economic sense to avoid, which basically destroyed the proprietary Unix hardware market.
Plan 9 made sense to the few individual engineers who cared enough to know about it. But it’s hard to see how a product engineering group at the time could see it for anything more than some ideas.
The success of x86 can be attributed to economics as well. Intel is extremely good at manufacturing chips. They were so good they blinded themselves to power efficiency in the mid-2000s, and to mobile later. They’re not as good at design, and so today the 64-bit x86 that Intel manufactures like no one else is an AMD design.
Intel has also been an ARM licensee of some degree since its deal with DEC / Compaq / HP. But that deal also led to the disastrous Itanium. Where they go from here in the low end will be fun to watch. Probably won’t be ARM, though. Intel had success earlier with the i960, etc.
How so?
DEC used to have Alpha, which, I heard, was awesome at the time. But I think Compaq inherited it; Intel only got their network cards.
See https://en.wikipedia.org/wiki/XScale