There’s a siren song that this time we’ll do it right, and Loper OS contains some great criticisms of today’s technology. Bret Victor and Tunes (now resurrected as Houyhnhnm Computing) also offer valuable glimpses of what could be.
That said, having actual projects matters. It’s much easier to discuss the limitations of today’s tool du jour than to build something better. I’m guilty of this too.
I don’t share your view that today’s designs will never be supplanted, though:
I do think the computing mainstream benefits from fringe projects too. V8 and the JVM are both mature projects with Lispers contributing: there’s a lot to be learnt from rebuilding or relearning older and less popular tech.
Let me simplify the part about economics and inertia in existing tech. Intel itself couldn’t change the market with new CPUs. Look up the Intel iAPX 432, Intel’s BiiN venture with the i960 (a great CPU), and the Intel/HP Itanium. Collectively, major losses. These things all boil down to two choices:
Maintain backward compatibility. The transition cost off an ISA, API, etc., plus the cost of re-creating its ecosystem, are the main reasons these monopolies and oligopolies form with their lock-in. The only thing you can do if going after these users is re-create the same thing (e.g. x86) with some differentiator like performance (AMD64), energy usage (VIA’s C3), reliability (Stratus/NonStop), or security (some academic designs). You’re not getting these people off existing stuff until they’re willing to pay the transition cost, and that cost only goes up over time. The earlier the better.
Create new stuff. We actually saw a lot of this with clean-slate software, SaaS, hardware appliances, acceleration SoCs, and many projects in embedded. The LISP vision or whatever could happen here, just as we got FPGAs, Cavium’s MIPS64 Octeons, Java processors like Azul’s Vega, Clojure on the JVM, iOS on phones that used to be dumb, and so on. Tons of examples. There just has to be a compelling reason to both acquire and learn to use the new product: a capability worth the effort. The tech you use is incidental. For this reason, I keep encouraging people who want to push better development tooling, reliable OSes, or secure CPUs to hide them inside great products that have nothing to do with those goals. Then people will drive uptake naturally via cash flow or by adopting the tools to extend those products.
“What we need is a new generation of new, smaller systems which by dint of small size and integrity are better able to resist the ravages of time and industry.”
Exactly. I pushed versions of Pascal such as Delphi back in the day, since the languages were small enough to teach, extend, fix, or port. Free Pascal proved that out when Delphi was effectively discontinued. Modula-3 and Component Pascal were similarly small. Alternatives include Scheme, Smalltalk, functional languages, and flow-based programming a la Morrison. They’re all better at accommodating change in the long term. Much better to get stuck on something like that than on COBOL, C, C++, Java, etc.
Perhaps it’s easier to believe there will be major changes over the next decade if one has already experienced multiple decades of change. I don’t see how we get to 2017 without huge changes in the tech landscape. It’s inevitable. I’m not predicting how ideal the changes will be; there will probably be a good bit of “worse is better.” But then, nothing was ideal in the first place, not even Lisp machines, which were pretty damn sweet.