A few fundamental things happened:
UNIX and workstations died in the 90s, and the Windows monoculture took over almost completely until Linux became stable and usable (early 2000s).
The move to server side programming opened up a great number of new possibilities.
The introduction of new high-capability clients (iOS, Android, etc.) opened up a great number of new possibilities.
The lack of gains in single-core speeds, coupled with the sudden growth in the number of cores (which had been exactly 1 from 1960 all the way until 2006).
Nonetheless, as I recently wrote elsewhere: “The sad reality is that 99% of the work being done today is using languages from the previous millennium (from oldest to newest): C, SQL, Objective C, C++, Perl, Object Pascal (later: “Delphi”), Visual Basic, Python, R, HTML & CSS (not really a PL), Ruby, PHP, Java, JavaScript, and C#.”
I can’t speak too much for all the reasoning, but there was definitely a lull between 1980 and 2005 or so. In 2000 your choices of programming languages were C, C++, Java, Perl, and maybe Python or Delphi if you were bold. From what I’ve seen, Perl/Python/Ruby, Go, C#, and to a lesser extent functional languages deserve a lot of credit for breaking us out of this rut.
The 80s were a lot more vibrant than they might look! OOP languages were popping up everywhere, Pascal was the second most popular request on Usenet job boards, and the Fifth Generation computing project was well underway. The big question was whether C++, Smalltalk, Ada, or Eiffel would be the future of OOP, and whether OOP even had a chance compared to the bright future of logic programming.
Most of these trends were rendered moot by the 1997 javapocalypse.
Fair enough! I didn’t live through those times so my perspective on them may be incorrect. I started learning programming in 2001, in the wastelands of the post-dot-com bubble.
There were definitely other languages in that time period. Pascal was huge in the 80s. So was BASIC — every kid who learned to code in the 70s to early 80s started in it. Forth was popular on micros and for device control (especially if you count PostScript as a Forth dialect). Fortran had a pretty iron-clad grip on scientific computing. I’m not sure when Tcl was invented, but it had a heyday too.
Ruby was first released in 1996 and started getting noticed outside Japan c.2000. I remember discovering it in 2002.
One big reason for the explosion of languages is just that it’s much easier to build them now. Faster CPUs make interpreters more practical without heroic optimizations. And there are a lot of tools for parsers, compilers and code generators that didn’t exist back in the day, like Clang.
There totally were other langs, but Basic and Forth died with tiny micros, Pascal/Delphi got its lunch slowly eaten by VB and C#, and so on.
Edit: Now that I think about it further, the Windows monoculture taking over the desktop really created an evolutionary bottleneck, similar to mass extinctions of species. The things that survived best are the things that worked well in the Windows monoculture. Then, as others have pointed out, the move to server-side programming becoming a bigger deal enabled the ecosystem to diversify again.
IMO these 2001 notes get at what changed: http://www.paulgraham.com/langdes.html
Writing application programs used to mean writing desktop software. And in desktop software there is a big bias toward writing the application in the same language as the operating system. And so ten years ago, writing software pretty much meant writing software in C. Eventually a tradition evolved: application programs must not be written in unusual languages. And this tradition had so long to develop that nontechnical people like managers and venture capitalists also learned it.
Server-based software blows away this whole model. With server-based software you can use any language you want. Almost nobody understands this yet (especially not managers and venture capitalists). A few hackers understand it, and that’s why we even hear about new, indy languages like Perl and Python. We’re not hearing about Perl and Python because people are using them to write Windows apps.
That is, from about 1980 to 2000, personal computers i.e. “desktop software” were the big growth area in the computer industry. This is when computers became ubiquitous in offices. (Honestly it’s funny to imagine a time before that, when people drove to work, sat down at a desk with no computer, and wrote memos and made phone calls or something.)
As Graham says, you pretty much used whatever the OS vendor gave you – BASIC, C, assembly etc. I think pretty much all word processors and spreadsheets were written in a handful of languages.
Those were the killer apps that got people to buy computers, similar to how today someone might buy a phone to use messaging apps, Facebook or Google (and not even have a computer).
Then there was a big shift to web software, which greatly opened up the possibilities.
There were lots of debates about whether “real” software could be written for the web, but that question has now been answered (although the result is worse along some important dimensions).
There were research languages that ran on maybe one non-desktop computer in a university during 1980-2000, but I think “practitioners” (like most of us) basically hadn’t heard of them and didn’t care about them. For most of that period, the web didn’t exist, so you would probably have to go to a university library and get hard copies of academic papers to learn anything about those languages.
there was a period where the research tradition in programming languages nearly died outright in the US, and a lot of our more senior researchers remember this very keenly. Basically, AI winter and cheap microcomputers killed the Lisp machine, and as collateral damage managed to nearly wipe out PL research in America
I’m curious about this assertion, partly because it seems to equate PL research with LISP. Surely academics worked on other languages too? Xerox PARC alone produced Smalltalk, as well as less-unconventional languages like Cedar and Mesa.
And it’s not like you had to have a Lisp machine to run Lisp. Universities were overrun with Sun boxes by the late 80s and those ran Lisp pretty well.
It seems as though the resurgence in languages was driven more by the corporate creation of the fairly-conventional Java (interpreted C++ with GC, to a first approximation) and by enthusiasts creating interpreted languages like Python and Ruby. The papers cited here may have inspired academics, but it took a long time for that to filter into the wider world — type inference seems to have been a big factor behind the resurgence of statically typed languages like Go, Rust, Swift, etc.
Universities were overrun with Sun boxes by the late 80s and those ran Lisp pretty well.
I wasn’t around at the time, of course, but, so I hear, they did not.
Java (interpreted C++ with GC, to a first approximation)
Java has been compiled since the start, or very close to.
type inference seems to have been a big factor behind the resurgence of statically typed languages like Go, Rust, Swift, etc.
None of the languages you cite has proper type inference, and furthermore, type inference predates all three by decades. Even C++ had local type inference.
I will argue to the grave that full-program type inference is, at best, vaguely counterproductive for large programs. But there is also a huge difference, in both use and implementation, between the “3 can be an integer of some kind or maybe a float” type inference that C and (older?) C++ do and being able to infer generic types for higher-order functions the way Rust and Swift can.
I think other key innovations that got into mainstream statically typed languages are generics and first-class functions. With those you end up being able to do a lot of the things that would otherwise be easier in dynamically-typed languages.
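To make both points concrete, here is a minimal Java sketch (Java chosen only as the familiar mainstream statically typed example; the pickBest name is made up for illustration). It combines generics, a first-class function value, and call-site inference of the kind discussed above:

    import java.util.Comparator;
    import java.util.List;

    public class InferenceDemo {
        // A generic higher-order function: T is a type parameter and the
        // Comparator argument is a first-class function value.
        static <T> T pickBest(List<T> items, Comparator<T> better) {
            T best = items.get(0);
            for (T item : items) {
                if (better.compare(item, best) > 0) best = item;
            }
            return best;
        }

        public static void main(String[] args) {
            List<String> words = List.of("perl", "python", "ruby");
            // No type annotations at the call site: the compiler infers
            // T = String and the lambda parameters' types from context,
            // a long way from "3 is some kind of integer".
            String longest = pickBest(words, (a, b) -> a.length() - b.length());
            System.out.println(longest); // prints: python
        }
    }

With this combination, the call site reads about as tersely as the equivalent in a dynamically typed language, while staying fully checked at compile time.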
I was around. That flame you linked to is undated and doesn’t say which Sun box or which Lisp implementation. But I didn’t use Lisp on a Sun myself, so I can’t counter; I just know people did. I can attest that Suns ran rings around Xerox workstations in Smalltalk-80 (except the crazy $70K Dorado that almost no one outside PARC had). Now get off my lawn ;)
JIT and AOT are not the same. A JIT is still an interpreter; it just has a different execution strategy. And it took a couple of years for Java JITs to appear, let alone good ones like HotSpot. Java’s legacy as an interpreted language really has hobbled it in some ways, since they basically froze the bytecode instruction set and class-file format in 1995, which made inner classes and generics less efficient.
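To illustrate the generics cost concretely (a small sketch, not from the original comment): because type arguments were retrofitted via erasure onto the 1995 class-file format, two differently parameterized lists share a single runtime class, and primitive values have to be boxed to fit the Object-based layout.

    import java.util.ArrayList;
    import java.util.List;

    public class ErasureDemo {
        public static void main(String[] args) {
            List<String> strings = new ArrayList<>();
            List<Integer> ints = new ArrayList<>();
            // The type arguments exist only at compile time; the class file
            // stores a plain ArrayList, so both lists share one runtime class.
            System.out.println(strings.getClass() == ints.getClass()); // prints: true
            // Primitives must be boxed to fit the erased, Object-based layout,
            // which is part of the efficiency cost mentioned above.
            ints.add(42); // the int is autoboxed into an Integer
        }
    }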
I didn’t say Go etc. have “full Hindley-Milner all-singing all-dancing type inference”. Their compilers infer types in limited but very useful ways. This was not a feature found in mainstream statically typed languages earlier; C++ had no type inference before C++11 added the auto keyword. My point is that having this made static typing easier and more convenient to use, taking away some of the appeal of dynamic languages.
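As a side-by-side illustration of that convenience (sticking with Java for consistency with the sketches above; Java only got the analogous var keyword in Java 10, long after the period under discussion):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class LocalInferenceDemo {
        public static void main(String[] args) {
            // Pre-inference style: the type is spelled out on both sides.
            Map<String, List<Integer>> explicit = new HashMap<String, List<Integer>>();
            // Java 7's diamond operator infers the constructor's type arguments,
            // and Java 10's 'var' infers the variable's type entirely,
            // the same quality-of-life change credited above to C++11's auto.
            Map<String, List<Integer>> diamond = new HashMap<>();
            var inferred = new HashMap<String, List<Integer>>();
            inferred.put("releases", List.of(7, 10));
            System.out.println(inferred);                 // prints: {releases=[7, 10]}
            System.out.println(explicit.equals(diamond)); // both empty maps: true
        }
    }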