1. 16

I had the whim to collect a list of all the major tech hype bubbles I’ve lived through, which turned out to be a lot trickier than I expected – perhaps because I do my best to ignore them. So I’m throwing the list down here and asking for corrections and additions:

  • 1998-2001: Dotcom Bubble
  • 1999-2006: Java
  • 2004-2007: Web 2.0
  • 2007-2010: The Cloud
  • 2010-2015: Social media
  • 2012-2015: Internet of Things
  • 2013-2015: Big Data
  • 2017-2021: Blockchain
  • 2021-present: AI

Also-rans/attempts/minor bubbles:

  • grid computing
  • web3
  • 2012: 3D printing
  • 2015: Autonomous vehicles

Dates don’t have to be precise, but it’d be nice for them to be broadly correct to within a year either way.

  1.  

    1. 11

      This is interesting. Bubble implies a pop, yes?

      1998-2001: Dotcom Bubble

      I mean, things were overhyped and ridiculous, but can anyone say that the internet isn’t at the core of the economy?

      1999-2006: Java

      Still one of the most widely used languages, powering many multi-billion dollar companies.

      2004-2007: Web 2.0

      What we now just think of as “the web”.

      2007-2010: The Cloud

      Again, powering multi-billion dollar workloads, a major economic factor for top tech companies, massive innovation centers for new database technologies, etc.

      2010-2015: Social media

      Still massively important, much to everyone’s regret.

      2012-2015: Internet of Things

      This one is interesting. I don’t have anything myself but most people I know have a Smart TV (I don’t get it tbh, an HDMI cord and a laptop seems infinitely better)

      2013-2015: Big Data

      Still a thing, right? I mean, more than ever, probably.

      2017-2021: Blockchain

      Weirdly still a thing, but I see this as finally being relegated to what it was primarily good for (with regards to crypto) - crime.

      2021-present: AI

      Do we expect this “bubble” to “pop” like the others? If so, I expect AI to be a massive part of the industry in 20 years. No question, things ebb and flow, and some of that ebbing and flowing is extremely dramatic (dot com), but in all of these cases the technology has survived and in almost every case thrived.

      1. 13

        All of these things produced real things that are still useful, more or less, but also were massively and absolutely overhyped. I’m looking at the level of the hype more than the level of the technology. Most of these things involved huge amounts of money being dumped into very dubious ventures, most of which has not been worth it, and several of them absolutely involved a nearly-audible pop that destroyed companies.

        1. 1

          Yeah, I was just reflecting on the terminology. I’d never really seen someone list out so many examples before and I was struck by how successful and pervasive these technologies are. It makes me think that bubble is not the right word other than perhaps in the case of dot com where there was a very dramatic, bursty implosion.

          1. 4

            The typical S-shaped logistic curves of exponential processes seeking (and always eventually finding!) new limits. The hype is just the noise of accelerating money. If you were riding one of these up and then it sort of leveled off unexpectedly, you might experience that as a “pop”.
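
            For anyone who wants a picture of the shape being described, here’s a minimal sketch of the textbook logistic function (the parameters L, k, and t0 are just the standard ones, nothing specific to this thread): it looks exponential at first, then flattens as it approaches the limit L, and that flattening is the “pop”.

            ```python
            # Minimal illustration of an S-shaped logistic curve (textbook form,
            # not anything specific to the hype cycles discussed above).
            import math

            def logistic(t, L=1.0, k=1.0, t0=0.0):
                """L / (1 + e^(-k * (t - t0))): exponential early on, saturating near L."""
                return L / (1.0 + math.exp(-k * (t - t0)))

            # Early values grow roughly exponentially; later ones level off near L.
            for t in range(-6, 7, 2):
                print(t, round(logistic(t), 3))
            ```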

            1. 5

              See https://en.wikipedia.org/wiki/Gartner_hype_cycle

              To me the distinguishing feature is the inflated expectations (such as Nvidia’s stock price tripling within a year, despite them not really changing much as a company), followed by backlash and disillusionment (often social/cultural, such as few people wanting to associate with cryptobros outside of their niche community). This is accompanied by vast amounts of investment money flooding into, and then out of, the portion of the industry in question, both of which self-reinforce the swingy tendency.

        2. 5

          most people I know have a Smart TV (I don’t get it tbh, an HDMI cord and a laptop seems infinitely better)

          “Dumb TVs” are nigh-impossible to find, and significantly more expensive. Even if you don’t use the “Smart” features, they’ll be present (and spying).

          1. 2

            Not for everyone and also not cheap, but many projectors come with something like Android on a compute stick that is just plugged into the HDMI port, so unplug it and it’s dumb.

            1. 1

              Yeah, I’ve been eyeing a projector myself for a while now, but my wife is concerned about whether we’d be able to make the space dark enough for the image to be visible.

            2. 2

              Even if you don’t use the “Smart” features, they’ll be present (and spying).

              That’s assuming you make the mistake of connecting it to your network.

              At least for now… once mobile data becomes cheap enough to be paid for with stolen personal data, we are SO fucked.

              1. 1

                Why have a TV though? Sports, maybe?

                1. 2

                  Multiplayer video games, and watching TV (not necessarily sports) with other people.

                  1. 1

                    I use a monitor with my console for video games, same with watching TV with others. I think the only reason this wouldn’t work is if people just don’t use laptops or don’t like having to plug in? Or something idk

              2. 2

                This one is interesting. I don’t have anything myself but most people I know have a Smart TV (I don’t get it tbh, an HDMI cord and a laptop seems infinitely better)

                It’s the UX. Being able to watch a video with just your very same TV remote or a mobile phone is much, much better than plugging in your laptop with an HDMI cord. It’s the same reason dedicated video game consoles still exist even though there are devices like smartphones or computers that are technically just better. Now almost all TVs sold are Smart TVs, but even before, many people (like me) liked to buy TV boxes and TV dongles.

                And that’s assuming a person owns a laptop, because the number of people who don’t use PCs outside of work is increasing.

                1. 1

                  I have a dumb TV with a smart dongle - a jailbroken FireStick running LineageOS TV. The UX is a lot better than I’d have connecting a laptop, generally. If I were sticking to my own media collection, the difference might not be that big, but e.g. Netflix limits the resolution available to browsers, especially on Linux, compared to the app.

                2. 1

                  massive innovation centers for new database technologies, etc.

                  Citation needed? So far I only know about them either “stealing” existing tech and offering it as a service, usually in an inferior way to self-hosting, usually lagging behind.

                  The other thing is putting out whatever in-house DB they had and making it available. That was a one-time thing though, and since those DBs largely predate the cloud I think it doesn’t make sense to call it innovation.

                  Yet another thing is a classic strategy of the big companies, which is taking small companies or university projects and turning them into products.

                  So I’d argue that while innovations do end up in the cloud (duh!), it’s rarely ever driving them.

                  Maybe the major exception is around virtualization and related things, but even here I can’t think of a particular one. All that stuff seems to stem largely from Xen, which again originated at a university.

                  As for bubbles: one could also argue that dotcom still exists?

                  But I agree that hype is a better term.

                  I wonder how many of these are as large because of the (unjustified part of the) hype they received. I mean promises that were never kept and expectations that were never met, but the investments (learning Java, writing Java, making things cloud-ready, making things depend on cloud tech, building blockchain know-how, investing in currencies, etc.) are the reason why they are still so big.

                  See Java. You learn that language at almost every university. All the companies learned you can get cheap labor right out of university. It’s not about the language but about the economy that was built around it. The same is true for many other hypes.

                3. 10

                  I’m surprised there’s nothing about smartphones and “everyone needs an app” on here.

                  1. 3

                    Good point, tablets might also go in the “minor” category.

                  2. 7

                    Very much agree, but I’d say that this current AI cycle is specifically GenAI, to distinguish it from other AI hypes that I suspect will happen in the future.

                    1. 3

                      And AI hype cycles that happened in the past, like 5th Gen programming in the late ’80s.

                    2. 5

                      Felt like there was a time when every tech company was obsessed with gamification too, somewhere around Social Media.

                      1. 4

                        I feel that VR might need a couple of appearances in the also-ran category. The first might fit in somewhere around 1990(?)-1995: 1992 gave us Lawnmower Man, and the Taye Diggs character in Rent (from 1993) was a VR developer, so the hype cycle must have started a few years before that. I put the end at 1995 with the release of the Virtual Boy.

                        The resurrection goes from around 2012 (with the announcement of the Oculus) to either 2021 (Google Cardboard is Killed By Google™) or 2023 (Zuck announcing the pivot away from the Metaverse after losing $10¹⁰). Obviously, VR continues chugging on and things are continuing to come out, but your tech company isn’t expected to have a VR strategy.

                        Another also-ran that might be worth mentioning is “local”, or whatever we want to call what Groupon did. I mostly pull this out because I’ll never forget a prognosticator on the orange site declaring that Groupon would be buying out Google in ten years if Google didn’t come up with a good solution for local businesses.

                        1. 5

                          It is the inevitable fate of every seemingly-magical technology to either flop or become boring. When I first used the internet, it seemed like absolute sorcery that a Danish teenager could be chatting in real time on the computer with someone from Singapore, or that I could grab a bunch of files from somewhere in the US without paying someone to send me a CD-ROM or a pile of floppies. Now, that’s Tuesday.

                          VR keeps flopping, I think, precisely because it can’t become boring. It’s not a matter of graphics fidelity or even of responsivity, the problem is that strapping something on my head that blocks my vision is an eerie and solipsistic experience that keeps viscerally freaking out the savannah ape in me. I’ve had fun with VR, I’ve had my mind blown with VR, but I’ve never had using VR feel “cozy” or even “comfortable”. And a lot of people seem to be the same way.

                          (Whenever I think about this aspect of technological development, I can’t help but think of my late grandmother’s life. She was born into a world where the one radio in her home village was a magical community gathering point (and going to the nearest town was a full-day ride in their horse-drawn wagon) - and she died in a world where the iPhone she owned was boring.)

                          1. 3

                            Tablets went through several false starts before the iPad. Now they’re still dubious outside their niches, but well established within them.

                            1. 2

                              I don’t disagree in principle (also shoutout to VRML in the late 90s) - but compared to some others on the list, the scale doesn’t even really justify the name hype, not even “also-ran”, imho.

                              Or maybe we need to put 3-4 of those as megahype :(

                            2. 3

                              Minor bubble: laptop/tablet/phone convergent UIs. Ubuntu Touch was 2011 (so was GNOME 3.0). Windows 8, and the Metro design language, was 2012. According to Wikipedia, where I stole this information from, Google announced Android apps on ChromeOS devices in 2014, though I don’t really see that as part of the hype cycle honestly - it feels more like a useful feature than an impossible-to-achieve convergence goal. (But maybe that’s a good indication that the hype cycle technology became boring around 2013, 2014?)

                              1. 2

                                2015-2020 had a minor “serverless” bubble.

                                1. 2

                                  1999 - 2006: Java

                                  As you said, within a year - but I’d say 1998-now, maybe especially if you put it as Java/JVM - but yeah, the hype has ended.

                                  2007 - 2010: The Cloud

                                  I don’t get this end date, even less than Java. People are talking about the cloud more than ever and yet not understanding more. The hype has not ended at all.

                                  2010 - 2015: Social media

                                  I would have started earlier - 2008 for Twitter? Also, did it really end? Maybe it is commoditized, yes.

                                  2012-2015: Internet of Things

                                  Hm. Again, I would put the start later and the end not yet clear. Can we maybe introduce “not hype, but still overused”?

                                  Ah. I guess I’m nitpicking, I mostly agree with you.

                                  1. 1

                                    I don’t get this end date, even less than Java. People are talking about the cloud more than ever and yet not understanding more. The hype has not ended at all.

                                    It’s rare to find companies launching products with no clear goal or value-add beyond having the word “cloud” smacked on them these days.

                                  2. 2

                                    You say in the article that “There really tends to be only one at a time though”, but I think that’s not true; the industry is big enough to support multiple technologies at the same time. Which makes me question the scope of this “tech”: consumer technology, computery stuff, anything related to software? But none of those definitions of tech satisfies all the examples you mentioned.

                                    This is because I want to share with you a real hype cycle that ended very badly at the time, but was mostly limited to specific companies: solar panels. In 2008 you could find many solar panel companies in Europe. It was seen as one of the industries of the future. In my city, solar panel companies sponsored the local sports clubs. However, it was unsustainable: there was not enough demand at that time, solar panels were not that efficient, and electricity was still kind of cheap. There were many layoffs when these companies closed. Later they were replaced by Chinese makers, which were cheaper.

                                    Now it’s a mature technology; each year we set a new record for solar panel production, but more than 80% of panels are made in China, and China wants to regulate production (a kind of solar OPEC) to control the market and make it sustainable long term.

                                    1. 6

                                      Very cogent points! These things happen all over and all the time; I could tell similar stories about the various swings in oil prices being connected to expectations vs reality of new technologies and prospects (the Marcellus Shale gas fields being the one I know most personally). This list in particular covers the things that reach me through the media/news I consume, so, tech and programmer-y stuff. In particular I see it often driven by the Silicon Valley startup and funding culture, which has insane amounts of money and a propensity to spend it in flashy ways that create lots of… well, hype, even among people who aren’t practitioners in the field.

                                      So I spend a year having to hear about bitcoin on the radio while driving to work, roll my eyes at the stupidity of it all, and say “it will end mostly-badly because these people aren’t usually creating anything of new value”. Lo and behold, it ends mostly-badly, the successful bits that are genuinely good ideas fade into the background of existence, and for a year or two the tech industry is kinda fallow and nobody is really being all that creative… and suddenly I start hearing about the big AI boom and how it’s going to change the world! Just like blockchain, Big Data, IoT, etc. etc. all did. I.e., lots of money will be wasted, lots of money will be siphoned from the have-nots to the haves, using the internet will get slightly more resource-heavy and painful, and we’ll grow a new and exciting sub-field for security researchers.

                                    2. 1

                                      I don’t know the exact timelines, but object-oriented everything was definitely a bubble that lasted quite a while (maybe that matches “Java” in your list, but I feel like the hype was there way before Java).

                                      Also, “business people programming” (COBOL, SQL, etc)