1. 52
  1. 19

    I’m glad to see someone trying to correct the record (since these web3 people are really misrepresenting things) but Web 2.0 was very simply xhr and all the dynamic applications that could be built on that primitive. It was the shift from documents and forms (Mapquest) to applications (Google Maps).

    Web 1.0: go to Yahoo! Finance and see the latest stock prices. At best, someone would stick it in an iframe with an autorefresh meta tag. The browser chrome would show loading UI every 5 seconds.

    Web 2.0: go to a web application and see live updating stock prices. Scroll around and interact with the page. No full page reloads that reset things like scroll state and no browser loading indicators.

    I’m not sure why this era is so misunderstood.

    Nothing killed it, either. Most websites are still all Web 2.0 technology.
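    That xhr primitive is small enough to sketch. This is a hedged illustration rather than anyone’s actual code - the `/quotes` endpoint, the JSON shape, and the `ticker` element are all made up - but the pattern (poll with XMLHttpRequest, patch the page in place) is exactly the shift described above.

    ```javascript
    // Web 1.0: reload the whole page every 5 seconds, losing scroll state:
    //   <meta http-equiv="refresh" content="5">
    //
    // Web 2.0: poll a (hypothetical) /quotes endpoint and update in place.

    // Pure formatting step, kept separate from the network code.
    function renderQuotes(quotes) {
      return quotes
        .map(function (q) { return q.symbol + ": " + q.price.toFixed(2); })
        .join("\n");
    }

    function pollQuotes() {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/quotes"); // assumed endpoint
      xhr.onload = function () {
        var quotes = JSON.parse(xhr.responseText);
        // No full-page reload: scroll position and the rest of the page survive.
        document.getElementById("ticker").textContent = renderQuotes(quotes);
      };
      xhr.send();
    }

    // In a browser you would kick this off with: setInterval(pollQuotes, 5000);
    ```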

    1. 5

      I don’t disagree with your characterisation but it’s sad that the Google push to make ad blocking hard overruled the W3C push to make the web useful. Tim Berners-Lee and the W3C had a very different view of Web 2.0.

      Web 1.0: you go to Yahoo! Finance and get back an HTML document containing the latest stock prices, with syntactic and semantic markup interleaved.

      Semantic Web 2.0: you go to Yahoo! Finance and talk to a web service that returns a structured XML representation of the stock prices and other information, to which an XSLT stylesheet is applied to produce XHTML tables, SVG graphs, and so on. You can also issue an HTTP request to the same endpoint and get just the XML with the stock prices, to display in another format, to integrate with other tooling, and so on.

      There is one big problem with this vision (if your business is ads): adverts, editorial, and so on are clearly distinct from the data, in a machine-readable way. A web browser just needs to filter out the extraneous content, and a non-browser UA doesn’t even fetch it; it talks to the web service directly and renders the data in a completely different way.
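      To make the one-endpoint idea concrete, here is a rough sketch. Everything in it is hypothetical - the XML shape is invented and the string transform merely stands in for a real XSLT stylesheet - but it shows how a single resource could serve both browsers and other tooling:

      ```javascript
      // Hypothetical XML representation of the stock data, as the web
      // service would return it.
      var quotesXml =
        '<quotes>' +
        '<quote symbol="YHOO" price="33.10"/>' +
        '<quote symbol="AAPL" price="12.50"/>' +
        '</quotes>';

      // Stand-in for an XSLT transform: turn the XML into an XHTML table.
      // (A real deployment would ship a stylesheet and let the UA apply it.)
      function toXhtmlTable(xml) {
        var rows = [];
        var re = /<quote symbol="([^"]+)" price="([^"]+)"\/>/g;
        var m;
        while ((m = re.exec(xml)) !== null) {
          rows.push('<tr><td>' + m[1] + '</td><td>' + m[2] + '</td></tr>');
        }
        return '<table>' + rows.join('') + '</table>';
      }

      // Content negotiation: a client asking for XML gets the raw data;
      // a browser gets the rendered view - same URL either way.
      function representation(acceptHeader) {
        if (acceptHeader.indexOf('application/xml') !== -1) {
          return quotesXml; // machine-readable, advert-free
        }
        return toXhtmlTable(quotesXml); // human-readable rendering
      }
      ```

      A non-browser UA only ever sees the first branch, which is the whole point: the extraneous content never reaches it.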

      1. 14

        Google killed the semantic web by making really good search, not by exerting pressure on the W3C. The semantic doctypes just had no adoption because the effort-to-reward ratio just wasn’t there. Microformats were a more pragmatic approach but also just didn’t get adoption.

        HTML5 was a wild success because it completely abandoned all strictness or semantic underpinnings and instead just offered new functionality.

        1. 5

          I’m half joking here, but: maybe the Semantic Web is due for a comeback, since Google search sucks now that SEO and content farms sabotaged it.

        2. 6

          Tim & co’s view of the web is a closed platform where no open source browser is possible thanks to DRM. Don’t pretend their ideals are any better than anyone else’s.

          1. 3

            what are you on about?

            1. 1

              What do you mean? I’m just pointing out that the W3C and Tim’s view of what the web should be is probably not what we should hold up, given their endorsement of DRM, which is literally the antithesis of any kind of open and interoperable web.

              1. 3

                it was a non-sequitur though. the above comment was about the semantic web… which is all open standards and readily implemented by “open source browsers”.

                1. 1

                  The comment was juxtaposing W3C and Berners-Lee’s vision for the web with Google’s vision for the web, strongly implying that W3C’s is better.

                  The technical details around the semantic web are fine, that’s not what I’m commenting about. Maybe the semantic web would have been a better web. But the W3C’s semantic web probably wouldn’t have been, since the W3C is committed to fighting against openness and interoperability.

                  1. 4

                    Two things can hold true at once. They supported DRM, and they also support open standards.

                    1. 4

                      Those things are mutually exclusive, sorry.

                      1. 2

                        You must be confusing the map with the territory.

                        1. 1

                          You have to state your argument more clearly. I don’t know what the map or the territory is in your analogy, or how I am confusing them.

                            1. 2

                              I haven’t tried to discredit the concept of the semantic web. I have tried to discredit the W3C and the idea that they’re trying to make the web useful.

                              1. 1

                                they’re trying to make the web useful.

                                Well, they are - if you’re in the business of paid streaming of encrypted media, or selling advertising. Otherwise? Not so much.

                      2. 2

                        They supported DRM, and they also support open standards.

                        That’s a surprisingly fine hair to split.

                        EME - the extensions to the browser that facilitate the use of CDMs (Content Decryption Modules) to decrypt DRM-protected content - are indeed an open standard.

                        However the CDMs themselves are not (for obvious reasons), and most providers have been unwilling to release binary blobs for non-mainstream OSs and browsers. I’ve explicitly asked the Widevine team whether they’d release their CDM for FreeBSD and been told “lolnope”.

                        First, Google — whose proprietary technology must be licensed in most cases if you want to make a new browser — stopped permitting open source browsers to use its DRM technology, effectively requiring all new browsers to be proprietary.

                        Now, Microsoft and Apple — the remaining two vendors who can also supply the proprietary components that Google won’t license — have effectively stopped answering the phone when small browser creators call. Microsoft might let you license its tools if you pay them $10,000 to submit an application and then $0.35 for every copy of your browser you ship.

                        I encourage people wanting to invest time and effort into open source software (Free / Libre / BSD / whatever floats your particular boat) to look at Gemini and related technologies instead of the Web.

                        I’d long worried about Microsoft pulling an embrace, extend, extinguish on the Web but instead it was Google who had the ability to pull it off.

                        1. 3

                          Yes, but the point being that HTML5 and the Semantic Web are all built on open standards. The fact that some corruption led to DRM getting pushed through the committee does not invalidate all the other open standards. Namely, DRM does not have anything to do with the Semantic Web, and to discredit it by association is a pretty empty argument, imo.

                          1. 1

                            Again, though, that’s not a helpful hair to split.

                            Sure, the existence of EME doesn’t invalidate the other open standards. But it’s now impossible to build a fully functional open source Web browser, because:

                            1. Google won’t let you use their CDMs in your browser.
                            2. Google won’t let you log in from your browser.

                            The Web has been entirely co-opted by Google for the purpose of delivering ads and encrypted streaming media.

                            The situation with the Web now is reminiscent of Microsoft’s dominance of desktop computing back in the 90s, even down to Microsoft funding Apple to preserve a semblance of visible competition - only now, it’s Google funding Mozilla.

                            1. 2

                              Another non-sequitur… one which I agree with, but it does not mean that the W3C wants a completely closed platform, which is what @mort asserted above[0].

                              [0] https://lobste.rs/s/5mshh5/don_t_lie_me_about_web_2_0#c_jnh9sd

                              1. 1

                                What do you mean by “completely closed platform”? It feels like a “no true Scotsman” argument, to be perfectly frank.

                                I’d argue that the Web is a de-facto closed platform as only two billion-dollar corporations are capable of producing a Web browser (plus one charity largely funded by one of those corporations) in 2022.

                                In particular, open source browsers are being actively excluded from the Web by several of the larger players - especially one ad-tech company that just so happens to produce the most popular browser as well.

          2. 3

            Yes, this. Rich Internet Applications - Ajax (built on XMLHttpRequest, which Microsoft invented for the Outlook web client) and the early JS libs - script.aculo.us, Prototype, then jQuery - and the backend APIs.

            The static documents and limited HTML forms gave way to interactive “real time” applications.

          3. 8

            This is spot-on, and I was super into the Web 2.0 hype cycle and kinda crushed to see it steamrollered by Twitter/Facebook/YouTube. I just gotta rant about this one bit:

            Web 2.0 lost to siloed social media because: […]

            • Apple made a fateful decision that mobile-phone internet should be app-centric, not browser/website centric. Then Android copied their mistake.

            Apple initially pitched web-apps as the only way to create 3rd party apps for iPhone (go watch the keynotes of WWDC 2007.) Almost everyone hated this idea. Then Apple announced the iOS SDK in early 2008 and there was much rejoicing.

            Web apps are far from ideal for small mobile devices, and this was especially true with 2007-era CPUs and cell technology (3G was cutting edge and not supported on iPhone 1.) They’re slower and use more memory, and don’t cache well or work offline unless you do a bunch of extra work … even then your offline data is limited in size and the browser treats it like a cookie and will delete it at its whim even if it contains un-pushed user content.

            To this day mobile devs still have the option to create web apps, which can be saved as an icon on the Home Screen just like a native app; they just don’t, because native apps work better. Or else they use web tech packaged up as an app, like PhoneGap/Cordova, which fixes the web’s huge offline problem.

            Anyway, let me argue that native apps are in some ways more distributed than websites, because you don’t need your own server to host one, and there aren’t any scalability problems with running the app itself. That’s a huge deal for apps that don’t rely on shared user-created content, like games and reference works. Even for user-content-based apps, the server only needs to handle the content, not serving the app itself and its assets.

            1. 7

              This is an aside, but:

              Apple made a fateful decision that mobile-phone internet should be app-centric, not browser/website centric. Then Android copied their mistake.

              The irony is that this too is a rewriting of history, one which most people seem to have forgotten. When the iPhone was revealed, there was no app store, and Steve Jobs explicitly said they believed that the web was going to be the platform of the future for mobile.

              What happened was very simple: the web couldn’t hold a candle to native back then. The iPhone browser was only usable because of numerous WebKit-only extensions, which website builders duly incorporated when they started making mobile-compatible and then mobile-first websites.

              Naturally there was also a huge incentive for mobile apps in the form of paid app stores. But it’s crucial to remember that this came after the initial release, and that it was highly welcomed. Even the mobile web-app-shells like Cordova that emerged to bridge the gap were poor knock-offs compared to native, and never felt right.

              The death of Flash is a similar tale. Android actually had a fully functional, working version of mobile Flash at one point. It was terrible, because unlike a web page, a Flash app was an arbitrary canvas and it was pretty much impossible to substitute e.g. text fields and dropdowns with touch-friendly alternatives, or make buttons tolerant to fat-finger presses, or add sensible touch scrolling. The iPhone 2G needed a first-class YouTube app because YouTube used Flash to play video at the time, and trying to use that player on a Nexus One was completely ridiculous.

              It is very clear to me that Apple had learned a lot of lessons about touch phones before they revealed them to the public, and Android, in typical Google arrogance, took until version 3 / Honeycomb before they had even started to catch up, and version 4 to become a serious competitor.

              1. 5

                I was using the internet in the early-to-mid-2000s, before centralized social media got big, and I don’t remember anyone using the term “Web 2.0” to talk about phpBB forums and blogs with comments; I only remember people using it to talk about Facebook and Twitter and YouTube. I actually remember rather distinctly one time in college (2009 or so) when I was talking to a guy who was trying to hire a “Web 2.0 manager” for a student group. I thought he wanted a web designer to make a website for the group, which was a skillset I had. But in the process of talking to him about the job, I learned that what he had in mind was basically a social media manager - someone who would write posts and respond to comments on nascent Twitter/Facebook/YouTube - which wasn’t a thing I wanted to do, and I didn’t end up working with that student group.

                But that’s just a debate over the boundaries of nomenclature. The eras of people interacting with the web this guy recognizes are real, regardless of which one you assign the label “Web 2.0” to.

                I’m way less hostile to cryptocurrencies than this guy is (quite the opposite, in fact), but I do think that a lot of the currently-media-hyped web3 technologies basically do have the problems outlined in this post, won’t actually make web-based social media meaningfully decentralized, and aren’t trying to do that or for the most part claiming to do that. People who talk about implementing Minecraft and Fortnite items as NFTs aren’t idealistic cypherpunks - if they were, they would be talking about how to make it possible to play these games in ways that Microsoft and Epic disapprove of - they’re entrepreneurs who are amoral on the topic of decentralization.

                There are ways in which smart contracts on Ethereum and similar blockchains can contribute to meaningful decentralization, but they’re not generally as mass-marketable as Fortnite NFTs, so they get talked about less in the media and contribute less to mindshare when people imagine what things constitute “web3”.

                1. 18

                  To me, Flickr is the quintessential “Web 2.0” example site: user-generated and user-organized content, RSS for easy integration back into your own site or for people to just subscribe for updates. And various factors led to Flickr absolutely getting its lunch eaten by Instagram, a service so hostile to the Web that I struggle to think of appropriately hyperbolic terms to describe it (see, for example, “link in bio”).

                  “Web3” appears to consist of a combination of “everything everywhere must be financialized in ways users can never ever escape” and codified dual-tier society (if you’re wealthy and/or well-connected and something goes wrong, the system will bail you out or reverse the bad thing for you; if you’re not wealthy and/or well-connected, “code is law” and whatever you lost is irreversibly lost).

                  1. 3

                    “Web3” appears to consist of a combination of “everything everywhere must be financialized in ways users can never ever escape” and codified dual-tier society

                    I think this describes the current state of Web 2.0, no?

                    Everyone screaming “if you’re not paying then you’re the product” for decades as a bizarre rallying cry, while even paying users are squeezed in every way possible–why would a for-profit company leave money on the table just because you’ve already given them some?

                    Twitter Checkmark users getting privileges to only see tweets from other people blessed with a checkmark, literally creating a dual-tier perception of reality.

                    Meanwhile all of your data is owned and resold by the intermediaries, the rules are changed as they see fit, open APIs are closed off, free services are slowly migrated to paid after competitors die out.

                    1. 3

                      I don’t really think of Twitter, which was originally an SMS service, as “Web 2.0”; I think of it as the sort of thing that arose as “Web 2.0” was being killed off.

                      Again, to me “Web 2.0” is mostly about user-generated content and organization, and offering the ability to integrate with other things and produce mashups via APIs, RSS, etc. In other words, things that were truly of the Web. It was a very brief period and then it was over. So whenever you start on a “well what about Twitter/Facebook/Instagram/etc.” tangent, just know that I describe those as the things that came after and largely killed “Web 2.0”.

                      Meanwhile “Web3” financialization is so extreme that it often feels like they’d charge me for breathing if they could (and of course would do so in a unique-per-service token that has to be bought up-front), and ultimately everything ends up centralized onto a handful of big exchanges and intermediaries. Which probably doesn’t matter because “DeFi” seems to consist entirely of a bunch of entities all loaning tokens to each other and using those loans as “backing” to justify minting more tokens that they loan to each other to claim as “backing” to justify minting more tokens… in an inflationary cycle that inevitably pops when one of them crashes. As has been going on for a little while now.

                      1. 1

                        I don’t have much disagreement about the idealized version of “Web 2.0” as it was once upon a time in our minds (I was there too, I remember being very optimistic! I built many API mashups that were inevitably killed), but clearly that’s not what it is today, right? Or do we need a different label just so we can talk about the present state of things without tarnishing our nostalgia for Web 2.0?

                        It’s challenging talking to different groups about what Web2.0/Web3 mean to them, some people take the position that “Web 2.0 is just the good parts, and everything else just doesn’t fit under that label” while also taking the position of “Web3 is just the bad parts”, or the reverse.

                        I also feel that a lot of the conflict is that Thought Leaders (like cdixon) writing up a thesis on what Web3 is comes off as “we just stole all the best original ideals of The Internet/Web 2.0/etc and rebranded as Web3” – which is not entirely untrue.

                        1. 1

                          I have yet to see any real use case for “web3” other than “It’s got what VCs want! It’s got monetization!”

                          “Play to earn” turns out to be all the downsides of old-school gold farming and none of the upside of having an interesting game associated with it.

                          “DeFi” is just a game of A lending to B lending to C lending to A in the inflationary loop I already described.

                          And the whole thing keeps centralizing onto a handful of major players anyway.

                          Anyway, if you want to continue having a discussion about “web3” I’ll begin charging you per character to post comments to me.

                  2. 2

                    I don’t remember anyone using the term “Web 2.0” to talk about phpBB forums and blogs with comments

                    Not those specifically, because those were using Web 1 tech (form POSTs and page reloading.) Web 2.0 was all about using JS, “dynamic HTML” (DOM manipulation) and XMLHttpRequest. Discourse, for example, is an über-Web2 app even though it came about after the hype cycle.

                    The Web2 hype was not universal so I’m not surprised everyone didn’t see it the same way. Its epicenter was the ‘blogosphere’ [sorry] and the WIRED / O’Reilly / etc media scene.

                    1. 1

                      We all experienced the logos and the missing vowels, though.

                    2. 1

                      People who talk about implementing Minecraft and Fortnite items as NFTs aren’t idealistic cypherpunks - if they were, they would be talking about how to make it possible to play these games in ways that Microsoft and Epic disapprove of - they’re entrepreneurs who are amoral on the topic of decentralization.

                      I was ideating with my buddies the other day about how sweet it would be if you could sell your DLC when you were done. Obviously I doubt those NFT crypto-bros would ever dare…

                      It sucks how not only is Blockchain super lame/ugly, it’s also bereft of imagination or courage.

                    3. 2

                      Social media was an incentive to start killing Web 2.0, but it wasn’t the only thing that killed it. Interactive, high-quality media delivered by Flash and similar methods were a contributor. When the only thing you have to observe is someone else’s content, it has to be either entertaining or socially relevant. Social media nailed the latter, it didn’t nail the former (though Faceboo-cough sorry, Meta is certainly trying).

                      Newgrounds and similar sites had an explosion that lasted well beyond Web 2.0’s “expiry” date because there wasn’t a concerted effort to make something that was just socially rewarding: anybody can create things, and the simplest thing you could create was an animation, which was attractive to anybody that opened a paint program for fun.

                      The browser’s default state is a document viewer. That’s what it does. It does it really well. It’s also far from being what anybody wants out of “the web” as a place to “be” and create on. It’s no surprise why Flash took off.

                      If a hypothetical “Web 3.0” ever exists, it’s not going to take the form of a document viewer that we need to continue bolting interactive capabilities on to. It’s going to be a collaborative social free-form programming platform with incredibly easy entry and decentralized hosting over a shared compute pool.

                      …Not to give anybody business ideas… cough

                      1. 1

                        (I kinda think this article is off-topic for lobste.rs but whatever)

                        I think this article is right in identifying web 2.0 stuff. I think this article is incorrect in saying that it was killed.

                        • blogs with comment sections… still out there! Lots of blogs have relocated their comment sections to HN/etc tho
                        • wikis…. every video game and their dog has one of these
                        • forums basically still exist, except it’s mostly reddit. But there’s such heterogeneity between subreddits that you get what you want
                        • tagging systems… still exist
                        • Rails… still exists
                        • AJAX-y stuff… still exists
                        • RSS still exists!

                        RSS is a particularly funny one cuz basically everyone whining about RSS being killed is just complaining about Google Reader. The Old Reader is more or less a perfect clone of that, if that is what you missed. Every site I followed in the past still offers RSS feeds.

                        I do think that forums are tough, cuz you have reddit and friends, but a lot of stuff is moving over to places like discord, which is a fundamentally different model. If there’s one that’s “in danger”, I think that one is. But stuff exists and is out there and is still perfectly usable. Sorry there’s no RSS feed for tweets.
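                        The RSS format itself is also tiny, which is part of why it survived. A toy sketch - the feed contents here are made up, and the regex merely stands in for the real XML parsing a reader like The Old Reader does:

                        ```javascript
                        // A minimal RSS 2.0 document; the format is unchanged
                        // since the Web 2.0 era, so old feeds still work.
                        var feed =
                          '<rss version="2.0"><channel>' +
                          '<title>Example Blog</title>' +
                          '<item><title>First post</title>' +
                          '<link>https://example.com/1</link></item>' +
                          '<item><title>Second post</title>' +
                          '<link>https://example.com/2</link></item>' +
                          '</channel></rss>';

                        // Toy extraction of item titles (a real reader would
                        // use a proper XML parser, not a regex).
                        function itemTitles(rss) {
                          var titles = [];
                          var re = /<item>.*?<title>([^<]+)<\/title>/g;
                          var m;
                          while ((m = re.exec(rss)) !== null) {
                            titles.push(m[1]);
                          }
                          return titles;
                        }
                        ```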