Hi HN – I’m one of the authors of this proposal. I’d like to clarify a few points here:
Wikipedia is not becoming an SPA
Wikipedia is not dropping support for non-js users
This proposal is not about changing our current browser-support matrix[1] (which includes IE 11 as a first-class target; the Vue.js ecosystem still supports IE 11)
This proposal is about changing the way we develop enhanced, JS-only features across our projects; many such features exist already, but they are written in jQuery and an in-house framework called OOUI
These features will continue to be delivered in a progressively-enhanced way on top of the PHP-rendered baseline for the foreseeable future. We are interested in how server-side rendering of JS components can integrate with this, but we’re still looking into how that might work
We will continue to prioritize accessibility, internationalization, and performance in everything we ship
Comment copied from HN:

These features will continue to be delivered in a progressively-enhanced way on top of the PHP-rendered baseline for the foreseeable future.
I think that’s a much bigger problem.
The WikiMedia codebase is a Lovecraftian fever dream of 15 years of accumulated technical debt, written in PHP.
It’s like everything is barely held together with bubblegum that’s now starting to decompose; and they are not able to change anything, because all the hacks, bots and workarounds invented to deal with WikiMedia’s limitations would stop working the minute they did that.
Basically: fixing WikiMedia would mean things would get far worse in the short-/mid-term, and no one is willing to pay the price for long-term improvements as the situation deteriorates further and further.
The MediaWiki History page is a pretty interesting read. It’s worth reading in full, but the tl;dr is that Wikipedia started as a small experiment with a flat file database, became much more popular than expected, and developers spent a few years fire-fighting to keep up with performance/scaling demands.
This quote probably sums it up quite well:
Despite the improvements from the PHP script and database back-end, the combination of increasing traffic, expensive features and limited hardware continued to cause performance issues on Wikipedia. In 2002, Lee Daniel Crocker rewrote the code again, calling the new software “Phase III”. Because the site was experiencing frequent difficulties, Lee thought there “wasn’t much time to sit down and properly architect and develop a solution”, so he “just reorganized the existing architecture for better performance and hacked all the code”. Profiling features were added to track down slow functions.
In early 2003, developers discussed whether they should properly re-engineer and re-architect the software from scratch, before the fire-fighting became unmanageable, or continue to tweak and improve the existing code base. They chose the latter solution, mostly because most developers were sufficiently happy with the code base, and confident enough that further iterative improvements would be enough to keep up with the growth of the site.
I’m not sure if I would have done better in the same situation to be honest.
I’m not sure if I would have done better in the same situation to be honest.
Agreed. As someone who has spent a good chunk of his career rewriting legacy software, I can say the rewrite never goes as smoothly as you think it will. Maybe a rewrite in 2003 would have been great, but it also could have turned out horribly and Wikipedia might have even gone defunct (who knows).
mostly because most developers were sufficiently happy with the code base, and confident enough that further iterative improvements would be enough to keep up with the growth of the site
That sounds like the history of PHP.
I’m not sure if I would have done better in the same situation to be honest.
Given the amount of money the foundation has available, I can’t imagine a worse outcome. The state of Wikimedia makes the BER look like a perfectly managed project.
Can you provide some examples?! You seem to be biased against PHP code in general, and you’re comparing a working Wikipedia with BER, which isn’t finished even today, nor working.
Well, they didn’t have a lot of money in 2002/2003 ($80k in 2003-2004), and it would be challenging to rewrite it all with even thousands of existing pages (never mind hundreds of thousands or millions) since you either need to maintain compatibility with the (organically grown) syntax or 100% reliably transform that to something else.
In hindsight, perhaps the biggest mistake they made was making MediaWiki generic Wiki software, instead of just “the software that powers Wikipedia”. I completely understand why they did that, but it does make certain things a lot harder.
Either way, I don’t envy the position MediaWiki developers are in, and have been in for the last ~20 years.
Do you know this because you’ve actually hacked on the code, or are you guessing? Because this sounds like something anyone could say about any 15-year-old code base, while that code base could actually be in a reasonable state.
Bit late, but to be clear: this is not my comment; I am not the commenter who wrote this on HN. I simply copied it here, because it answered the question.
Core reading and editing functionality should be left alone for now. A good test-case feature would be one that provides an enhancement to functionality that has a more basic, no-JS fallback.
I really hope JS will never become mandatory for reading Wikipedia.
Hopefully it doesn’t become a progressive web app, with all of those CPU- and memory-intensive pages that these UI frameworks create.
… is there a synonym for ‘progressive web app’ that we can use? I would really like to reserve progressive, in a web context, for progressive enhancement, but I lack an alternative term to use for these offline-tolerant web apps.
For the unfamiliar: progressive enhancement is when you design the HTML to work on its own, and then enhance it with CSS and Javascript. The effect is that the page still works without JavaScript (necessary for low-end devices), or without JS and CSS (necessary for screen readers and programs).
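As a concrete sketch of that layering (the element ids and endpoint here are hypothetical, not from any real site): a form that works via a plain POST round-trip, plus a script that, when it happens to run, upgrades the same form to submit asynchronously.

```javascript
// Progressive enhancement sketch: a hypothetical <form id="todo-form">
// works with an ordinary POST round-trip; this script, when it runs,
// upgrades it in place. Without JS, nothing here executes and the
// form still works.

// Pure helper: encode form fields the same way a plain HTML form
// submission would, so the server sees identical data either way.
function encodeForm(fields) {
  return Object.entries(fields)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join('&');
}

// Only attach the enhancement where a DOM actually exists.
if (typeof document !== 'undefined') {
  const form = document.querySelector('#todo-form');
  if (form) {
    form.addEventListener('submit', (event) => {
      event.preventDefault(); // skip the full page reload
      fetch(form.action, {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: encodeForm({ item: form.elements['item'].value }),
      });
    });
  }
}
```

The point of the pattern is that the `fetch` path and the no-JS path carry the same payload to the same URL, so the enhancement can fail or be absent without breaking anything.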
Pretty much all screen readers can deal with CSS and JavaScript, and have for many years. And low-end devices can deal fine with JavaScript as long as it’s reasonable. Actually, a click event to load some data dynamically is less resource-intensive than a full page refresh in most cases.
Thoughtful points, thank you for that.
You’re right, I think, that Javascript is not always the screen reader / accessibility problem it once was; but AFAIK it’s still capable of overwhelming slow devices, especially the copious gunk in aaaadvertisements, which also triggers lots of requests, which doesn’t help. In that way, surfing the web with JS turned off can still be a huge battery saver for many, even though a few well-designed sites use JS to save the user’s battery. (NB: am not an expert on accessibility, and have no experience of depending on it. Take my opinions with grains of salt.)
For the screen readers I wasn’t so much thinking that they can’t handle CSS, but more that the (ordering of the) content should already make sense as it appears in the HTML, without the CSS’s layout.
Click events to load some data dynamically can indeed be an improvement; but you can have that and progressive enhancement, and indeed most commonly find such unobtrusive Javascript on progressive pages.
There is also a category of non-progressive pages that do not work with Javascript turned off; and usually that is because they use JS for a lot more than on-click loading of data.
Sure, excessive or low-performing JavaScript is an issue, but that doesn’t mean all JavaScript is. The thing is that building complex-ish web applications that work 100% without JavaScript quickly becomes very cumbersome, since you’ll be duplicating a lot of code in the backend; code which usually doesn’t get tested that well and may not even work correctly.
For example, the application I’m currently developing worked 100% without JavaScript in the first versions, but I eventually let that go as it just became too much effort with very little return. The JS is about 800 lines of fairly straightforward code, and I think getting an entire application in return for that isn’t too bad, actually. The alternative is a desktop application with tens of thousands of lines of code.
Annnnd it turns out that the ‘progressive’ in PWA stands for ‘progressive enhancement’! (1, 2, 3, search for ‘enhance’). But on the other hand I’m not at all sure how progressive your web app is if it requires at minimum a service worker — that feels like giving developers license to skip the ‘make the essentials work with HTML + forms + a server’ step.
Anyway, I’m getting upvotes but no suggestions, so: please suggest synonyms!
If your app doesn’t work without JS it’s not a PWA.
Single Page App is a term I see people use.
Can you point to sources for that claim? (edit: or just say what to use instead of service workers :) )
says otherwise, there are sources pointing at both Mozilla and Google for that quote.
If you have a Progressive Web Page (in the “progressive enhancement” sense), you can add a service worker (as one more step along the progression) to make it a Progressive Web App. Like any other progressive enhancement, it’s OK for the “app” part to require JS as long as without JS it regresses to an ordinary web page, and not to a pile of useless bytes.
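That “one more step along the progression” can itself be written defensively — a minimal sketch, assuming a hypothetical worker script at `/sw.js`:

```javascript
// Service-worker registration as one more progressive step: clients
// without the API simply keep the ordinary page. The '/sw.js' path
// is a hypothetical example.
function maybeRegisterServiceWorker(nav) {
  if (!nav || !('serviceWorker' in nav)) {
    return false; // no support: the page still works, just not offline
  }
  nav.serviceWorker.register('/sw.js');
  return true;
}

// Guarded so the snippet is harmless outside a browser.
if (typeof navigator !== 'undefined') {
  maybeRegisterServiceWorker(navigator);
}
```

Nothing in the page depends on the registration succeeding; it only unlocks the offline/installable tier where the capability exists.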
I’m not sure I follow.
Only if my client supports JS and has it enabled, which was my argument :)
I think you’re saying that an installed PWA (i.e. if service workers were not a requirement of installing it) doesn’t need JS to function? To me, that sounds like a saved HTML page (possibly with static assets), as in “what you get from ctrl+s in a desktop browser”. I don’t think that fulfills either the W or the A in PWA.
Imagine a hypothetical to-do list service, implemented as a PWA.
If a client has no JS, you can just type to-do items and click Submit to store them on the server, or tick a box and Submit to clear them.
If a client has basic JS, you can create and complete items in real-time and those actions will be sent to the server asynchronously in the background, without reloading the page.
If a client supports service workers and the whole nine yards, then when you’re offline it also caches the changes locally and automatically syncs them back to the server, lets you install it to your home screen as an app, etc. etc.
If a particular client doesn’t support JS, that particular client won’t be able to install the page as an app and run it offline. But that’s OK: if that client can still use the website, and other, more featureful clients can use it as an app, that makes it a Progressive Web App.
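The offline tier in that to-do example boils down to a queue-and-replay pattern. A minimal sketch of that logic (all names hypothetical; in a real PWA the flush would be driven by a service worker’s sync or connectivity events):

```javascript
// Sketch of the offline tier described above: actions are delivered
// immediately while online, queued locally while offline, and
// replayed in order once connectivity returns.
class OfflineQueue {
  constructor(send) {
    this.send = send;   // delivers one action to the server
    this.pending = [];  // actions captured while offline
  }

  // Record an action; deliver immediately when online, queue otherwise.
  dispatch(action, online) {
    if (online) {
      this.send(action);
    } else {
      this.pending.push(action);
    }
  }

  // Called when connectivity returns: replay everything captured
  // while offline, in order, and report how many were synced.
  flush() {
    for (const action of this.pending) this.send(action);
    const count = this.pending.length;
    this.pending = [];
    return count;
  }
}
```

A no-JS client never reaches this tier, but loses nothing essential: the plain form-POST tier covers the same actions synchronously.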
What else would they (wikimedia) do with more money than they know how to spend?
I always thought they should put the money into professionals making textbooks. Then, they sell them plus the course materials to colleges (esp community colleges). They start with general education just to make sure the books cost about nothing to students. They’re online for free, too, either immediately or after a time period. Then, they move into subjects like business, comp sci, etc. Gradually, we get a professional version of Wikipedia for both personal learning and career advancement.
So, something like Wikibooks?
I looked at them a long time ago. I can’t remember if they’re of the quality that business professionals and colleges would buy to replace existing resources. If they are, it’s just a marketing problem. If not, then it wouldn’t be what I was aiming for.
Did Wikimedia fall into a lot more money recently?
Wikipedia gets more and more money every year: https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_Cancer
Invest it and stop begging so much.
This comment seems to indicate they’re serious about having server-side rendering, so it looks like that’s not a concern.
Vue.js is pretty modular and has support for server-side rendering
I agree with this, but I also think javascript can be used to improve the user experience. I would love to see a modern Wikipedia and would imagine the legacy site sticking around as well.
Twitter thread from vuejs addressing assumptions.
This will be a disaster, despite the assurances, and you know it.
It’s going to have a sticky navigation bar.