Just drop in a simple <noscript> tag to explain the situation, or politely ask me to enable JS (Trello does this), or even berate me, as some whimsical sites do.
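A minimal sketch of that first option (the wording and structure are just an example, not any particular site's markup):

```html
<noscript>
  <p>This site needs JavaScript for upvoting and commenting.
     Reading works fine without it.</p>
</noscript>
```

Browsers render the contents of `<noscript>` only when scripting is disabled, so JS users never see the message.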
Agreed; in most cases I don’t mind whitelisting your site (provided it’s not something like a blog) as long as you have the decency to apologize about it.
There’s a big middle ground between “my site should work without JS” and “my site has 20 MB of JS of tracking, ads, animations, and flashy things to render static text” that most sites don’t take advantage of.
I don’t see this slowing down, because every year people pick up frontend development and need to validate their own and their team’s skills. Theoretically, if the web moved to “lots of JS only when absolutely needed”, a lot of people would find their jobs redundant (a somewhat interesting but useless thought experiment).
Personally, I realized that I should move to backend development because I actively avoid the kind of apps I was writing as a frontend dev.
It is nearly impossible to criticize the zeitgeist, too.
The cycle is:
Alpha nerds: Software is eating the world! JS is everywhere! Open web!
Devs: Wow I should get on the JS train to remain relevant!
Devs (later): I’m having some trouble, but everyone’s using JS, I guess I need to use more!
It seems lobste.rs also requires JS to participate (upvote and comment).
People don’t get mad at Google Maps, they get mad when some blog or CRUD app requires megabytes of JS (while feeling super slow) with no noticeable benefit.
A high-performance extension in NaCl maybe? Or a cross-platform app given its market share?
Maybe server-side rendering?
But I think it would be much more difficult to maintain.
As a compromise between blocking and allowing all JS, I use uMatrix. It lets me selectively load things (JS, iframes, images, cookies, and more) per site, all while blocking ads. When you stumble across sites that break, you can enable things one at a time until they work. Quite handy. Thanks to @zod000 for pointing it out to me!
That’s what I do with NoScript. One thing that’s both a disadvantage and an advantage is the time it takes on sites with piles of poorly labeled scripts. It’s an advantage in that convenience selects against sites that force me to think too much about their scripts: I’m more likely to just close them in irritation.
… keeps the doctor away?
Holy shitballs folks.
I think this is some good further evidence for anybody who cares (apparently not the web dev community) that things have gotten totally out-of-control.
I dunno. Seems to me like the sites that don’t work without JS are mostly the sites that I don’t care about, because they are the sites focused on advertising and visual gimmickry rather than providing meaningful, informative content.
The bloated and buggy ad-driven web is indeed getting bigger and worse, but it’s not the only web. Maciej Cegłowski has written extensively on this. The Dillo browser project is built around it. Sites like indieweb.org help educate users who see the web as a place to provide meaningful content. The ‘good’ web is growing too, although maybe slower than the ‘bad’ web (for now).
Somewhat ironically, the trend toward JS-heavy SPAs actually makes it easier to remove the crap, because these JS clients are served by JSON-based APIs, against which one can often build unofficial clients.
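To illustrate the idea: an unofficial client mostly amounts to consuming the same JSON the official JS frontend does and skipping everything else. A minimal sketch in Python, using a made-up payload shape (the field names and structure here are hypothetical, not any real site's API):

```python
import json

# Hypothetical JSON response of the kind an SPA's backend might
# return for a list of articles (field names are invented).
payload = """
{
  "articles": [
    {"title": "First post", "url": "https://example.com/1"},
    {"title": "Second post", "url": "https://example.com/2"}
  ]
}
"""

def titles(raw: str) -> list[str]:
    """Extract just the article titles from the API response."""
    return [article["title"] for article in json.loads(raw)["articles"]]

print(titles(payload))
```

In practice you would fetch the payload over HTTP from whatever endpoint the official client calls, but the point stands: once the data is plain JSON, rendering it without the megabytes of bundled JS is trivial.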
This! I use a JS whitelist, and the websites that work amazingly without JS are the ones I’ve whitelisted, because on those sites JS is just used to add some nice bits without piling on tons of ads and bloat.
Well, except for Bloomberg, which loads 3 to 4x faster without JS, has no ads or ad-blocker-blocker and no autoplaying videos, but has a roughly 50px-tall blank header. I will gladly take that tradeoff.
Cares about what? There is no vague unified goal of people who care.
Using uMatrix is, in a way, an attempt to get both performance and a working web experience.