1. 34
    The NoJS Club (show, web) nojs.club
    Building noJS.club (devops, web) goel.io

  2. 29

    I’m pretty prejudiced against Javascript, but even I think this is going a little far. I want to make sure that my websites all work without Javascript, and I test with Javascript disabled in my browser, but I don’t think it’s wrong to add some features with progressive enhancement. Minimal and optional Javascript is great, Javascript abolition is unnecessary.

    1. 7

      Unless the site’s text changed, there’s nothing there about abolishing Javascript.

      But I think I can make a case for it. If I enable JavaScript, I’m giving you the ability to do almost anything with the computing power of my laptop. You can mine bitcoin, track my cursor, and try to exploit whatever security holes Doubleclick has left (or built) in my browser.

      Maybe you won’t, but I have to trust that you won’t. And I can’t, because we’re strangers. If you were truly trustworthy, would you ask for this kind of access? Especially if your site does nothing that requires it?

      1. 2
        1. 1

          Well, it was only a matter of time before Scott McNealy’s dream emerged, fully formed, from Javascript’s brow, although I’ve seen far more articulate arguments in other places.

        2. 1

          You must have a very difficult time of it in life because interaction with any person or company implies (some degree of) trust. This doesn’t mean you don’t verify occasionally, but you should probably not visit any site you don’t trust. And if you worry about some dev counting mouse move events in your browser, I don’t see how you get anything done…

          1. 9

            You do realize this very site (lobste.rs) is dedicated to linking to random - or at least unknown - websites, right? We visit sites we don’t trust all the time. What you do trust is the execution environment of the web browser (brought to you by Doubleclick) to protect you from the worst abuses.

            The history of web standards is filled with examples of innocuous features that, unexpectedly, turned out to allow people to steal your credentials, harvest your browser history, or track you across many different sites. Browsers consist of hasty patches upon hasty patches to deal with these issues.

            Javascript (and its associated APIs) increases that attack surface a thousandfold.

        3. 2

          Yes! I, for example, use JS on my site to offer a feature that reads the text out loud.

          1. 1

            There are good use cases for JavaScript like maps.

            There are also cases where we could extend HTML but where, for now, JavaScript is fine; I’m thinking of upvote buttons on Lobsters, for example. We did extend HTML for videos, but sites still all use JavaScript on top.

            Still, for reading articles, news, or blogs I don’t see the necessity for JavaScript. Some have nice flourishes like theme switching, but those should degrade gracefully.

            I use the NoScript plugin to keep JavaScript mostly disabled.

          2. 15

            If the site is about no JS, then what’s the point in having a bar that shows the page size? Is the size of the page somehow relevant to sites not running JS?

            I get that this is a direct copy of sites like 1mb, 512kb and 200kb club, but I don’t get the focus on page size for this application. Surely there’s a better measure to put on the page?

            1. 2

              What’s a better measure? I’m open to suggestions.

              1. 7

                NoJS vs JS is a binary distinction; there is nothing to measure other than presence or absence.

                1. 2

                  Sure. That’s the criterion for being added to the list; the page size is an objective measure to stack-rank them. It could have been alphabetical, but size is a bit more substantial.

                  1. 3

                    A minimal or lightweight site could have more data per page in principle, if it has more content. But in practice heavier pages have more cosmetic BS, not more content, which I think is what you were hoping to get at.

                2. 6

                  That’s kinda my point…there’s no way to measure a negative, so it seems counterintuitive to measure something completely unrelated just for the sake of having “a measurement”.

                  1. 1

                    I disagree

                  2. 4

                    How about full site loading time in a browser? Considering that people usually dislike JS because it makes sites feel slow, I think this would be a more useful measure.

                    1. 1

                      Good idea. I’ll investigate how easy it is to run PhantomJS or CDP in GitHub Actions.
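
                      For reference, a minimal sketch of what that could look like with puppeteer driving headless Chrome over the DevTools protocol (PhantomJS is unmaintained these days). The URL and the “no requests in flight” heuristic are illustrative assumptions, not anything the club actually uses:

                      ```ts
                      // Rough sketch: time a full page load in headless Chrome via puppeteer.
                      // "Fully loaded" is approximated here as "no network requests in flight".
                      import puppeteer from "puppeteer";

                      async function measureLoadTime(url: string): Promise<number> {
                        const browser = await puppeteer.launch();
                        const page = await browser.newPage();
                        const start = Date.now();
                        await page.goto(url, { waitUntil: "networkidle0" });
                        const elapsed = Date.now() - start;
                        await browser.close();
                        return elapsed;
                      }

                      measureLoadTime("https://nojs.club").then((ms) => console.log(`${ms} ms`));
                      ```

                      On a stock ubuntu runner this should only need an npm install step first, though I haven’t verified the exact Actions setup.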

                      1. 5

                        I also agree with the other comments here arguing that there are ways to use JS judiciously… however, JS usage overall is definitely ridiculous and out of hand, so I can’t resist pointing out that a particularly ironic metric to look at here would be Time to First Meaningful Paint, which AFAIK basically only exists because of dumb JS tomfoolery “booting” pages and filling in all the content client-side.

                        Maybe you could even have a second version of the page that would add (with red bars for “bad” or something) a few JS heavy sites to demonstrate just how dramatically improved TTFMP is on the no-JS sites?
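
                        If you go down that route, here’s a rough, self-contained sketch of pulling a paint metric the same way; first-meaningful-paint isn’t exposed by the standard Performance API, so first-contentful-paint stands in for it here (the URL is just a placeholder):

                        ```ts
                        // Rough sketch: report first-contentful-paint for a page via puppeteer.
                        import puppeteer from "puppeteer";

                        async function firstContentfulPaint(url: string): Promise<number | undefined> {
                          const browser = await puppeteer.launch();
                          const page = await browser.newPage();
                          await page.goto(url, { waitUntil: "networkidle0" });
                          // Read the browser's own paint-timing entries from the page context.
                          const fcp = await page.evaluate(() =>
                            performance
                              .getEntriesByType("paint")
                              .find((e) => e.name === "first-contentful-paint")?.startTime,
                          );
                          await browser.close();
                          return fcp;
                        }

                        firstContentfulPaint("https://example.com").then((ms) => console.log(`${ms} ms`));
                        ```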

                3. 29

                  Next up, clubs considered harmful

                  1. 10

                    How do I join that club?

                    1. 2

                      clubs considered GOOD, but when you open it up, it’s clubs you hit somebody with.

                    2. 6

                      Is this the new version of a webring?

                      I don’t get the fully anti-JS sentiment. My own personal site has a sprinkling of JS so you can change the theme. It still has a really high score on the GTMetrix scanner they say to use. It also does nothing special other than capturing your t keypress and changing the theme in a cycle. SO DANGEROUS! https://nickjurista.com if you want me to steal your identity with my scary JavaScript.

                      1. 4

                        Sure, today you only capture t; tomorrow you capture all scroll events; then you prevent device-native shortcuts.

                        1. 3

                          This seems to be the slippery slope fallacy.

                          “A slippery slope argument, in logic, critical thinking, political rhetoric, and caselaw, is often viewed as a logical fallacy in which a party asserts that a relatively small first step leads to a chain of related events culminating in some significant effect”

                          1. 2

                            Well, the slippery slope fallacy isn’t a fallacy if it actually is a slippery slope… Question is whether it is here.

                            1. 1

                              I’m not sure I understand haha. This fallacy is one I’ve always had trouble understanding.

                              The way I see it, it’s insufficient to say a happened so b must be the next step. So in this case, b is bad, so we assume b is going to happen and therefore we should stop a.

                              From a logical point of view this is pure speculation? You can certainly speculate based on patterns, but I think it weakens the reasoning of the argument?

                              Let me know what your thoughts are

                          2. 2

                            Hey, stop giving away my identity theft secrets!

                          3. 1

                            Users generally have no way of knowing that capturing t and switching the theme is the only thing your site does until it’s too late. Javascript isn’t vetted by distro maintainers either.

                            Your website could instead use a CSS media query to set the user-preferred theme: @media (prefers-color-scheme: dark) {...}. That way, users automatically get their preferred theme without having to execute untrusted code. They could also use the same browser/OS dark/light toggle that works on every other website instead of learning your site’s specific implementation.

                            I’m generally a big advocate of leaving presentation up to the user agent rather than the author when it’s possible; textual websites are the main example that comes to mind. There’s a previous discussion on an article I wrote on the subject; the comments had a lot of good points supporting and opposing the idea.

                            1. 1

                              I think giving a user presentation options is fine for text, but I really don’t care how someone wants to look at my personal site. The themes I use are not light vs dark; they’re a variety of color schemes.

                              I also don’t care if someone disables JS in their browser. IMO it’s extremist behavior from a very, very small fraction of people. My site works without JS because it has nothing interactive anyway. But many sites I’ve worked on have been entirely JS-based (like live-updating sports and customer dashboards). There isn’t anything inherently wrong with JS.

                              1. 1

                                My site works without JS

                                It’s great that your site works perfectly without JS; thanks for sticking to progressive enhancement!

                                it’s extremist behavior…There isn’t anything inherently wrong with JS.

                                There are many good, non-“extremist” reasons why people don’t run JS:

                                • They use Tor. Running JS on Tor is a bad idea because it opens the floodgates to fingerprinting; frequent users generally set the security slider to “max” and disable all scripting.
                                • They have a high rate of packet loss and didn’t load anything besides HTML. This is common if they’re in a train, on hotel wi-fi, using 3g, switching between networks, etc.
                                • They use a browser that you didn’t test with. Several article-extraction programs and services don’t execute JS, for instance.

                                HTML, CSS, JS, WebSockets, WebGL, the Web Bluetooth API…there are a lot of features that websites/webapps can use. Each feature you add costs you a few edge cases.

                                It’s unrealistic to expect devs like you and me to test their personal sites in Netsurf, Dillo, braille readers, a browser that won’t be invented until the year 2040, e-readers, a Blackberry 10, and every other edge-case under the sun (I try to anyway, but I don’t expect everyone to do the same). But the fewer features a site uses, the more unknown edge-cases will be automatically supported. For example, my site worked on lynx, links, elinks, w3m, Readability, Pocket, and even my own custom hacky website-to-markdown-article script without any work because it just uses simple HTML and (optional) CSS.

                                Not all websites are the same. Customer dashboards probably need to do more things than our blogs. That’s why I like to stick to a rule of thumb: “meet your requirements using the fewest features possible” (i.e., use progressive enhancement). Use JS if it’s the only way to do so.

                                from a very, very small fraction of people.

                                I disagree with the mentality of ignoring small minorities; I try to cater to the largest surface possible without compromising security, and regularly check my access logs for new user-agents that I can test with. Everyone is part of a minority at some point, and spending the extra effort to be inclusive is only going to make the Web a better place.

                                I’d like to add that when making moral arguments, non-adherents tend to feel attacked; please don’t feel like I’m “targeting” you in any way. Your site is great, especially since it works without JS. Don’t let my subjective definition of “perfect” be the enemy of “good”.

                                1. 2

                                  I don’t feel attacked at all. This is no different to me than someone who refuses to use an Android or iOS phone because they are afraid of being tracked. I see it as a lot of tinfoil with very little substance.

                                  I personally do not see coding as a moral or political stance like so many do (especially here on Lobsters). I see it as a means to an end – and in my case it’s that I couldn’t decide on a theme and wanted to put an easter egg on my site.

                                  For professional things, I tend to follow the 80/20 or 90/10 rule when approaching projects, catering to the low-hanging fruit to get the most stuff done. If I focus on all edge cases, I’ll never finish anything, and it’s unreasonable to expect anyone to really do that.

                                  Many sites don’t need to use JS and thus shouldn’t, but I think it’s throwing the baby out with the bathwater when people try to go “No JS” because of some sites doing stupid things to try to track users more or get more data out of them - or just turn their whole static site into a client-side app for no discernible reason. If you want total privacy, throw out your electronic devices altogether, start using cash only for purchases, get off the grid altogether.

                                  JS itself is amazing and has propelled the web to incredible new uses. What I see from a lot of these No JS people is a really small segment of generally power users who either don’t like JS to begin with or are incredibly paranoid about being tracked for whatever reason. The average user, and most users by a large margin, are not concerned with running some arbitrary scripts (and the sandbox keeps getting tighter over time, btw). This club feels like more virtue signaling than anything to me, and I think the No JS argument and “club” is silly altogether.

                                  1. 1

                                    (Preface: nothing I have said so far applies to software that is, by necessity, a web app)

                                    I […] wanted to put an easter egg on my site

                                    Easter eggs are fine! Your site is great. You might want to change the trigger, though; people might expect something else to happen when they press “t”. Technically-inclined users are more likely than the average user to use custom keybinds.

                                    If I focus on all edge cases, I’ll never finish anything and it’s unreasonable to expect anyone to really do that.

                                    I agree that it’s ridiculous to expect people to test every edge case, which is why I advocated for simple sites that use simple technologies. With the “textual websites” I described in my article, you automatically get support for everything from braille readers to HTML-parsing article-extraction programs, without doing any work because you’re just using HTML with progressive, optional CSS/JS. I literally didn’t spend a single moment optimizing my site for w3m, lynx, links, elinks, IE, etc; when I tested my site in them, it just worked.

                                    I think it’s throwing the baby out with the bathwater when people try to go “No JS” because of some sites doing stupid things to try to track users more or get more data out of them.

                                    Nobody knows what lies on the other side of a hyperlink. We don’t know whether a site will do those bad things, so we disable scripts by default and enable them if we can be convinced. “Minimizing tracking and fingerprinting” and “living in a cabin in the woods” are worlds apart. I don’t think it’s healthy to assume that all privacy advocates are anarcho-primitivists.

                                    Disabling scripting for privacy isn’t uncommon; it’s the norm among Tor users. These people aren’t unhinged, as you portrayed them; they’re…normal people who use Tor. Their use cases aren’t invalid.

                                    JS itself is amazing and has propelled the web to incredible new uses

                                    Apps are new. Blogs are not new. We should use the right tool for the right job. The mentality of “progress + innovation at full speed” is great when used in the right places, but I don’t think it belongs everywhere. We should be aware of the consequences of using tools and use them appropriately.

                                    This club feels like more virtue signaling than anything to me, and I think the No JS argument and “club” is silly altogether.

                                    It is virtue signalling. We believe in and follow a virtue, and signal it to others by joining this club. The existence of this “virtue-signalling platform” can help encourage this behavior; I know for a fact that the various “clubs” that cropped up in the past week have encouraged many site authors to optimize their websites so they could be included.

                                    1. 1

                                      “Minimizing tracking and fingerprinting” and “living in a cabin in the woods” are worlds apart. I don’t think it’s healthy to assume that all privacy advocates are anarcho-primitivists.

                                      I never said anything about living in a cabin in the woods or “anarcho-primitivists” - in fact this is the first time I’ve even heard the term.

                                      You can minimize tracking and fingerprinting without disallowing JS altogether or starting a webring for sites without JS. That’s why I said “throwing the baby out with the bathwater.” If you remember, companies used to track with a pixel that folks would throw on their page which would then load from that domain and they would scrape whatever info they wanted on you. So when do we get to join the NoImages.club?

                                      1. 1

                                        If you want total privacy, throw out your electronic devices altogether, start using cash only for purchases, get off the grid altogether.

                                        I never said anything about living in a cabin in the woods or “anarcho-primitivists” - in fact this is the first time I’ve even heard the term.

                                        Sorry, that’s the vibe I got from living “off the grid” without any electricity. Guess I was a bit hyperbolic.

                                        If you remember, companies used to track with a pixel that folks would throw on their page which would then load from that domain and they would scrape whatever info they wanted on you.

                                        That’s a good reason to test your site without images, in case users disable them. More on this below.

                                        There’s a big difference between logging the loading of a tracking pixel and tracking the canvas fingerprint, window size, scrolling behavior, typing rate, mouse movements, rendering quirks, etc. Defending against every fingerprinting mechanism without blocking JS sounds harder than just blocking it by default. There’s a reason why the Tor browser’s secure mode disables scripting (among other things) and why almost every Tor user does this; they’re not all just collectively holding the same misconception. The loading of an image without JS isn’t enough to make you unique, but executing arbitrary scripts certainly is; the equivalent of a “read” receipt isn’t the same as fingerprinting.
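
                                        To make the contrast concrete, here’s a rough illustrative sketch (my own, not taken from any particular site) of the kind of signals a single page script can read without any user interaction:

                                        ```ts
                                        // Rough sketch: fingerprinting-style signals available to any page script.
                                        const canvas = document.createElement("canvas");
                                        const ctx = canvas.getContext("2d");
                                        // Text rendering quirks differ per machine, which is what canvas fingerprinting exploits.
                                        ctx?.fillText("fingerprint", 2, 10);

                                        const signals = {
                                          screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,
                                          timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
                                          language: navigator.language,
                                          cores: navigator.hardwareConcurrency,
                                          canvasTail: canvas.toDataURL().slice(-32), // stand-in for a canvas hash
                                        };
                                        // A tracking pixel reveals "this URL was loaded"; the object above starts
                                        // to single out the machine itself, and a script can send it anywhere.
                                        console.log(signals);
                                        ```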

                                        So when do we get to join the NoImages.club?

                                        Unless there isn’t an alternative, images should be treated like CSS: an optional progressive enhancement that may or may not get loaded. That’s why all images should have alt-text and pages should be tested with and without image loading. Writing good alt-text is important not just for screen/braille-readers, but also for people struggling with packet-loss or using unconventional clients.

                                        IMO, text-oriented websites should only inline images (with alt-text, ofc) if they add to the content, and shouldn’t be used simply for decoration. I wouldn’t create a “no-images.club” because the potential for misuse isn’t nearly on the same level.

                          4. 16

                            Going to register notonaclubsite.club, where you can only get your site added if you don’t already appear on any of these backslapping sites

                            1. 13

                              Doesn’t this lead to Russell’s paradox? ;)

                              1. 7

                                As endorsed by Groucho Marx!

                                1. 4

                                  I wouldn’t go that far, but yeah - some curation standard (“why is this one worth showcasing?”) would help a lot.

                                  Like, I could add my blog, but there’s really nothing interesting about it.

                                  1. 3

                                    Please do, I think that would be pretty hilarious.

                                    I quite like these club sites. Like another commenter said, it’s just some “fun”. I certainly don’t look at them as anything but.

                                    1. 2

                                      Wouldn’t that just be the Alexa top 1000 list?

                                      1. 1

                                        I reject the argument that there is no middle ground between “big bad web” and these circle jerk sites

                                        1. 4

                                          …says somebody in the gopherverse.

                                          1. 2

                                            Purity tests are awesome and the world definitely needs more of them.

                                            1. 1

                                              That’s a false dichotomy for sure, but thanks for your opinion

                                              1. 10

                                                There are people using Gopher having fun. There are people here making goofy little club websites having fun. You’re both having fun–no need to call what they’re doing a circle jerk.

                                            2. 1

                                              That’s not my argument.

                                              1. 1

                                                You replied to my comment about a site whose entries don’t appear on the assortment of existing .club sites by saying that only the Alexa top 1000 would end up on there.

                                        2. 5

                                            Just to give some honest positivity, I really like how the message on this website is not obnoxious, just lightly asserting a few arguments in the “why”/rationale section and leaving the reader free to form their own opinion. So refreshing on an internet where so many pages try to influence my emotions. Thanks!

                                          As to ranking, I understand your idea and appreciate that this is also not really stated as “good vs. bad”.

                                          With that said, some further thoughts/comments/ideas/questions I got after checking the site and letting my mind wander after a few comments below:

                                            • as to the ranking approach: as I said, I totally appreciate and respect your decision. I just got one idea based on what you did that I personally think could be interesting to explore (maybe as an alternative view/subpage?) - even if it’s riskier and possibly easily gameable: I’d be curious to see an attempt at ranking by the ratio {number of bytes after processing the site with a readability-like filter} / {number of bytes the site has unprocessed}. Kind of a “signal to noise” ratio experiment (a rough sketch of what that could look like follows this list). Part of my motivation is that the top pages in the current ranking seem to have little actual content, not giving me much to sink my teeth into…
                                          • I wonder, is the list periodically re-“policed”, to make sure pages don’t stay on it if owners add some JS later? edit: In particular, the btbytes page that I also mention below, seems to now have JS! 😛
                                            • thirdly, the btbytes page seems to give me an HTTPS error in my browser (page being btbytes…, but cert being for www.btbytes…); I understand there’s no explicit requirement for HTTPS working OK to submit, yet… dunno, I wonder if that’s an area where a bit more care could be put somehow?
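
                                            A rough sketch of what that “signal to noise” ratio could look like, assuming @mozilla/readability and jsdom as the readability-like filter; the metric itself is just the idea floated above, not something the site implements:

                                            ```ts
                                            // Rough sketch: {readable text bytes} / {raw page bytes} as a content ratio.
                                            import { JSDOM } from "jsdom";
                                            import { Readability } from "@mozilla/readability";

                                            async function contentRatio(url: string): Promise<number> {
                                              const html = await (await fetch(url)).text();
                                              const dom = new JSDOM(html, { url });
                                              const article = new Readability(dom.window.document).parse();
                                              const readable = Buffer.byteLength(article?.textContent ?? "", "utf8");
                                              const raw = Buffer.byteLength(html, "utf8");
                                              return readable / raw; // higher = more actual content, less markup/noise
                                            }
                                            ```
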
                                          1. 1

                                            I wonder, is the list periodically re-“policed”, to make sure pages don’t stay on it if owners add some JS later?

                                            Ideally, there should be automatic retaliation against sites that use JS, not only automated removal. ;)

                                          2. 4

                                              I also agree with some of the comments here. It’s not about having “No” JS, but about making it work well with decent fallbacks. I try to build web apps that degrade gracefully, so that your primary functionality isn’t lost just because you use elinks or a browser with JS turned off 😀

                                            1. 2

                                                Can’t make the list because I use JavaScript for progressive enhancement. A “NoJS” movement is like the “JavaScript all the things” movement; I disagree with both of them. I agree with using JavaScript for progressive enhancement.