1. 42

  2. 12

    Ironic that submission requires the relatively heavy process of a GitHub fork+pull request.

    1. 8

      The real irony is a page decrying the “cancerous growth” of client-side JavaScript that is itself rendered with (and thus dependent on) client-side JavaScript.

      1. 1

        It is not decrying client-side JavaScript per se. It is about the huge frameworks and dead code that pile up in web pages everywhere to add some convenience for the developers. 250kb.club transfers 7kb of JavaScript, and half of that is the page data itself.

      2. 5

        Yeah, what’s the problem with a mailing list? And why does the page need to be rendered with JS?

        1. 18

          Nobody likes mailing lists and nobody uses email, and it’s a huge, “heavy” imposition on a large number of people, I suspect a large majority.

          I’m on this site, I’m deeply immersed in the type of culture that hates on GitHub and loves email and such, and I still consider dealing with email to be enough friction that I would only contribute if it were for some reason very important to me.

          1. 6

            nobody uses email

            [citation needed]

            1. 4

              Email is like cars: everyone uses it because they have to, but no one outside a certain set of enthusiasts enjoys it.

              1. 1

                But the email non-enjoyers enjoy GitHub issues?

                1. 1

                  It’s not actively objectionable. Email to me feels like it’s not a part of my culture. Emailing strangers feels wrong and is very high friction, mentally.

                  1. 1

                    I have literally no idea what you mean by this string of English words, sorry.

            2. 4

              nobody uses email

              squint

            3. 2

              Because I haven’t had time to figure out Svelte SSR yet. But eventually the page will be completely server-rendered.

              1. 2

                I’ve been working on figuring it out because I have needed to. I really want to hold off on figuring it out any further until after SvelteKit ships. (Sounds like end-of-year, probably?) Sapper feels really fiddly, especially around SSR. The new story sounds at least a little less fiddly, and likely to completely supplant Sapper.

                1. 1

                  Could you share a link or something to this SvelteKit thing you’re talking about? This somehow didn’t cross my feed yet.

                  1. 2

                    This blog post gives a good rundown, and the linked talk lets you get a view of the proposed workflow.

            4. 3

              I see (and share) this concern, and I’ve currently set up the same process on Sourcehut (which is, btw, in the 250kb club).

              1. 3

                Good news: the page is now fully server-side rendered and works without JavaScript. If JavaScript is available, it enhances the page (with sorting of entries, that is); otherwise that functionality is hidden.
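
                For illustration, here’s a minimal sketch of that kind of progressive enhancement. The #pages table id and the data-size attribute are hypothetical names for this sketch, not 250kb.club’s actual markup:

                ```javascript
                // Sketch: the table arrives fully rendered from the server; this
                // script only *adds* sorting when it runs. Without JS, the page
                // stays a static, readable list.
                document.addEventListener('DOMContentLoaded', () => {
                  const table = document.querySelector('#pages'); // hypothetical id
                  if (!table) return;

                  const button = document.createElement('button');
                  button.textContent = 'Sort by size';
                  button.addEventListener('click', () => {
                    const body = table.tBodies[0];
                    [...body.rows]
                      .sort((a, b) => Number(a.dataset.size) - Number(b.dataset.size))
                      .forEach((row) => body.appendChild(row));
                  });
                  table.before(button); // the control only exists when JS is available
                });
                ```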

              2. 11

                I think the “bloated” feeling I get when browsing the web has more to do with ad-related rendering and data collection than with data used inefficiently to render the page content. Therefore, it seems like it’s missing the point to talk about this problem without specifically addressing ad-supported content.

                “Let’s not talk about the elephant in the room, but I would like to bring up the presence of large piles of dung, and the occasional minor earthquake during feeding time”

                1. 1

                  I hear you. And I suggest some really good ad-blocking. Sad enough that we need it not only for privacy but also because ad networks screw up website performance. If you have any more specific idea on how to measure the “ad heaviness” of a website, I’m all ears.
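
                  One crude, automatable signal, sketched for illustration: the share of bytes served by hosts other than the page’s own. The input shape here is made up; real data could come from a HAR file or a headless browser.

                  ```javascript
                  // Sketch of an "ad heaviness" proxy: the fraction of transferred
                  // bytes that comes from third-party hosts. Inputs are hypothetical.
                  function thirdPartyShare(pageHost, responses) {
                    let first = 0;
                    let third = 0;
                    for (const { url, bytes } of responses) {
                      if (new URL(url).host.endsWith(pageHost)) first += bytes;
                      else third += bytes;
                    }
                    return third / (first + third);
                  }

                  // thirdPartyShare('example.com', [
                  //   { url: 'https://example.com/app.js', bytes: 7000 },
                  //   { url: 'https://ads.tracker.net/pixel.js', bytes: 90000 },
                  // ]) → ≈0.93
                  ```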

                  1. 1

                    I think the solution, at least for me, is more on the adaptive side than the activist side — for example, I read most articles in Notion, because it strips the content down automatically to just important links, pictures, and text (while also allowing me to tag and cross reference).

                    I would love for tools like that to be more available, and more powerful. It is a whole other shift in power when my client is not just blocking objectionable content on my behalf, but is only extracting and preserving the content that I care about for my own use.

                    A website can be as heavy as the publisher likes: it will only affect a background program on some server somewhere, and I will only see the text and images that my client/agent/browser has deemed of interest to me. I then have full control over the format and weight of what I see.

                2. 9

                  What about – hear me out – a 128kb club? One may say that 640K ought to be enough for anyone.

                  I do agree that the web needs a diet. I don’t think that making a list of websites helps anyone. Making a copy of a list of websites is even less smart.

                  1. 7

                    Don’t worry, the club to end all clubs is here: https://0kb.club

                    1. 1

                      I really started something here xD I love it!

                    2. 2

                      Why are you so angry? How does making this list hurt anyone?

                      1. 22

                        I don’t think it hurts anyone, but I don’t think it helps either. In short, I think the question is really, what does it do? From my perspective, it does nothing.

                        And saying that it is basically useless does not mean I endorse heavy web pages. I am very much in favour of reducing bloat in sites. What I do find tiresome is the incessant complaining about it accompanied by poorly thought out rebellions, like trying to shame sites with arbitrary metrics or proposing dead-end protocols. Minimalism, in and of itself, should not be the goal. There needs to be a real incentive to making smaller sites that actually matters to those making them. Currently, no such thing exists. None of these proposals solve that or even acknowledge that it is a problem. It’s all predicated on the baseless belief that “smaller is better” will somehow win over hearts and minds. Rest assured that it won’t.

                        1. 3

                          There needs to be a real incentive to making smaller sites that actually matters to those making them

                          I think you misunderstand the motive for making larger sites in the first place.

                          These proposals won’t catch on in the large, because big companies are solely interested in profit. That is the only purpose of a company. They are not interested in technological innovation unless it serves that goal. They are not interested in energy efficiency, or any of the other arguments you can make in favour of reducing web page size, as long as the majority of their market has enough internet bandwidth and enough computing horsepower to cope.

                          Actually, it’s beneficial to keep the status quo, or to build even more onto the Tower of Babel, and not just for developers (despite the fact that alternative software could be written at a lower level of abstraction and do the exact same thing), but for companies and OEMs in particular: it propagates wasteful, costly technologies. It ensures that, for example, telecommunication companies can keep getting grants for otherwise unnecessary infrastructure changes, that OEMs can keep selling laptops with ridiculously high ‘horsepower’ that do nothing different, task-wise, from laptops of a decade ago, and that Google et al. can continue their trend of locking everyone in to Their Software™. It ensures that the wheel keeps spinning, if you will.

                          But the point isn’t that they catch on in the large, and it isn’t to ‘win hearts and minds’ [0]; the point is to pull back to not just a more minimalist web, but also to carve out new spaces as alternatives to a system that has been thoroughly branded, commoditized, and marketed. To create spaces that feel more personal, more familiar, and are hand-crafted rather than Delivered and Targeted For You™.

                          Sure, these new proposals won’t catch on ‘in the large’. That’s not the point here, at all.

                          [0] Whatever that means – seriously, there is no chance, at this time, of a new protocol taking hold regardless of the features it boasts. At least in the wide sense that you seem to mean.

                          1. 2

                            Sure, these new proposals won’t catch on ‘in the large’. That’s not the point here, at all.

                            Then they are doomed to obscurity and the constant complaining about the bloated web will continue.

                            1. 2

                              And once again, that is not the point.

                              The fact that they are creating new communities, new groups of specific styles of ideas, is valuable. Those connections with other people are valuable.

                              And regardless of that, if you are only concerned with the narrative that “we need to build the future and nothing else has any value”, it’s easy to show that, historically, small communities with ‘weird’ ideas have time and time again served as melting pots for the technology of the future. We see this with places like Xerox PARC, CSAIL, some Usenet sub-communities, LambdaMOO, etc.

                              Places like 2f30, suckless, 9front, et al. led, to a certain degree, to the creation of Wayland, for a modern example. JavaScript and Python were influenced by Lisp, despite the fact that Lisp as a whole did not grow into an industry standard. Hell, despite my particular dislike of the borderline abuse that goes on within the Merveilles community, they push out pretty good and interesting technology on the regular – Orca has itself altered how a fair chunk of people create music interactively.

                              It’s not about creating the future directly, it’s about being able to communicate with people more directly and explore ideas that you wouldn’t have otherwise, and enjoying building unique spaces and connections with other people that are more genuine than those an algorithm has selected for you. It’s about creating spaces in which we can think differently, and from within that space, those ideas will in some shape or form, influence the future.

                          2. 1

                            I’ve had similar thoughts but didn’t have the words to express them as well. Lists like these are cool and all but like, so?

                            I’m more interested in things like code golfing. Don’t just have a minimal site, do something fantastical that only requires a small amount of scripting. That is impressive.

                      2. 4

                        I’m at 5kb for my homepage landing page.

                        Articles depend on text length and images but aren’t much bigger.

                        where’s my club at? ;-)

                        1. 2

                          Sadly, I’m over it:

                          • 3kb html
                          • 6kb css
                          • 67kb js for markdown support
                          • 480kb fonts

                          (Though one could argue that everything but the first 3kb will only be downloaded once and then cached.)

                          1. 2

                            I actually got rid of fonts on purpose. Do you convert markdown to HTML on the client? I do it at build time.
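
                            The build step can be a few lines; a sketch assuming the marked npm package (file names are made up):

                            ```javascript
                            // Build step, run once with Node: ship only the resulting HTML,
                            // no markdown parser on the client. File names are hypothetical.
                            import { readFileSync, writeFileSync } from 'node:fs';
                            import { marked } from 'marked';

                            const markdown = readFileSync('posts/hello.md', 'utf8');
                            writeFileSync('dist/hello.html', marked.parse(markdown));
                            ```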

                            1. 1

                              I use Markdeep to convert some ASCII art to images, like here, so sadly my options are limited.

                            2. 2

                              Do you subset those fonts?

                              1. 1

                                I’m using off-the-shelf WOFF2 files; I’m hoping that in the future variable fonts will decrease file size without losing functionality.

                            3. 2

                              seriously. like here’s my latest blog: http://dpldocs.info/this-week-in-d/Blog.Posted_2020_11_02.html

                              37 KB of html. My tags are a bit bloated but still, mostly text and the nav sidebar. 21 KB stylesheet. Definitely bloated but meh. 6 KB javascript. (css and js are a copy of similar files from my tech documentation website.) 3 KB content image. All numbers uncompressed btw. Then… 7 KB favicon… lol, I used the actual .ico format. Might as well switch that to PNG one of these days and slash it back to like 1 KB. But still.

                              So with zero effort spent optimizing, I come in at 74 KB including all images. 250 KB is easy! I just focused on content, so sure, it could be optimized more, but that focus kept me from wasting too much space naturally.

                              My personal homepage is 15 KB including a 10 KB image.

                              1. 2

                                Me too. (Mine is 5.17Kb HTML + 1.67Kb CSS.)

                                I think our club is the “I did it myself” club. It’s really hard to manually write a 100Kb website. It’s also really hard to reuse off-the-shelf JavaScript or styles and stay under 10Kb.

                                Numbers like 250Kb have me reaching for my chest like I’m having a heart attack. 250Kb at 3Kb/sec is 83 seconds. Why would we put that on a pedestal? Anyone who worked on the web in the 90s can do a lot better - and that’s a lot of people.

                                1. 2

                                  The reasoning for the chosen size limit is simple: allow for reasonably complex web pages and web applications (like lobste.rs), not only simple, hand-tailored about-me pages or blogs. For example, this comment thread here is 92kb of compressed data, and I think that is extremely good compared to “the standard”.

                              2. 3

                                This site doesn’t look great on mobile, which is sadly ironic.

                                1. 1

                                  Can you tell me what issues you’re having, or write an issue on GitHub or Sourcehut?

                                  1. 1

                                    Looks fine in Firefox on Android?

                                  2. 3

                                    This is all good for personal home pages, etc. But what about complex pages? Google Docs? Aviasales.com?

                                    1. 5

                                      Those are applications that people run in the browser, rather than websites. A lot of them could also have been more lightweight, though.

                                      A page from a corporate website needs as good a justification to be over 250k as a personal one.

                                      1. 1

                                        The OP said in a comment above that they want to allow reasonably complex web applications. Maybe not Docs, but your regular web app should fit. Which can get tricky if you’re using e.g. Angular or React with some component library.

                                    2. 3

                                      If your page could have been written to provide the exact same functionality using PHP 5, 15 years ago, but without the JavaScript dependence, then just maybe a client-side JavaScript app isn’t necessary, hmm?

                                      This is just using (client-side) JavaScript for the sake of using it; there’s no benefit to this being rendered client-side, and there are several downsides… one of which this site specifically calls out as “cancerous growth”.

                                      1. 1

                                        There is a huge benefit, actually: the data for rendering the page is much smaller than the rendered markup. So I have 6kb of JavaScript that includes the logic, the markup template, and the data itself. That’s efficiency. But as written before, I want to render this server-side eventually. With more and more sites coming into the list, rendering everything at once doesn’t make sense anymore, so pagination will be necessary.

                                        1. 3

                                          Looking at one example: a single entry in JSON is about 100 bytes, while the resulting markup is about 300 bytes. So the data is about 1/3 the bytes of the markup.

                                          But the data embedded in the JS file is only about 9.3K out of 17K, so you’re shipping about as much logic as data right now.

                                          So if you’d just rendered it server-side from the start, without any JS, your transfer size for the page would be only slightly higher, and the page would work without any JavaScript.
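
                                          For a sense of what that could look like, a server-side sketch (entries.json and its fields are hypothetical names):

                                          ```javascript
                                          // Render the list to plain HTML at build (or request) time.
                                          // The markup is ~3x the raw JSON, per the estimate above, but
                                          // the repetition compresses well and the client needs no JS.
                                          import { readFileSync } from 'node:fs';

                                          const entries = JSON.parse(readFileSync('entries.json', 'utf8'));
                                          const rows = entries
                                            .map((e) => `<tr><td>${e.url}</td><td>${e.size} kB</td></tr>`)
                                            .join('');
                                          process.stdout.write(`<table><tbody>${rows}</tbody></table>`);
                                          ```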

                                      2. 2

                                        In case anyone else wondered “why not 256”, it’s likely because https://1mb.club is a megabyte (1000 * 1000 bytes), not a mebibyte (1024 * 1024 bytes).

                                        1. 1

                                          In all honesty, I regret not having chosen 256kb.club instead. It only came to me the day after. But I think it would be stupid to change everything now. The truth is, though: the real rejection threshold inside the application is 256kb.

                                        2. 2

                                          Less bloated than the one I saw on HN recently.

                                          1. 2

                                            I like the general idea (and hate bloated websites), but it might be useful to classify the content more finely than just measuring the raw bytes. It is difficult to design a good methodology, but… if some website contains e.g. some meaningful pictures related to the article, that is wisely used bandwidth/power, even if it is more than 250 kB*. On the other hand, if some website contains just 10 kB of JavaScript that wastes my CPU and RAM, it is much worse than even many more bytes of pictures. Another annoying thing is loading content from many different domains, and tracking…

                                            *) BTW: Do you mean kilobytes or kilobits? I expect kilobytes when we talk about file sizes, but then it should be kB or KiB – not kb, which means kilobits.

                                            1. 1

                                              You’re right. It’s tricky to design a metric that is reasonably automatable. And yes, I mean kilobytes. One quarter mebibyte is the technical limit, although the name suggests one quarter megabyte. But I’m not going to change that, because it’s not about the size limit only.

                                            2. 1

                                              I wonder how they measure? Is it the amount of data up until the DOMContentLoaded event? Is it the total upper bound, counting even the async transfers? (“oops, I clicked the shopping cart icon and it downloaded a massive 3MB file from PayPal”) Is it the code size in a given repository?

                                              It would be good to have a more rigorous set of metrics…

                                              1. 2

                                                The former it is: size up to DOMContentLoaded, no interaction. It is about the initial load time, which makes people wait with nothing to look at, paired with additional metrics (that I still need to figure out in a better way) that show content vs. bloat* size.

                                                (*) Of course, “bloat” is subjective. In this case it is everything except text, document structure, pictures and videos. This approach is not perfect and I’m open to ideas.
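
                                                One way to approximate that measurement, sketched with Puppeteer (an illustration, not necessarily how 250kb.club actually crawls; it counts uncompressed body bytes, so true transfer sizes would be somewhat smaller):

                                                ```javascript
                                                // Sum response bodies received up to DOMContentLoaded,
                                                // with no interaction afterwards.
                                                import puppeteer from 'puppeteer';

                                                const browser = await puppeteer.launch();
                                                const page = await browser.newPage();

                                                let bytes = 0;
                                                page.on('response', async (response) => {
                                                  try {
                                                    bytes += (await response.buffer()).length;
                                                  } catch {
                                                    // redirects and some cached responses have no body
                                                  }
                                                });

                                                await page.goto('https://example.com', { waitUntil: 'domcontentloaded' });
                                                console.log(`${(bytes / 1000).toFixed(1)} kB up to DOMContentLoaded`);
                                                await browser.close();
                                                ```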

                                              2. 1

                                                I just checked a few of the pages on my website; a long blog post is actually 40kb of text, it seems. Yes, I could strip down the 160 kb of CSS, but with everything I looked at under 250kb, I’m pretty happy.

                                                But I don’t think we can make a difference. Not even the most influential of solo website owners or bloggers can. This battle has been lost.

                                                1. 1

                                                  I really like the discussion sparked here. Thanks again to @lucian for posting my page.

                                                  While I enjoy the foundational discussion about the topic, I’d also like to spark a discussion about how to make the page even more useful than just a list of URLs. One thing I started with is the introduction of a second metric after page size: the size-to-content ratio. It is not perfect yet, but it might point in a good direction already. There is also a blog post by Rohan Kumar (and a discussion) where he tries to compile a list of good practices for “simple” websites. I’d love to take this further and make it a list of meaningful, measurable metrics that are then shown on the page. Maybe that can be an inspiration for some website makers.
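
                                                  As a starting point, the size-to-content ratio could be computed from response MIME types; a sketch (the classification follows the “everything except text, document structure, pictures and videos” definition above, and the input shape is made up):

                                                  ```javascript
                                                  // Fraction of transferred bytes that is "content" (text,
                                                  // document structure, pictures, videos) rather than bloat,
                                                  // keyed off MIME types. Crude, but automatable per page.
                                                  const CONTENT = /^(text\/(html|plain)|image\/|video\/)/;

                                                  function contentRatio(responses) {
                                                    let content = 0;
                                                    let total = 0;
                                                    for (const { mimeType, bytes } of responses) {
                                                      total += bytes;
                                                      if (CONTENT.test(mimeType)) content += bytes;
                                                    }
                                                    return total ? content / total : 0;
                                                  }
                                                  ```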