1. 66

  2. 18

    Something I’ve noticed: all these sites are now basically instantly usable. And sure, they’re not pretty, but I’m there to read things. There is a little clunkiness, but I think this trend is very promising.

    Shame they still have the massive blob of comments at the bottom.

    1. 10

      I think that efficient websites don’t have to be ugly or clunky.

      I’d love to get some feedback on my own site, which I believe is “pretty” (at least if you like its style) and loads very fast. (It requests some specific web fonts, but usability is not degraded if users block them/use their own.)

      Any bloat I missed?

      1. 11

        The combination of a thin font and the gray text cancer makes it hard to read.

        Custom fonts are best avoided if your intent is to share information.

        1. 2

          Thanks! I made the table row background brighter and made the links darker.

          Before: https://imgur.com/LCHhqby

          After: https://imgur.com/yM0kGu2

          What do you think?

          Edit: I slightly brightened the background color again; it looked good on my desktop but was almost invisible on my notebook.

          1. -4

            You people need to get your eyes checked. It’s very dark grey on very light grey. Fix your contrast or something.

            1. 16

              That was not enormously constructive and fairly antagonistic. I give you a gratuitous C-, and half a rotten trout.

              1. 3

                The font is too thin for the old (like me). I checked with my old eyes.

            2. 3

              I like that you have a natural categorization system - a table - on your front page. For many, that’s too many links or too much text. But humans are very good at scanning for information. So your design both respects the user’s abilities and provides them with a lot of information at once. Moreover, the table layout keeps the information legible. This goes against simple Apple-style minimalism.

              What I said above comes pretty much directly from Edward Tufte. I went to one of his one-day conferences, where he brought up the website of the National Weather Service. Specifically, he noted that the abundance of links at the bottom, categorized, ordered, and with headings was a great example of a design that provides maximal information. (The part about respecting the user is my point.)

              1. 2

                Thanks for letting me know this, I was never really sure if the grid was a good idea!

              2. 2

                Only complaint would be the GA blob, but ad blockers take care of that handily.

                1. 2

                  Thanks, I removed that. I don’t even remember adding it.

                2. 2

                  I feel similarly about my site, where an article which would be 12k in plaintext is 130k in total (or 95k over the wire because gzip), and that’s with all the features I want; automatic javascript-based code highlighting for most programming languages from highlight.js, tracking/analytics with (self-hosted) matomo, and as nice-looking CSS as I know how to make.

                  All it really takes is to render the HTML on the server (either statically or dynamically) rather than through JavaScript on the client, to not have ads, huge tracking suites, or social-network integrations, and to be careful about putting auto-playing videos or huge images on the page.

                  The main page, however, is bigger than necessary at 1.13MB, precisely because I haven’t been careful about huge images on the page - and those images aren’t even visible. I have to fix that.
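                  To illustrate the point about rendering on the server: a static generator can be a dozen lines. The sketch below is hypothetical (the template, function names, and output directory are all made up), but it shows the whole idea - plain text in, complete HTML out, zero client-side JavaScript:

```python
import html
from pathlib import Path

# Hypothetical minimal static generator: the browser receives finished
# HTML, so nothing has to run on the client to render the page.
PAGE = """<!doctype html>
<meta charset="utf-8">
<title>{title}</title>
<article><h1>{title}</h1>{body}</article>
"""

def render_post(title: str, text: str) -> str:
    # Escape the raw text and turn blank-line-separated chunks into <p> tags.
    paragraphs = "".join(
        f"<p>{html.escape(p.strip())}</p>"
        for p in text.split("\n\n") if p.strip()
    )
    return PAGE.format(title=html.escape(title), body=paragraphs)

def build(posts: dict, outdir: str = "_site") -> None:
    # posts maps slug -> (title, plain text body); one HTML file per post.
    out = Path(outdir)
    out.mkdir(exist_ok=True)
    for slug, (title, text) in posts.items():
        (out / f"{slug}.html").write_text(render_post(title, text))
```

                  Dynamic rendering is the same idea, just with the string built per request instead of ahead of time.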

                  1. 3

                    as nice-looking CSS as I know how to make

                    And then I’ll come and use Reader Mode to get rid of that crap. :)

                    I wish the Web was universally usable without styling. Like paper.

                    1. 4

                      Paper? Doesn’t that medium permanently and physically link the style with the content?

                      … I’ve seen some awful design and layout on paper …

                      1. 1

                          Yes, it seems as though they’ve been hit by a debilitating wave of nostalgia.

                        1. 1

                          True, it’s one argument against publishing in PDF.

                          What I had in mind is that scientific papers, fiction books and such don’t tend to have much variance (partly because the target medium is usually black and white, ink is expensive, and printers don’t enjoy making halftones anyway). So my “without styling, like paper” means looking like your usual printed document. Simple, focusing on information.

                        2. 2

                          That’s okay :) I don’t use reader mode much myself, but I’m a huge proponent of having markup that’s semantic enough to let people consume the web however fits them best.

                          1. 1

                            I wish the Web was universally usable without styling. Like paper.

                            Amen. At least most blogs are, including this one. I read it without CSS.

                          2. 1

                            Interesting content! I particularly enjoyed the article about obscure C features :-)

                        3. 3

                          I also love the trend of debloating websites that serve a smaller cause - but what do you do on a website that needs, or promotes, having the community and/or its readers participate in a discussion, which requires a comment section or something alike? I find this challenging, since many if not all of the sites I came across that tried to minify themselves decided to remove the comment section, as in this case, instead of finding an “alternative”.

                          1. 4

                            Well, the comedy option would be to show a discussion tree and just use an email client to submit to a thread, posting each email appropriately on receipt.

                            Just because the frontend is not bloated doesn’t mean you can’t have stuff going on in the server.
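                            For fun, here’s a toy sketch of that comedy option, assuming mail arrives as raw RFC 822 text and the thread id is encoded in the To: address (all addresses and names here are invented for illustration):

```python
from email import message_from_string

# Toy email-to-comments bridge: replies sent to thread-42@example.org
# become comments on thread "42". No frontend JS involved at all.
comments = {}  # thread id -> list of comment dicts

def receive(raw: str) -> dict:
    msg = message_from_string(raw)
    # Take the thread id from the local part of the To: address.
    thread = msg["To"].split("@")[0].removeprefix("thread-")
    comment = {
        "author": msg["From"],
        "body": msg.get_payload().strip(),
    }
    comments.setdefault(thread, []).append(comment)
    return comment
```

                            A real version would need spam filtering and multipart handling, but the frontend could still be a static page.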

                              1. 2

                                I suspect the trend of removing comments sections comes a lot from people thinking something like “oh while I’m removing all this unnecessary shit I might as well stop bothering with the burden of moderating comments and deleting spam too”.

                                Comments aren’t inherently heavyweight, we had guestbooks back in the 90s. ;)

                                Admittedly, a lot of the obvious off-the-shelf software choices for running a comments section are kinda heavy.

                              2. 1

                                I’d be pretty interested in people trying to build usable sites beyond reading. Though perhaps web apps are doomed never to be fast and reactive, on account of the need to fetch data, it would be cool to see things like Gmail with an actually super-fast interface.

                              3. 5

                                Now that’s a good website to lose some weight. Very nice.

                                I’m happy that the movement of being conservative with bandwidth is gaining traction, there is nothing to lose from it. Even if the websites often look “worse”, I find they’re usually more readable, which is what actually matters.

                                1. 4

                                  I think his new design looks great! I’m also pretty happy that he does not impose a font on me; instead, he just declares the font-family to be Monospace (with a fallback to Courier) and that means that his website is displayed on my end with my monospace font of choice. (At the moment, that happens to be Input Mono.)

                                  1. 4

                                    The homepage requires 23 HTTP requests.

                                    Enable HTTP2 and this is no longer a problem. The page should load much faster.

                                      But yes, news websites are pretty much the worst websites on the web; they are worse than torrent websites now. The only one with a usable website seems to be the Australian ABC, which is government funded. I have a theory that news websites aren’t actually there to be read: they are built around clickbait headlines that get users to click through to the page, read the first line, and then maybe click on an advert by accident. When you actually read the full text from these “news” websites, they usually contain no actual content.

                                    1. 11

                                      Enable HTTP2 and this is no longer a problem.

                                      Unless it’s 23 trackers and taggers from 23 different domains.

                                      I wish people would stop touting HTTP/2 as a cure-all. 23 resources is 23 resources regardless of connection count.

                                      1. 1

                                        The point is that the request count is an incidental metric, and there are multiple ways of tackling the issue.

                                        If you have two images on your page, you can do some sorta sprite-sheet magic, or HTTP/2 might actually solve the issue for you cleanly, at the cost of basically one request.

                                        1. 1

                                          The number of requests is a red herring; it’s the number of resources. If you have 100 resources and use HTTP/2 to pull them all down on one connection, that still means you have 100 resources bloating the page, slowing down the render time, consuming resources, wasting bandwidth, etc.

                                          1. 1

                                            It doesn’t matter if it’s one resource or 100. What matters is if it’s 1KiB or 1MiB, and whether it takes one round trip or ten.
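                                            A back-of-envelope model makes the trade-off concrete. This sketch assumes a single connection and ignores TCP slow start; the RTT and bandwidth figures are illustrative, not measurements:

```python
# Rough page-load model: latency cost (round trips) plus transfer cost (bytes).
def load_time_s(total_bytes: int, round_trips: int,
                rtt_s: float = 0.05, bandwidth_bps: float = 10e6) -> float:
    return round_trips * rtt_s + total_bytes * 8 / bandwidth_bps

# 1 MiB fetched in one round trip vs 23 KiB spread over ten round trips:
one_big = load_time_s(1 * 1024 * 1024, round_trips=1)
many_small = load_time_s(23 * 1024, round_trips=10)
```

                                            With these numbers the byte count dominates for the big page and the round trips dominate for the small one - which is exactly the point: both axes matter, and the raw request count on its own tells you neither.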

                                      2. 3

                                        they are targeted at creating a clickbait headline which gets users to click to the page, they read the first line and then maybe click on an advert by accident.

                                        Take it further to get to the truth: news companies aren’t there to give you news but to influence you for money from their actual, paying customers. They want you to look at the screen or site, see the ads, and/or click on them. They’ll say or do anything they can get away with. What makes people jump to an ad or stare at a screen is usually rhetorical and uncertain rather than hard facts. So, they use bullshit to the max.

                                        George Carlin elaborated on this in his talk Bullshit is Everywhere (NSFW). He goes into a few industries and notes that the media is actually at the intersection of all of them. So, it’s “exploding with bullshit.”

                                      3. 4

                                        Best example I can think of is gwern’s website. Clean, fast, pretty, minimal, content-centered and supports RSS. I also appreciate that it doesn’t follow the “blog”/timeline format of pretty much every website these days: It’s a homepage containing essays. Chronological order is irrelevant.

                                        1. 1
                                        2. 3

                                          Website bloat is not a technical problem, it’s a business problem. If CNN wants to make as much money as possible, they will stuff their website full of ad units and trackers to maximize per-click revenue. If someone just wants to publish their articles to the internet, they have no reason to add this cruft. Sure, there’s an amount of attention to performance that must be paid in either instance. But it’s primarily a business problem. Slack can also write their own native C++/Qt app, but their need for development simplicity/velocity trumps their care for memory usage. I still respect this author’s attempt to reduce the bloat on their blog.

                                          1. 1

                                            One could “literally stuff” a website full of ads and still make it blazingly fast. It would also be easy to circumvent ad-blockers.

                                            Host and serve the ads from the same domain as the content, weaving the ad itself into the content.

                                            No JS, just text and images. The 90% motivator for me to install an ad blocker is performance; next is obtrusiveness. And I hate ads in general, but performance and annoyance are much worse. On a technical site, I like seeing the ads (electronics, mechanical eng, process eng, etc.).

                                            1. 2

                                              In today’s web advertising landscape, ads without tracking JS are not worth much. From the buyer’s point of view, it’s hard enough to trust click and impression numbers calculated from tracker scripts. (There’s lots of fraud.) Without anything like that, they’d just have to take the publisher’s word for it, trusting that they weeded out the bad clicks.

                                                I do agree that it’s possible to do advertising tastefully. One example is Daring Fireball, which has one discreet ad in the bottom left. It works for him because he has a dedicated, focused, high-value audience. It probably won’t work for mass-market news sites because of the tracking/attribution problem.

                                              1. 1

                                                  I think it could work for ad campaigns like BMW, Nike, Coke, etc., where the brand is being pushed and it isn’t a niche thing; it has mass-market, broad consumer appeal. That’s the way advertising used to work before this slow slide into a total surveillance state.

                                                  I think there is a market for an ad network that supplies targeting selectors and raw ad assets to the content providers directly, with the content providers supplying log samples showing impressions; click-throughs would be obvious.

                                                  But I agree it is counter to everything we have now, and what we have now is an obtrusive, broken mess. Allowing advertisers to inject arbitrary JS into the user’s browser crosses the line.

                                                1. 1

                                                  Fraudsters work around it anyway with click farms and proxies.

                                                  1. 1

                                                    Yes, their filters/analysis isn’t perfect. But ad buyers feel like at least they are the ones to make the judgments. To be fair, most ad buyers also hire middlemen to place bids. There’s many levels of uncertainty here.

                                            2. 2

                                              There’s also no reason the site needs to be boring, colors and appropriate styling aren’t what is slowing down pages. You could also offer an RSS feed which would be even faster.

                                              1. 2

                                                WRT news sites, I’ve gotta wonder if there’s a breaking point somewhere. Trying to surf anything like that without AdBlock feels like madness, with massive ads and auto-playing video popping up everywhere while you try to read 3 paragraphs of text. We’ve got to come up with something better eventually, right?

                                                1. 2

                                                  Seems like there’s a blog post on lobste.rs every six months about this from someone “discovering” the problem.

                                                    1. 1

                                                      I really don’t want this to be a plug/promotion, so apologies in advance if it comes across as one… but this is exactly why I stopped working for companies that build bloated websites (web agencies, Time, BBC) and took a job at SpeedCurve. We’re trying to build a tool that really demonstrates the impact that performance has on the people who use the web every day, and that helps developers make the changes required to deliver their pages quickly.

                                                      It’s frustrating that the status quo for a web page these days seems to be 5MB of JavaScript that manages to spin the fans on even my i7 XPS 13. How are people in the developing world supposed to browse the web on their $150 Android phones? I wish things were looking up but they’re getting worse every year and we’re well beyond the point of hardware being able to keep up.

                                                      I don’t really have a point to all of this rambling. I’m just so upset that the web has turned into this absolute shit show.

                                                      1. 0

                                                        This is the problem AMP is trying to solve *ducks*

                                                        1. 3

                                                          You could simply remove all the trackers and bloat and have a fast website, or you could remove all the trackers and bloat, then set up AMP, and have a fast website with Google tracking. AMP is more work than just solving the problem.

                                                          1. 1

                                                            Yes, definitely, agreed: if you stripped everything out of the website, it’d be faster than with AMP. What I was getting at is more about solving the issue that average front-end developers have little incentive to combat bloat. How do you enforce that the CNN devs won’t add 15MB of JS to the page? AMP is an attempt at doing that.

                                                            (Also I didn’t mean to start a discussion about AMP, though I probably was asking for it with that comment)

                                                            Also to clarify one thing: AMP is just a vanilla JS library, there’s no Google tracking in it.

                                                            1. 1

                                                              By lowering their ranking in the search results

                                                          2. 1

                                                            http://idlewords.com/amp_static.html

                                                            But you are 100% right. ;) Regardless of how well it goes, this is the problem AMP is trying to solve.