I’m totally on board with the idea that the web includes an endless supply of annoying websites, but I would like to point out that the second website pictured here has a background (and font) that I don’t find pleasant to read, which is itself annoying.
Edited to add: oh, and Paul Graham’s website, featured on the list linked from this post, doesn’t appear to use HTTPS. That annoys me, at least, because it makes HTTPS Everywhere pop up, and because my understanding is that using HTTPS everywhere is considered good for security.
6 of the submissions are HTTP-only. They’re scraped from HN, and I’ve confirmed that the Rachel By The Bay submission was submitted over HTTP but also works over HTTPS.
Graham still seems to be hosting his site on Yahoo’s servers; at least, if you replace http with https you get a scary warning about a certificate mismatch.
Is a “nice site” a euphemism for “sites whose authors don’t understand even the slightest bit about design principles”?
I generally find the lack of any styling to be a readability nightmare, and the screenshots shown here trigger exactly that. I’m all for fast-loading, text-only designs. I’m not OK with ignoring hundreds of years of experience in laying out text readably.
I dislike bad typography too, but I’d take a site with awful design and valuable content over a beautifully designed, readable site full of throwaway content-marketing copy. I can always turn on the browser’s reader mode. There’s no such fix for quality.
Reader mode traditionally existed mostly to provide a distraction-free, ad-free experience. Website authors should take note. :)
Zero CSS looked fine back when all screens were roughly 15 inches diagonal; nowadays it’s readability hell across devices.
The vast majority of these sites could be fixed with two CSS rules: `max-width: 40em`, plus `margin: auto` to center the content. That’s it; that’s all you need to jump on the brutalist trend while keeping things readable.
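As a minimal sketch, those two rules applied to the page body (the `40em` value is the one mentioned above; any similar measure works):

```css
/* Constrain line length and center the text column — roughly
   the entire stylesheet a bare-bones text site needs. */
body {
  max-width: 40em; /* keep lines short enough to scan comfortably */
  margin: auto;    /* equal left/right margins center the column */
}
```

Because `em` units scale with the font size, the measure stays comfortable whether the reader bumps the text size up or down.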
related reading: https://journals.uc.edu/index.php/vl/article/view/5765/4629 (which, funnily enough, seems to have either been scanned in wrong or the author is just having some ironic fun with the layout)
Yeah, the max-width is typically my biggest concern. But I usually want slightly more line spacing as well.
This reminds me of this guideline: https://brutalist-web.design/
I find this a worthy endeavor, and I think focusing on the author’s angle on what is or isn’t annoying misses the point of why such filtering can be valuable: this and similar algorithms end up working very well at filtering out content-marketing and product sites, and reliably surface hidden gems. I love it! It’s a nice addition to search.marginalia.nu (my other go-to for valuable content).