1. 25

  2. 9

    Most of what I want out of a website is unformatted text and jump links. Given that, the appropriate (already existing) technology to deliver it with the minimum of fuss is not HTML+CSS but gophermaps! I’d love to see people use gopher+gophermaps instead of HTML+HTTP when formatting isn’t necessary, though that’s an even bigger leap than getting people to avoid CMSes for that use case.
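
    For anyone who hasn't run into one: a gophermap is a plain-text menu where each line starts with a one-character item type (`i` for an info line, `0` for a text file, `1` for a submenu) and the remaining fields are tab-separated, shown here as `<TAB>`. A minimal sketch, with example.org as a placeholder host (the dummy selector/host on `i` lines is a common server convention, not part of RFC 1436):

    ```
    iMost of what I want is text<TAB>fake<TAB>(NULL)<TAB>0
    0About this site<TAB>/about.txt<TAB>example.org<TAB>70
    1Phlog posts<TAB>/phlog<TAB>example.org<TAB>70
    ```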

    1. 2

      Okay. gopher://i-logout.cz/1/en/bongusta/ and gopher://gopher.black/1/moku-pona are two feeds of gopher-based blogs (aka phlogs). Read and enjoy.

      1. 1

        Thanks! I’ve read Alex Schroder’s phlog, but I wasn’t aware of phlog aggregators.

    2. 5

      One thing about SPAs… they seemed to become really popular with the rise of Rails, mostly as a way of compensating for Rails’ amazingly slow rendering.

      1. 1

        rendering? … I thought Rails was backend?

        1. 2

          Server-side rendering is a thing.

          1. 2

            is the process of building an html document to send to the browser called “rendering”?

            1. 7

              Yes.

              1. 2

                Also, there’s (eg) react-rails which does server-rendering of a react SPA (so you get the HTML which your react code would generate, served by rails).

          2. 1

            I remember the rise of Rails to be mid-to-late 2000s - I don’t remember seeing SPAs until the mid-2010s.

          3. 4

            The problem is not JavaScript or frontend libraries. I have a VueJS SPA with 100% client-side rendering. I tested it on a 7-year-old desktop and it was still super fast. The first page load was 300kB, and every page load after that was about 10kB. It’s very easy to make a very fast website that’s JS-reliant.
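
            Route-level code splitting is usually what makes the follow-up loads that small; a sketch with Vue Router (the paths and component files are made up, and a bundler is assumed):

            ```javascript
            // Hypothetical route table: each page's component is loaded on demand,
            // so after the initial bundle, navigating only fetches that page's chunk.
            const routes = [
              { path: '/', component: () => import('./views/Home.vue') },
              { path: '/about', component: () => import('./views/About.vue') },
            ];
            ```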

            The problem comes almost entirely from 3rd-party scripts. So many websites import 30 tracking scripts from other sources that are poorly written and slow the website to a crawl. I also think many of the big websites like Twitter and Reddit were made slow on mobile on purpose so that you install their app.

            1. 3

              I find it ironic that this article, complaining about the overcomplication of websites, uses 146 requests to 6 different hosts to render, which when saved to disk results in 1.56 MB of data.

              1. 3

                The problem today is that while what we need is minimalist websites, what we have is advertiser-driven website development, and that leads to a lot of crap being shipped that isn’t necessary or even consumer-friendly.

                My personal preference is developing for sub-90ms page loads with image assets limited to 30KB each (most end up being sub-12KB). In seeking faster page loads I have in the past used tools that analyse all the pages of a site and trim out any unused CSS, which in one extreme case cut out over 100KB.
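
                A budget like that is easy to enforce mechanically; a sketch in shell (the directory layout, extensions, and `check_budget` name are my own, not from the comment):

                ```shell
                # Sketch: list image assets that blow a per-file size budget.
                # Usage: check_budget DIR LIMIT_KB  (prints offending files, if any).
                check_budget() {
                  find "$1" -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.webp' \) \
                    -size +"$2"k
                }
                ```

                Wiring this into a build step makes the 30KB limit a hard failure instead of a guideline.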

                It’s this attempt to reduce page load times and asset weight that led me to become an advocate of static site generators - even going so far as to write my own.

                1. 4

                  Apparently these days several people are (re)discovering the issues of the Web.

                  Not to mention centralization, Cambridge Analytica, and other more general geopolitical issues with the Internet.

                  The problem, however, seems to be that people either do not understand these issues or benefit from them.

                  I mean, except users.

                  1. 2

                    Publishers know what they are doing. Nobody cares about people wanting to block Javascript.

                    You can disable Javascript of course, and at this point the web is still usable without it. However, publishers will increasingly turn to protections against JS blockers; you can thank the increasing aggressiveness and popularity of ad-blocking extensions for that.

                    1. 4

                      You can disable JavaScript, but not with a usable UI, so in practice most people cannot.

                      Also, JavaScript should be disabled by default, enabled on a per-site basis, and marked as “Not Secure” anyway.
                      Browsers should make SRI mandatory for JavaScript, and they should pay special attention to suspect HTTP headers used with scripts.
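
                      For reference, an SRI integrity value can be generated with openssl; a sketch (the `sri_hash` name and the filename are my own):

                      ```shell
                      # Sketch: compute a Subresource Integrity value for a script file.
                      # Prints e.g. "sha384-<64 base64 chars>" for use in an integrity attribute.
                      sri_hash() {
                        printf 'sha384-%s\n' "$(openssl dgst -sha384 -binary "$1" | openssl base64 -A)"
                      }
                      ```

                      The output goes into the script tag, e.g. `<script src="app.js" integrity="sha384-…" crossorigin="anonymous"></script>`; the browser then refuses to run the script if its hash doesn’t match.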

                      1. 1

                        Interestingly, sites like eBay and Amazon do work fine without JavaScript. Not quite as comfortable, but no quirks there either. eBay has gotten worse over the years, I admit….

                      2. 2

                        There is a fairly good compromise. I use uMatrix, which blocks 3rd-party scripts by default and gives you a UI to enable them as needed. Quite often it doesn’t break anything, and when it does, it’s usually super easy to work out that a script from a CDN is important but a script from Twitter or Google Analytics is not.