  2. 5

    I want to see a browser with a “block large files” option. If a file goes over 50 kB, it is cut off (or, better, never downloaded at all). Images, fonts, scripts, video. I should get a snackbar that lets me consent to downloading the remainder and re-rendering. Add a cumulative max per page so scripts can’t dodge the cap just by being split up.

    I pay per MB in a lot of situations, so I should have control. If a site looks funky as a result, I will accept that trade-off. If it is outright broken, I am empowered to decide whether to spend the remaining bandwidth or abandon the page if the value proposition isn’t there.
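    Nothing stops someone from prototyping this as a WebExtension today. Here is a minimal sketch, assuming a Manifest V2 background script with the `webRequest`, `webRequestBlocking`, and `<all_urls>` permissions; the 50 kB per-file cutoff comes from the idea above, while the cumulative per-page budget is an arbitrary placeholder value:

    ```typescript
    // Background script sketch (assumes @types/chrome and Manifest V2
    // "webRequest" + "webRequestBlocking" + "<all_urls>" permissions).
    // Cancels subresources whose declared size exceeds the per-file cutoff,
    // and enforces a cumulative per-tab budget so a page can't dodge the
    // cap by splitting one big script into many small ones.

    const MAX_FILE_BYTES = 50 * 1024;   // per-file cutoff from the comment
    const MAX_PAGE_BYTES = 512 * 1024;  // per-page budget (placeholder value)

    const spentPerTab = new Map<number, number>();

    chrome.webRequest.onHeadersReceived.addListener(
      (details) => {
        // Never block the top-level document; a new page resets the budget.
        if (details.type === "main_frame") {
          spentPerTab.set(details.tabId, 0);
          return {};
        }

        const lengthHeader = details.responseHeaders?.find(
          (h) => h.name.toLowerCase() === "content-length"
        );
        const size = Number(lengthHeader?.value) || 0;

        const spent = spentPerTab.get(details.tabId) ?? 0;
        if (size > MAX_FILE_BYTES || spent + size > MAX_PAGE_BYTES) {
          // A real version would surface the consent snackbar here
          // (e.g. by messaging a content script) instead of silently
          // cancelling the request.
          return { cancel: true };
        }

        spentPerTab.set(details.tabId, spent + size);
        return {};
      },
      { urls: ["<all_urls>"] },
      ["blocking", "responseHeaders"]
    );
    ```

    One known gap: responses without a Content-Length header (chunked transfers, for instance) sail through this check at size zero, so a real implementation would also have to meter bytes as they actually arrive.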

    1. 1

      I set gfx.downloadable_fonts.enabled to false in SeaMonkey, which is great, but it does make most websites look quite funny.

      I’m still looking for more ways to hinder useless JavaScript event processing and web bloat in my browser; it’s amazing how every browser competes on rendering-engine speed while ignoring the problems you hit the moment more than one window is open at the same time.

      How about a performance test where you open a couple of tabs in a couple of windows on bloated sites like mashable.com or even mail.yandex.ru, which track your every action and inaction, and measure how much the inactive tabs and windows drag the machine down?
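      For anyone who wants the same setup without clicking through about:config each time, the pref can be pinned in user.js in the profile directory. The font pref is the one mentioned above; the second pref is my own assumption about how one might go further, not something the comment suggests:

      ```js
      // user.js in the SeaMonkey/Firefox profile directory; read at startup.
      user_pref("gfx.downloadable_fonts.enabled", false); // no web fonts
      // Assumption, going beyond the comment: disable scripts outright.
      user_pref("javascript.enabled", false);
      ```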

    2. 1

      He makes some good points about bloat.

      But I must disagree with his opinions on mobile-first web design; the majority of the world’s Internet users are now on mobile devices, and that will only become more true in the future. It does not make sense to cater to {desk,lap}top users with large screens at the expense of the much larger population of users with much smaller screens. As time goes on, large screens will increasingly become anomalous and be treated as such, deservedly so.

      Finally, I think he’s tilting at windmills when it comes to the Web as a participatory system of independent creators who hand-code HTML+CSS. Very few people ever did that to begin with, and newer Internet users are even less likely to. Most people express themselves online via Facebook, which has become what AOL always wanted to be. There’s plenty of competition from Instagram (owned by Facebook) for pure imagery and YouTube for video, with Medium serving as the refuge for those who still believe in the superiority of text.

      1. 14

        Windmills, perhaps, but I think it’s sad that we’re repeating all the infinite-CPU mistakes by assuming infinite bandwidth. For decades, computers doubled in speed regularly, yet Windows took longer and longer to boot (to pick one example, but an apt analogy, I think). This was finally addressed, but only after what I think was irreparable harm to MS’s reputation. It got so bad that they had to start selling their own “clean” PCs, but only after Apple’s “I’m a Mac / I’m a PC” ads had already destroyed their market image.

        There will come a day when my network link stops doubling in speed, much as my CPU already has. Facebook is already selling its “clean” Internet. I think it’s possible to change course, but the incentives won’t be aligned right until it’s too late.

        1. 2

          The difference is that (at least on phones) Facebook still loads just fine over dial-up, 2.5G EDGE, or whatever you get once you’re out of your high-speed 4G allocation, whereas it takes a few minutes to load even a site like http://www.CricketWireless.com/ on Cricket’s own network once your 4G data is used up and you’re throttled to 128 kbps!

          https://code.facebook.com/posts/1556407321275493/building-for-emerging-markets-the-story-behind-2g-tuesdays/

          Some Android apps appear to stop working entirely over 2G, probably due to badly chosen timeouts; I recall uploads from the official Twitter app never completing once the connection was limited to 128 kbps or less.
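          That failure mode is easy to model. The sketch below is my guess at the bug class, not anything from Twitter’s actual code: a fixed wall-clock timeout kills any transfer that is slow but still making progress (a 1 MB upload at 128 kbps needs over a minute, so it always loses to a 30-second timer), whereas an idle timeout that resets whenever bytes move survives a slow link:

          ```typescript
          // Node.js sketch of upload timeouts on slow links.
          import * as https from "node:https";

          function uploadWithIdleTimeout(
            url: string,
            body: Buffer,
            idleMs = 30_000
          ): Promise<void> {
            return new Promise((resolve, reject) => {
              const req = https.request(url, { method: "POST" });

              // Idle timeout: fires only when NO bytes move for idleMs,
              // and resets on every read/write. A slow-but-alive 2G link
              // keeps resetting it; a truly stalled socket still dies.
              req.setTimeout(idleMs, () =>
                req.destroy(new Error("idle timeout"))
              );

              req.on("response", (res) => {
                res.resume();            // drain the response body
                res.on("end", resolve);
              });
              req.on("error", reject);
              // The buggy pattern would instead race this whole upload
              // against a fixed wall-clock setTimeout and abort on expiry.
              req.end(body);
            });
          }
          ```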