“It is difficult to get a [web developer] to understand something, when [their] salary depends on [them] not understanding it.”
― Upton Sinclair
My back looks like a pin cushion from all the arrows I have taken over the years fighting for a web that would be more ethical and devoid of mostly useless crap. Some battles won, too many lost. I lost one just yesterday, but it didn’t occur to me that it was because of my money-induced blindness.
I actually like this quote and have used it myself before, but while I have met many web developers over the years who didn’t care about the bullshit described in the article, almost all of them didn’t care simply because they were ignorant of the available technologies, indifferent to the quality of anything they did, or, most often, both.
What were some of the wins?
An example of a small, recent one would be the Klevio website (as it currently exists; less so after today). I am not linking to it because I don’t want referrals from Lobsters to show up in the website’s logs, but it is trivial to find.
Almost everything on this website works with Javascript turned off. It uses Javascript to augment the experience, but does not needlessly rely on external libraries. It should work reasonably well even on poor connections. It does not track you, and still has privacy policy handling that tries to be closer to the spirit of the GDPR than to what you may get away with.
It would certainly have been easier for me and faster to develop (cheaper for the company) if I had just leaned on existing tools, built yet another SPA, and not spent more than a week arguing with lawyers about what is required.
Alas, because (unsurprisingly) most people do not opt in to analytics, I am now working on a different confirmation dialog, more in line with what others are doing. It will still be better than most, but certainly more coercive than the current one.
And this is at a company that is, in my experience, far more conscientious about people’s privacy than others I have worked for.
Is this really true? Not to downplay your craft but I always thought tinkering with HTML/CSS until things look right would be way easier than learning a separate library.
I checked out that website and it’s pretty refreshing that stuff actually works. If you want a little constructive feedback, the information density is very low, especially on a desktop computer with a widescreen monitor. I have to scroll down 7 screens to get all the information, which could have fit on a single screen. Same with the “about us” page. I notice the site is responsive, giving a hamburger menu when you narrow your window, so maybe the “non-mobile” interface could be more optimized for desktop use.
I don’t think it is in every case, but in this one I think it would be, since everything was handwritten without picking up existing solutions for things like galleries. If you mean the SPA part, then I guess it becomes moot. It would probably be about the same for the first implementation, but this one, which is basically a bunch of static files, certainly has a higher cost of maintenance: because we (I) didn’t get around to finishing it, page “components” still have to be manually copied to new files and updated everywhere when their content changes. The plan was to automate most of this, but we haven’t spent the time on it yet.
I agree with everything in the second paragraph. Regretfully that is one of those battles lost.
So what do your managers feel is the benefit of having such low information density? How do these decisions get made?
If I remember correctly, it was because it supposedly looks modern, clean, and in line with the company’s brand. It has been a while, so my memory is fuzzy on this.
I’ve heard this a few times already, but I’ve never quite understood what the implication is. What precisely are web developers not understanding? I get the default examples (e.g. oil companies funding environmental research), but just can’t see the analogy in this case.
You’re on week three of your new job at a big-city ad and design firm. Getting that first paycheck was nice, but the credit card bill from the moving expenses is coming up, that first month of big-city rent wiped out your savings, and you don’t really have a local personal network to find new jobs. The customer wants a fourth “tag” for analytics tracking. Do you:
Put it in?
Engage in a debate about engineering ethics with your boss and his boss (who drives a white Range Rover and always seems to have the sniffles after lunch), culminating with someone screaming and you storming out, never to return?
Web devs know that autoplay videos and newsletter pop-ups are annoying, but annoying people is profitable.
I only recently started using NoScript. A lot of people balk at the fact that the majority of websites don’t work anymore after you install it, that you have to manually unlock specific scripts, and that you even have to think about which scripts you want to allow. It is certainly not something the everyday user wants to deal with. But the speed with which pages load, and the complete absence of all the spying, autoplay videos, and the majority of images, make it really worth it.
Obviously the better solution for everyone is for web designers to get their shit together on this issue. But I am not holding my breath. For now NoScript is as necessary as uBlock Origin for having a positive experience of the internet.
It also teaches you who your friends are: the websites that just work as though the plugin were not there are the good ones. The ones that tell you you need to enable Javascript and load all scripts directly from their own domain are also resolved with a single click. The ones that are a major hassle to use with NoScript running are the ones you should probably be staying away from anyway.
Not just technical people either: an old friend used to train laypeople to use it on the NoScript forums. He said there was a small but steady stream of them concerned about privacy and/or speeding up their machines.
Does it warn you if the scripts’ contents have changed?
If so, it might mitigate this huge security hole hidden in plain sight a little… but I’m not so sure…
No, you block on a domain basis, so that security hole is not even needed to get around it. It’s not going to save you from the government, just from bloated websites and advertising.
So… do you enable whole CDNs?
Anyway, if I understand what you mean: once JS execution is enabled for a host, the server could serve you a malicious script without it being noticed, so that bug could be exploited not only by the government but also by several private companies…
Most people don’t audit the Javascript code before they enable it anyway, so detecting changes wouldn’t solve the core issue.
True. Indeed I said that it could mitigate that vulnerability.
As @enkiv2 said in the Lobsters thread about it, the only reliable solution is to remove scripting languages from browsers. A pretty expensive security fix, I know, but the bug is very dangerous.
expensive in that it would save huge amounts of energy in the form of compute cycles that aren’t spent attacking the user
µBlock lets you block based on pairs of first & third parties.
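For readers who haven’t tried this: uBlock Origin’s dynamic filtering rules take the form `source destination type action`, one rule per line in the “My rules” pane. A hedged illustration (the specific domains below are examples, not a recommended ruleset):

```
* doubleclick.net * block
* googletagmanager.com * block
news.example.com cdn.example.net * noop
```

The `noop` rule exempts one first-party/third-party pair from the broader `block` rules, which is the per-pair blocking described above.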
I’ve been reading and upvoting rants about website bloat for several years now (judging by the scores, so do many others), and yet every year it gets worse. It seems the opinion of users on this kind of site is simply insufficient to change how websites are built, which is a bit surprising, since the people who build websites are presumably part of the core audience.
I’m resigned at this point. The call to action tacked on to the end of the long list of complaints about modern web development practices feels profoundly empty. Even if every developer at The Hill, Politico, and CNN read and wholeheartedly agreed with the sentiment here, the people who actually make the relevant decisions won’t, and even if they did I expect they wouldn’t care. The politics of large organizations make it a lot easier to sell the idea of more advertising, or some slick animated thing you can show off in a meeting, than a shorter waterfall chart or “authenticity” (presumably the opposite of “bullshit” per the definition offered in this article). The CNNs of the world are going to continue to get worse, and there’s no stopping it. You can opt out, or try to buy good enough connectivity and hardware to mitigate it, but you’re not going to write blog posts that win over the hearts and minds of those who can actually do something about it.
What has failed is individuals resisting in isolation. What we haven’t tried is a unified movement.
Your second paragraph nails it. I bet most people here agree with these rants; but most of the people paying their salaries don’t care.
I agree that we should reduce website bloat, tracking scripts, etc. But these essays are a bit like essays decrying factory farming: everyone agrees it’s bad, but unless you can come up with an economic incentive for the companies to care, to change their business models, the situation won’t change. It’s hard to change the world with just strongly-worded essays, no matter how right you are.
This is where I see Google ranking slow websites lower as a benefit. Very few platforms have the same power to force websites to be more respectful of users’ time and resources. I suppose the other big social media platforms could do the same down-ranking for slow/bad websites. However, since these companies are all dependent on ad dollars, you don’t end up winning all that much at the global level.
Well, there’s also regulation, the traditional solution for companies producing negative externalities. In this climate, at least in the US, no one is really willing to propose or implement much regulation of any kind. Ideology über alles.
I feel the way websites have filled the increasing bandwidth available to visitors is similar to how people will increase their spending to match increases in income, leaving some people with a $100k income with as little spending power as someone on $30k.
For example, someone who buys a bigger house after a pay rise ends up with just as little spare cash as before, much like a website that, noticing its visitors now have 2 Mbit lines, begins filling them with autoplay videos…
This is really a non-issue as far as I’m concerned.
Browsers (either standalone or with plugins) let users turn off images, turn off Javascript, override or ignore stylesheets, block web fonts, block video/Flash, and block advertisements and tracking. Users can opt out of almost any part of the web if it bothers them.
On top of that, nobody’s twisting anybody’s arm to visit “heavy” sites like CNN. If CNN loads too much crap, visit a lighter site. They probably won’t be as biased as CNN, either.
Nobody pays attention to these rants because at the end of the day they’re just some random people stating their arbitrary opinions. Rewind 10 or 15 or 20 years and Flash was killing the web, or Javascript, or CSS, or the img tag, or table based layouts, or whatever.
Flash and table-based layouts really were, and to the extent that you still see them still are, either hostile or opaque to people who require something like a screen reader to use a website. Abuse of Javascript or images excludes people with low-end hardware. Sure, you can disable these things, but it’s all too common that there is no functional fallback (apparently I can’t even vote or reply here without Javascript on).
Are these things “killing the web” in the sense that the web is going to stop existing as a result? Of course not, but the fact that they don’t render the web totally unusable is not a valid defense of the abuse of these practices.
I wouldn’t call any of those things “abuses”, though.
Maybe it all boils down to where the line is drawn between supported hardware and hardware too old to use on the modern web, and everybody will have different opinions. Should I still be able to browse the web on my old 100 MHz Pentium with 8 MB of RAM? I could in 1996…
To view similar information? Absolutely. If what I learn after viewing a web page hasn’t changed, then neither should the requirements to view it. If a 3D visualization helps me learn fluid dynamics, OK, bring it on, but if it’s a page of Cicero quotes, let’s stick with the text, shall we?
I think table-based layouts are really pretty uncontroversially an abuse. The spec explicitly forbids it.
The rest are tradeoffs; they’re not wrong 100% of the time. If you wanted to make YouTube in 2005, presumably you had to use Flash, and people didn’t criticize that; it was the corporate website that required Flash for no apparent reason that drew fire. The question that needs to be asked is whether the cost is worth the benefit. The reason people like to call out news sites is that they haven’t really seen meaningfully new features in two decades (they’re still primarily textual content, presented in a pretty similar style, maybe with images and hyperlinks, all things that 90s hardware could handle just fine), but somehow the basic experience requires 10? 20? 100? times the resources. What did we buy with all that bandwidth and CPU time? Nothing except user-hostile advertising, as far as I can tell.
At the time (OK, 2007, same era) I had a browser extension that let people view YouTube without Flash by swapping the Flash embed for a direct video embed. It was faster and cleaner than the Flash-based UI.
Maybe you would like this one https://github.com/thisdotvoid/youtube-classic-extension
I’d say text-as-images and text-as-Flash from the pre-webfont era are abuses too.
Or just use http://lite.cnn.io
Exactly. It’s not a “web developers are making the web bloated” problem, it’s a “news organizations are desperate to make money and are convinced that personalized advertising and tons of statistics (Big Data!!) will help them” problem.
Lobsters is light, HN, MetaFilter, Reddit, GitHub, GitLab, personal sites/blogs, various wikis, forums, issue trackers, control panels… Most of the stuff I use is really not bloated.
If you’re reading general world news all day… stop :)