I disagree heavily with the core thesis of this article – that Javascript is in need of replacement – but the treatment of it and the ideas explored are quite interesting.
I’d honestly settle for browsers handling JS the way they do cookies. Let me decide whether to allow all JS, allow only self-hosted JS, or disable JS entirely – and let me blacklist/whitelist particular domains.
Have you looked at uMatrix?
I use a hosts file.
I use a hosts file too, but uMatrix allows more fine-grained controls than just blocking all requests to a domain, in addition to doing things like only allowing iframes/cookies/media from domain X to be loaded from domain Y.
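For contrast, a hosts-file entry can only block a domain wholesale – every request type, every page – by pointing its name at a sink address (the domain below is just a placeholder):

```
# /etc/hosts — all requests to this domain now resolve to 0.0.0.0
0.0.0.0 tracker.example.com
```

There is no way to express "allow images but not scripts from this domain" at that layer, which is exactly the granularity uMatrix adds.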
You could write or use a browser extension that injects a Content Security Policy into the response. Making it configurable on a per-site basis is a stretch goal. :-)
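A minimal sketch of that idea, assuming the WebExtensions `webRequest` API (the helper name `withCsp` and the hard-coded `script-src 'self'` policy are placeholders – a real extension would look the policy up per site):

```javascript
// Pure helper: replace any existing Content-Security-Policy header
// with the given policy. Kept separate so the logic is easy to test.
function withCsp(headers, policy) {
  const rest = headers.filter(
    (h) => h.name.toLowerCase() !== "content-security-policy"
  );
  return rest.concat([{ name: "Content-Security-Policy", value: policy }]);
}

// In a WebExtension background script (needs the "webRequest" and
// "webRequestBlocking" permissions in manifest.json), rewrite the
// response headers of every top-level page before the browser sees them.
if (typeof browser !== "undefined") {
  browser.webRequest.onHeadersReceived.addListener(
    (details) => ({
      responseHeaders: withCsp(details.responseHeaders, "script-src 'self'"),
    }),
    { urls: ["<all_urls>"], types: ["main_frame"] },
    ["blocking", "responseHeaders"]
  );
}
```

With `script-src 'self'`, the browser itself then refuses to run any third-party script on the page, which is roughly the "allow only self-hosted JS" setting asked for above.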
If the author's here: your cert's expired.
This article is wild, and I will need to come back to it to fully appreciate it.
That said, I wonder if dynamic HTTP requests are worth saving. The web is passably good at serving static HTML, and everything else arguably really shouldn't be done at all. Javascript is a hack inside a hack inside a hack, which, when used to exploit holes in the specification of HTTP, can be forced to slowly simulate a dynamic interactive UI inside a rich text viewer. It's like playing Doom by fax machine: it's cool that it's possible, but if everybody starts doing it, it quickly becomes horrifying.
Hi! I’m the author of the article, but I didn’t see it posted here until recently.
Great points! In an ideal world, I’d agree with you, but I don’t think it is possible to change the way the web is used (even Lobste.rs falls under the “hack” category, as it’s more than a static document). It’s simply too useful. What we can do is change the technology behind it. It’s an area where I personally think declarative programming would really shine.
I agree on all points, aside from the idea that there’s a particularly large difference between HTML and CSS in terms of how entrenched the technologies are :)