1. 95

  2. 22

    I did not expect this to go where it did, very informative.

    1. 17

      This is a great example of a concise blog post that is informative and easy to read.

      Well done!

      1. 11

        Ironically, the demo doesn’t actually work, because the Lobsters link ends with .html but the actual post URL apparently does not.

        1. 8

          Crisp and short. I like it!

          Here’s a (somewhat historical) document on how browsers have to lie when using getComputedStyle: https://dbaron.org/mozilla/visited-privacy
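
          To illustrate the lie: a minimal sketch (the link URL is just a placeholder) showing that scripts are handed the unvisited style, whatever the history says:

          ```html
          <style>
            a:link    { color: blue; }
            a:visited { color: purple; }
          </style>
          <a id="lnk" href="https://example.com/">a link you may have visited</a>
          <script>
            // Browsers report the :link color here regardless of history,
            // so a page cannot read back whether the link was visited.
            console.log(getComputedStyle(document.getElementById("lnk")).color);
          </script>
          ```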

          1. 5

            For analysis of this and many more things like it, plus a great deal of web security history, approaches, and commentary, see the web hacking bible (only somewhat dated):

            Michal Zalewski’s The Tangled Web, considered a must-read in application security circles.

            1. 4

              The Tangled Web is wonderfully good at pointing out all these nasty little details. Indeed, it’s a bit dated because I think some of the bugs have been fixed by now, but if you read between the lines you notice that it’s really endemic to the way the web has been (and is being) developed and without a good vision for a secure future of the web, there is no way this will ever improve.

              Instead of actually figuring out why we have holes in our boat, we’ll just keep scooping water out of the boat and say “see, we’re still afloat”.

            2. 3

              I recall seeing something like this on HN (wow, a decade back) and the PoC has bitrotted away: https://news.ycombinator.com/item?id=617546

              1. 3

                I can recommend using the .invalid TLD for invalid links.

                1. 2

                  Or, alternatively, the .example TLD.

                2. 3

                  I wonder if this could have been patched by changing the programming model to something more declarative. It’s not the CSS which is insecure; it’s the JS that comes afterwards that messes everything up!

                  A big change might be to put the DOM behind a curtain. The page can emit a DOM (which might include minimal support for scripting, definitely no network access) but can never read that DOM back out. In response to events it can emit a new DOM which is diffed and applied by the browser itself.

                  I think this would also allow relaxing the same-origin restriction? You still don’t want to allow buttons which hit unrelated domains and send the author money on Venmo, but what’s the problem with letting a page pull in the list of my Venmo payments, as long as the JS for that page can’t see it?

                  It’s a shame there’s no way to do AJAX without JavaScript; surely there’s a purely declarative model which could have been used to pull data from external sources.

                  1. 5

                    CSS is also insecure here.

                    By changing the size of a link inside a fixed-size container, I can alter how much space is left for an <img> with srcset, and then record which file gets downloaded.

                    1. 1

                      Well damn, good point. Let’s just burn it all down.

                      This is insecure because all of the following are true:

                      • Your browser has a secret which shouldn’t be made public
                      • Your browser uses that secret when rendering a page
                      • After that secret has entered the equation your browser fetches more resources
                      • Those fetches can be used to infer the secret

                      It sounds like breaking any of the above would fix the problem, but I’m not sure where the break belongs:

                      • You probably want your browser to remember your history
                      • You probably want to style :visited links differently
                      • It’s hard to imagine an alternative to CSS which first made requests and only then injected secrets.
                      • How can you request resources without letting the world know what you’re requesting?

                      Maybe your browser should fetch every alternative? Or, requesting all content by hash (like IPFS) would mean the server which gets the request, and can infer your secret, is unlikely to be the server trying to track you?

                    2. 2

                      Putting sensitive data and untrusted code in a sandbox and hoping nothing leaks is a difficult problem.

                      1. 1

                        The idea is more, let’s not put untrusted code in there!

                        And since removing JS entirely is a lost cause, maybe we can get most of the benefit by adding a boundary beyond which it can’t reach.

                    3. 2

                      Interestingly enough, when I click the sample link in Safari and go back to the page, it’s zoomed out to 10%, which I guess is a bug.

                      1. 2

                        This is a great blog post that was significantly more interesting than I had expected. Interesting, and depressing, because every time I learn more about the modern web, it gets more dystopian and nightmarish.

                        1. 0

                          Disabling font-size for a:visited is a brutal, but safer, solution.

                          Even if it wasn’t a security issue I’m pretty happy that you can’t change the font size of a visited link. What an annoying thing that would be, if it were possible.
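
                          For what it’s worth, that’s roughly what browsers settled on: :visited rules are restricted to a handful of color properties, and everything else is silently ignored. A sketch (the selector and values are arbitrary):

                          ```html
                          <style>
                            a:visited {
                              color: purple;           /* honored: color-family properties only */
                              background-color: #eee;  /* honored */
                              font-size: 2em;          /* silently ignored for privacy */
                              display: none;           /* silently ignored */
                            }
                          </style>
                          <a href="https://example.com/">example link</a>
                          ```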

                          1. [Comment from banned user removed]

                            1. 5

                              All this and more in tonight’s episode of “Massively Successful Phenomenon Is Flawed In Several Ways”!

                              1. 4

                                Eh… HTML predates XML by several years.

                                1. 3

                                  30 years ago today, Tim Berners-Lee submitted a memo for what became the web. The challenge worth feeling satisfied about is not recognizing historical mistakes with the benefit of three decades of hindsight and billions of users; it’s repairing that medium, or replacing it. Making zero mistakes along the way is the stuff of fantasy; the realistic dream is to make fewer, smaller, or at least new mistakes on top of the incredible success of the web.

                                  1. 2

                                    I can sit in comfort, knowing I’ll never design something so disgusting and malformed as the WWW.

                                    And we all are very glad of that!