1. 17

  2. 12

    I’m going to repost a good comment from teraflop on MetaFilter here because y'all seem to be weirdly negative in this post.


    Sigh. This made the rounds on Hacker News yesterday. To head off some common complaints:

    • They’re talking about deprecating plaintext HTTP, not removing support for it.
    • There are good reasons for this. When they say “browser features”, what they’re mainly talking about are privacy-sensitive things like geolocation, or access to your microphone and webcam. For obvious reasons, these features require the user to explicitly provide permission. But if they’re used on an http:// site, you have no idea who you’re granting permission to. That code could have been modified by your ISP, or your government, or whoever set up the wi-fi at the coffee shop you’re sitting in, or just any random person on your local network.
    • This is not a hypothetical problem; ISPs (including Comcast) have already demonstrated that they’re willing to hijack your plaintext connections to inject ads.
    • HTTPS is becoming cheaper and easier to set up. This move is being done in conjunction with the Let’s Encrypt project, which aims to make SSL certificate setup free and effortless. If you don’t want to wait for that to take off, you can already get free certificates from StartSSL.
    • Yes, the CA architecture has problems. No, nobody’s come up with anything else that works as well. Mozilla isn’t doing this unilaterally; Chrome has already announced similar plans in the last few months.
    • For development purposes, “localhost” will continue to be treated as secure. If that isn’t good enough, creating your own internally-trusted CA is probably a lot easier than you’re imagining.
    • The fact that they’re talking about this now doesn’t mean it’s going to happen soon. Browser vendors are very serious about doing slow, methodical, careful rollout plans, even for much tinier compatibility issues than this one.
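
    As a rough illustration of that internal-CA point: the whole “mint your own internally-trusted CA, issue a leaf cert for a box on your private subnet, and trust it” workflow fits in a short Go program using only the standard library. The names (“Example Dev CA”, “dev-box”) and the 192.168.1.50 address are made up for the sketch; a real setup would persist the keys and install the CA cert in the browser’s trust store.

    ```go
    package main

    import (
        "crypto/ecdsa"
        "crypto/elliptic"
        "crypto/rand"
        "crypto/x509"
        "crypto/x509/pkix"
        "fmt"
        "math/big"
        "net"
        "time"
    )

    func main() {
        // 1. Generate the CA key and a self-signed CA certificate.
        caKey, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }
        caTmpl := &x509.Certificate{
            SerialNumber:          big.NewInt(1),
            Subject:               pkix.Name{CommonName: "Example Dev CA"},
            NotBefore:             time.Now().Add(-time.Hour),
            NotAfter:              time.Now().AddDate(10, 0, 0),
            IsCA:                  true,
            KeyUsage:              x509.KeyUsageCertSign,
            BasicConstraintsValid: true,
        }
        caDER, err := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
        if err != nil {
            panic(err)
        }
        caCert, err := x509.ParseCertificate(caDER)
        if err != nil {
            panic(err)
        }

        // 2. Issue a leaf certificate for a machine on the private subnet,
        //    signed by the CA key.
        leafKey, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }
        leafTmpl := &x509.Certificate{
            SerialNumber: big.NewInt(2),
            Subject:      pkix.Name{CommonName: "dev-box"},
            IPAddresses:  []net.IP{net.ParseIP("192.168.1.50")},
            NotBefore:    time.Now().Add(-time.Hour),
            NotAfter:     time.Now().AddDate(1, 0, 0),
            KeyUsage:     x509.KeyUsageDigitalSignature,
            ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
        }
        leafDER, err := x509.CreateCertificate(rand.Reader, leafTmpl, caTmpl, &leafKey.PublicKey, caKey)
        if err != nil {
            panic(err)
        }
        leafCert, err := x509.ParseCertificate(leafDER)
        if err != nil {
            panic(err)
        }

        // 3. "Internally trusted": put the CA in a trust pool and verify the leaf
        //    against it, the same check a browser would perform.
        pool := x509.NewCertPool()
        pool.AddCert(caCert)
        _, err = leafCert.Verify(x509.VerifyOptions{Roots: pool})
        fmt.Println("leaf verifies against internal CA:", err == nil)
    }
    ```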

    Anything else?

    1. 2

      “For development purposes, “localhost” will continue to be treated as secure. If that isn’t good enough, creating your own internally-trusted CA is probably a lot easier than you’re imagining.”

      My development environment has many machines with private IPs (192.168.0.0/16). I don’t like the argument that it’s “probably a lot easier than you’re imagining”, because it delegitimizes my complaint that my browser is now making me jump through a non-trivial infrastructure change. Just because you personally don’t see a problem with it doesn’t mean no one else does.

      In my mind the only and obvious way to address this is to make it OPTIONAL and give users the ability to turn it off.

      1. 1

        I wish browsers would change (or provide an option to change) the default protocol used when just typing in a bare domain name to https, instead of defaulting to http.

        1. 0

          I’d also add that the browsers pushing this (Chrome announced similar intent a month or so ago) are actually going to make the CAs more competitive, not less, and drive down the prices and process involved.

          1. 6

            When they say “browser features”, what they’re mainly talking about are privacy-sensitive things like geolocation, or access to your microphone and webcam.

            This is flatly not true. They are specifically talking about limiting new CSS properties and the like to HTTPS, not because of privacy concerns specific to those features, but as a way to manipulate people into using HTTPS.

            1. 6

              After thinking about this a little longer, it’s the manipulation that really rubs me the wrong way about this. Software–especially free software–should seek an honest relationship with its users. This is the opposite of that. They’re saying: we’ve tried to convince you that HTTPS is important, but clearly many of you have decided its importance is outweighed by its implementation difficulty. So rather than consider that maybe our arguments are not as compelling as we think they are, we’ve decided that we’re going to take something unrelated that we know you care about–the ability (for end users) to see and use websites the way they were intended to be seen and used, or (for site authors) to use the same features as every other website on equal footing–and hold it hostage until you accede to our demands.

              Moreover, Mozilla is making this decision unilaterally. It has appointed itself to make these decisions for its users because it believes it knows better than those users. But it is accountable to no one. It could have gone before the relevant standards bodies and advocated for the official deprecation of HTTP. That would have been the honest and accountable way to try to effect the change it wants to effect. Instead it is attempting to use its position in the market–a position which gives it power over its users and over site authors–to do an end-run around the standards process.

              1. 4

                Funny, to me it looks like consumers are manipulated into using http as many sites don’t support https at all :)

                Additionally, users are presented with no warnings when browsing over http (though browsers can show many for various issues with https). Presenting no warnings for http promotes a false sense of security. I think this manipulates consumers (most of whom don’t understand the problems with http anyway) into acquiescing to an insecure transport.

                I for one would love to see pressure applied to server operators who don’t offer https.

              2. 3

                That is a long, long way off and the economics of the situation are already not bad and will only get better. You’re stressing out over a memory of the past.

          2. 5

            What about the costs of SSL certificates? This isn’t going to bode well, especially for those who want to use more than one level of subdomains.

            1. 7

              There was a link in the comments to Let’s Encrypt, a project to provide a free, automated certificate authority, coming some time this year…

              1. 6

                For how many years now have people been talking about a free, non-profit CA? It’s easy to make claims about how simple and automatic this will be when it doesn’t exist yet. It seems like jumping the gun to make this deprecation decision based on the predicted future availability of something that might or might not actually materialize and might or might not be as useful and well-implemented as promised.

                1. 3

                  I would bet that Mozilla knows at least a little about how CAs operate, and if anyone can do it, there’s a good chance they will. Moreover, we’ll see how it’s implemented in just a few months.

            2. 5

              I’ve still not heard any convincing explanation of how this is going to avoid breaking resources accessed by IP on local networks (e.g. router administrative interfaces, printers, etc.). The W3C Privileged Contexts draft, which Mozilla has proposed to use to determine when to require HTTPS, contains exemptions for localhost but not for private IPs.

              1. 2

                Surely, the browser can check that it’s an IP on the local network segment, accept a self-signed cert on that basis, and pin it to prevent future spoofing by a third party who has access to the network.

                The present, unencrypted state of affairs actually lets anyone on your wifi observe your router password every time you log in to it, as well as everything you print. That’s really not especially ideal.
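
                A trust-on-first-use pin is simple enough to sketch. To be clear, this is not something Mozilla has proposed; it’s just an illustration of the mechanism: remember a fingerprint of the self-signed cert the first time you see it, then compare fingerprints on later connections instead of chasing a chain of trust. The router.local name and 192.168.1.1 address below are invented for the example (Go, standard library only):

                ```go
                package main

                import (
                    "bytes"
                    "crypto/ecdsa"
                    "crypto/elliptic"
                    "crypto/rand"
                    "crypto/sha256"
                    "crypto/x509"
                    "crypto/x509/pkix"
                    "fmt"
                    "math/big"
                    "net"
                    "time"
                )

                func main() {
                    // The device generates a self-signed cert for its private address.
                    key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
                    if err != nil {
                        panic(err)
                    }
                    tmpl := &x509.Certificate{
                        SerialNumber: big.NewInt(1),
                        Subject:      pkix.Name{CommonName: "router.local"},
                        IPAddresses:  []net.IP{net.ParseIP("192.168.1.1")},
                        NotBefore:    time.Now(),
                        NotAfter:     time.Now().AddDate(5, 0, 0),
                    }
                    der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
                    if err != nil {
                        panic(err)
                    }

                    // First connection: the browser accepts the cert and pins its fingerprint.
                    pinned := sha256.Sum256(der)

                    // Later connections: compare the presented cert's fingerprint to the pin.
                    // No CA is involved, but a third party on the network can no longer
                    // impersonate the device without tripping the pin check.
                    presented := sha256.Sum256(der)
                    fmt.Println("pin matches on reconnect:", bytes.Equal(pinned[:], presented[:]))
                }
                ```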

                1. 3

                  Well, a couple of things:

                  First, the browser could do something like you suggest, but the plan that Mozilla has described doesn’t include any special treatment for local network segments (except 127.0.0.0/8). My criticism is aimed at Mozilla’s actual plan, not at what it might be possible for them to do.

                  Second, even if they did decide to do as you propose, a certificate needs to be linked to a specific IP or DNS entry (or I suppose a range of them with a wildcard cert). Who will set up the CA, generate this certificate, and install it on your networked device? It can’t be the manufacturer, because they don’t know what IP or subnet you’ll have the device on or what local DNS entry you might want to use to access it. If you expect consumers to do it themselves, the likely result is that even fewer of them will bother getting access to these devices, and even more routers will be running with the default password than are today. I guess you could require every networked device to ship with code to generate a new cert every time you change its IP. That seems fragile and unlikely to work with local DNS.

                  The present, unencrypted state of affairs actually lets anyone on your wifi observe your router password every time you log in to it, as well as everything you print. That’s really not especially ideal.

                  Sure, the current situation is untenable. That doesn’t mean that this is better. It’s the politician’s fallacy: “Something must be done; this is something; therefore, we must do this.”

                  1. 2

                    It’s fair that they don’t say they plan to do this. I’m merely interested in showing that a solution is actually no big deal.

                    Certs don’t need to specify anything that they are allegedly for. The ASN.1 and signing stuff doesn’t complain if those fields are blank; it’s just that under existing validation methods, such a cert would never be trusted since it claims authority over nothing. If there’s a sensible way to only allow it for things which are sufficiently nearby network-wise, the user can click through the first time they use the device, and then it can’t be impersonated later.
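
                    That’s easy to demonstrate: a self-signed cert with a blank subject and no subject alternative names is perfectly legal to create and parse, but it fails name validation against anything. A Go sketch (the 192.168.1.1 address is arbitrary):

                    ```go
                    package main

                    import (
                        "crypto/ecdsa"
                        "crypto/elliptic"
                        "crypto/rand"
                        "crypto/x509"
                        "fmt"
                        "math/big"
                        "time"
                    )

                    func main() {
                        key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
                        if err != nil {
                            panic(err)
                        }
                        // Self-signed cert with an empty subject and no SANs: the signing
                        // machinery doesn't complain about the blank fields.
                        tmpl := &x509.Certificate{
                            SerialNumber: big.NewInt(1),
                            NotBefore:    time.Now(),
                            NotAfter:     time.Now().AddDate(1, 0, 0),
                        }
                        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
                        if err != nil {
                            panic(err)
                        }
                        cert, err := x509.ParseCertificate(der)
                        if err != nil {
                            panic(err)
                        }
                        // ...but under normal validation it claims authority over nothing,
                        // so it matches no host at all.
                        fmt.Println("matches 192.168.1.1:", cert.VerifyHostname("192.168.1.1") == nil)
                    }
                    ```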

                    That said, shipping every instance of a given printer model with the same cert would be bad, yes. It needs to be generated once, the first time the printer is powered on. Not all printers have internal flash, but the wifi stuff dwarfs the flash in price so hopefully that’s not a big deal, and a PROM with a factory-installed random seed could substitute if it were.
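
                    A generate-once-at-first-power-on scheme is only a few lines in sketch form. Here an in-memory variable stands in for the device’s flash, and everything (the “printer” name, the 20-year lifetime) is hypothetical rather than any vendor’s actual firmware:

                    ```go
                    package main

                    import (
                        "bytes"
                        "crypto/ecdsa"
                        "crypto/elliptic"
                        "crypto/rand"
                        "crypto/x509"
                        "crypto/x509/pkix"
                        "fmt"
                        "math/big"
                        "time"
                    )

                    // store stands in for the device's flash: once a cert is written, it is reused.
                    var store []byte

                    // deviceCert returns the stored certificate, generating one on "first power-on".
                    func deviceCert() []byte {
                        if store != nil {
                            return store
                        }
                        key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
                        if err != nil {
                            panic(err)
                        }
                        tmpl := &x509.Certificate{
                            SerialNumber: big.NewInt(1),
                            Subject:      pkix.Name{CommonName: "printer"},
                            NotBefore:    time.Now(),
                            NotAfter:     time.Now().AddDate(20, 0, 0),
                        }
                        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
                        if err != nil {
                            panic(err)
                        }
                        store = der
                        return der
                    }

                    func main() {
                        first := deviceCert()
                        second := deviceCert() // a later boot: same cert, so any pins stay valid
                        fmt.Println("stable across boots:", bytes.Equal(first, second))
                    }
                    ```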

                    And, as others have said, this is a difficult complaint to sympathize with because no near-future proposal would affect printers in the slightest. When was the last time you saw printer firmware that had been updated in the last ten years? Printer admin stuff will be perfectly fine with the legacy http feature-set.

                    Edit to add: Also, invoking the politician’s fallacy feels kind of strange there.

                    I wasn’t saying, we should do exactly this plan without thinking about whether it’s bad. I was suggesting a refinement to the plan that I believe addresses your concern.

              2. 1

                In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications,

                Great, let’s do it!

                which in the case of the web means HTTPS.

                WTF?!


                This must be one of the best examples of an invalid argument!