1. 13
    1. 14

      I’m not sure what we expect Chrome to do here. The ability to add a trusted root means you’ve already got control - game over.

      The alternative is to show broken SSL indications to users unfortunate enough to be caught behind DLP software, onerous corporate overlords, etc. So every site they visit has the broken red lock? How does that help anything?

      1. 9

        Indeed. The next level of escalation is that instead of installing a new root, you get to install a patched browser that never gets updated for other security issues.

      2. 2

        I’m not sure what we expect Chrome to do here.

        Um, tell the user they’re being MITM’d?

        The ability to add a trusted root means you’ve already got control - game over.

        Key pins have nothing to do with trusted roots.

        1. 7

          Key pins have nothing to do with trusted roots.

          I’ve got a partition on my work laptop that ensures every HTTPS site I go to terminates at a corporate proxy that inspects and re-encrypts all traffic on the fly, and it all works thanks to a trusted root placed by IT.

          Do you think Chrome adds security by showing me that every single site I go to has a pinning exception? I know that - part of getting the work laptop means signing a big piece of paper that says I know that. And if the person who set up this interception wasn’t my work’s IT department, I’m already rooted six ways from Sunday, so I can’t trust anything that Chrome tells me anyhow.

          Is there a browser out there that does report pin exceptions when the certificate chain terminates at a user-installed CA? Firefox doesn’t, Chrome doesn’t, and RFC 7469 totally allows it.

          Key pins have everything to do with trusted roots.
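
          As a concrete aside, here is roughly what “terminates at the proxy” looks like if you poke at it yourself: connect using whatever trust store the OS provides (including an IT-installed root) and print who actually issued the leaf certificate. This is only a sketch, not anyone’s official tooling; it assumes Python with the third-party cryptography package, and example.com is a placeholder host.

          ```python
          # Sketch: print who issued the leaf certificate a host serves you.
          # ssl.create_default_context() uses the OS trust store, so an
          # IT-installed root validates just like a public one.
          import socket
          import ssl

          from cryptography import x509  # third-party: pip install cryptography


          def leaf_issuer(host: str, port: int = 443) -> str:
              ctx = ssl.create_default_context()  # hostname checks stay on
              with socket.create_connection((host, port)) as sock:
                  with ctx.wrap_socket(sock, server_hostname=host) as tls:
                      der = tls.getpeercert(binary_form=True)
              return x509.load_der_x509_certificate(der).issuer.rfc4514_string()


          if __name__ == "__main__":
              # At home this prints a public CA; behind the proxy described
              # above, every host prints the corporate CA instead.
              print(leaf_issuer("example.com"))
          ```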

          1. 1

            Do you think Chrome adds security by showing me that every single site I go to has a pinning exception?

            Yes, of course.

            Is there a browser out there that does report pin exceptions when the certificate chain terminates at a user installed CA?

            Browsers only just finished implementing HPKP. The current overriding behavior is considered a security-related bug in Firefox by Mozilla.

            Key pins have everything to do with trusted roots.

            Only in the sense that the entire point of key pinning is to not need trusted roots.
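
            To make “pin” concrete: per RFC 7469 a pin is just the base64 SHA-256 hash of a certificate’s SubjectPublicKeyInfo, so it names a public key directly, no matter which root signed the chain. A rough sketch of computing one (Python with the third-party cryptography package; the header at the end uses placeholder values):

            ```python
            # Sketch: compute an RFC 7469 pin-sha256 value from a PEM certificate.
            # The pin hashes the SubjectPublicKeyInfo itself, so it identifies a
            # public key regardless of which root signed the chain.
            import base64
            import hashlib

            from cryptography import x509  # third-party: pip install cryptography
            from cryptography.hazmat.primitives import serialization


            def pin_sha256(pem_bytes: bytes) -> str:
                cert = x509.load_pem_x509_certificate(pem_bytes)
                spki = cert.public_key().public_bytes(
                    serialization.Encoding.DER,
                    serialization.PublicFormat.SubjectPublicKeyInfo,
                )
                return base64.b64encode(hashlib.sha256(spki).digest()).decode()


            # A site then advertises its pins in a response header, roughly:
            # Public-Key-Pins: pin-sha256="<current key>"; pin-sha256="<backup key>";
            #                  max-age=5184000; includeSubDomains
            ```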

            1. 3

              The current overriding behavior is considered a security-related bug…

              …by you, but I’m not sure the rest of Mozilla agrees ;-)

              No, they really don’t.

              The entire point of key pinning is to not need trusted roots.

              Is/ought. They shouldn’t. But they do!

              A more interesting question is how to do the UX such that a legitimately MITMed user is informed that they’re being MITMed but also that the overall connection is safe (er, as safe as you can get in such a case). Jumping straight to the “this is a self-signed certificate that expired last year” UX for a user who is behind a corporate MITM proxy only serves to teach the user that the red slash isn’t really bad in every case, which is exactly what we can’t have.

              Can we do that and make it understandable?

              1. 2

                …by you, but I’m not sure the rest of Mozilla agrees ;-)

                So far as I can tell from that thread, most agree. The bug itself is about fixing the broken default behavior.

                Jumping straight to the “this is a self-signed certificate that expired last year” UX for a user who is behind a corporate MITM proxy only serves to teach the user that the red slash isn’t really bad in every case, which is exactly what we can’t have.

                Can we do that and make it understandable?

                Sure, the bug I linked to makes one suggestion to that effect, as does the post above, i.e. indicate the connection is being monitored.

                1. 3

                  Not phone posting now, so I think I can have an actual conversation :)

                  As I see it, browsers have to deal with a spectrum of possibilities:

                  • A user installs software to MITM their own traffic, understanding the implications - for debugging, for virus scanning, etc.
                  • A user is forced by their employer to MITM their traffic, and doesn’t like it, but accepts it.
                  • The user’s OEM adds MITM-ing software without the user’s knowledge. Whether it’s (nominally) benign or something like Superfish doesn’t matter; the user doesn’t know.
                  • A virus/evil hacker/state actor adds MITM software.

                  So what’s your solution to this problem? Your bug report mentions making a separate browser, which I doubt will happen. Whatever we roll out has to be cognizant that there are legitimate uses of SSL MITMing, and has to inform the user without adding security fatigue/confusion - we’ve barely trained people to stop clicking past the big red “this is horribly broken” screens.

                  It’s easy to say “we need to tell the users” that their connection is being MITMed. You keep saying it. How?

                  FWIW, I’ll give my thoughts: I’d like to see Mozilla and Chrome flip to strict HPKP - as some suggest in the bug I linked - but there clearly must be a configuration override to fall back to a locally defined trust anchor. That makes OEMs take an active step towards weakening security; hopefully that’d be enough.
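
                  For what it’s worth, Firefox already ships a knob close to that: the hidden pref security.cert_pinning.enforcement_level (0 = pinning off, 1 = today’s default, where a chain ending at a user-installed root bypasses pins, 2 = strict). A sketch of flipping one profile to strict mode - the profile path is a placeholder, and this assumes the pref name hasn’t changed:

                  ```python
                  # Sketch: opt a Firefox profile into strict pin enforcement by
                  # appending the pref to its user.js. Run with Firefox closed.
                  # Pref values: 0 = pinning off, 1 = default (user-installed roots
                  # bypass pins), 2 = strict. The profile path is a placeholder.
                  from pathlib import Path

                  PROFILE = Path.home() / ".mozilla" / "firefox" / "xxxxxxxx.default"


                  def enable_strict_pinning(profile: Path) -> None:
                      line = 'user_pref("security.cert_pinning.enforcement_level", 2);\n'
                      with open(profile / "user.js", "a", encoding="utf-8") as f:
                          f.write(line)


                  if __name__ == "__main__":
                      enable_strict_pinning(PROFILE)
                  ```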

                  1. 1

                    It’s easy to say “we need to tell the users” that their connection is being MITMed. You keep saying it. How?

                    Answering your question even though you are trolling at this point: How? By not deceiving users? Maybe being honest instead? And not in obscure fine print where even developers won’t see it.

                    1. 1

                      I don’t really understand your position. Except for the first of those four cases (where the user is setting up their own MITM), all of those named parties will be perfectly happy to modify the browser as needed until it doesn’t point out the issue.

                      I get the impression from one of the things you linked to that you may think that there is a distinction between changing the root certs and installing malware. There is no technical distinction - these things already generally come with hidden software that reverts any attempts to change the certs. I don’t believe that there’s any distinction in the minds of people doing it, either.

                      Certainly the OEM and state-actor cases are not going to be defeated by technical measures when they already have control, so this really comes down to the employer-as-MITM case and whether robust hidden MITMing is too expensive or too psychologically unpalatable to that group. I think it should be clear from looking at existing practices that it’s neither.

                      1. 1

                        A pin is not a pin if it is unpinned without the user’s consent. It is not “certificate pinning” in other words.

                        all of those named parties will be perfectly happy to modify the browser as needed until it doesn’t point out the issue.

                        It is a violation of the CFAA to install malware on a user’s machine. If it is not prosecuted in the company’s home country (i.e. the USA), then it will be prosecuted in the EU.

                        Simply designing their browser to not lie will result in an immediate—real—massive security improvement for all users.

                        1. 1

                          I mean, that should have happened already then.

                          1. 2

                            I mean, that should have happened already then.

                            There’s a big difference in terms of “ease of prosecution” between something that has some amount of plausible deniability, and something that has zero plausible deniability.

                            Also, it’s possible it will happen eventually. Maybe Clapper will even be indicted at some point, and maybe we’ll even see justice for those our country has tortured.

                            1. 3

                              Well, I certainly want to hope for that. It’s a valid point, thank you for mentioning it.

      3. 1

        I can never tell if the people outraged by this are disingenuous or stupid.

        1. 1

          Neither can I (though I’d use a kinder word than “stupid”), but it doesn’t accomplish anything to speculate about that.

    2. 5

      Would you also like Chromium to check the hosts file and list warnings for every hosts file entry? And if you’re on a VPN, should it yell about every connection going through the VPN?

      At some point it’s no longer the responsibility of the browser.

      1. 2

        Would you also like Chromium to check the hosts file and list warnings for every hosts file entry? And if you’re on a VPN, should it yell about every connection going through the VPN?

        From a technical and security standpoint (the only one that matters here), that makes zero sense and is pretty much off-topic to this situation.

      2. 1

        You don’t need to worry about VPN/hostfile modification if you’re using TLS and you haven’t had an extra root CA added. That’s sort of the point.
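
        To spell that out: even if a hosts file entry or a VPN route sends example.com somewhere else, certificate and hostname verification fail unless that endpoint presents a chain that validates against a trusted root - which is exactly what adding a root changes. A rough sketch (the IP is a documentation-only placeholder, so in practice you’d see a timeout or a verification error rather than a page):

        ```python
        # Sketch: TLS refuses a redirected endpoint unless it can present a chain
        # that validates for the requested hostname against a trusted root.
        import socket
        import ssl


        def try_tls(hostname: str, ip: str, port: int = 443) -> None:
            ctx = ssl.create_default_context()  # default trust store, hostname checks on
            with socket.create_connection((ip, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                    print("validated", hostname, "over", tls.version())


        if __name__ == "__main__":
            try:
                # Pretend a hosts file entry pointed example.com at 203.0.113.7
                # (a documentation-only address, so expect a timeout here).
                try_tls("example.com", "203.0.113.7")
            except ssl.SSLCertVerificationError as exc:
                print("certificate did not validate for that hostname:", exc)
            except OSError as exc:
                print("connection failed:", exc)
        ```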

        1. 1

          In theory, but we know that CAs have given out bad certs in the past. And on top of that, if you want to argue that the browser should warn you every time one of your custom root certs gets used, then you could make a similar argument that, if you’re connecting to a host whose IP has been altered by a hosts file entry, or going over a VPN, or something else, the browser should be warning you even on plain HTTP. My point stands: should the browser be checking these other things?

          My point being that if you say the browser should be checking for changes to your local machine and warning you about them, that opens up a whole new box of stuff that browsers traditionally don’t concern themselves with. Past a certain line the browser just says, “Ok, I can no longer really be sure of things, since I myself, as a piece of data on the client machine, may have been altered.”

    3. 1

      I think this is similar to GMO labelling. On the one hand, the pin is actually ignored, so it seems like a no-brainer to say that the pin is ignored. On the other hand, the value of the additional information is probably low, like GMO labelling.