There’s no compelling reason encryption and authentication have to be tied together. Mandatory authentication is theater, sort of (using the existing CA scheme it is, at any rate), but mandatory encryption has benefits even without authentication (such as preventing passive monitoring).
I wobble around in my support for unauthenticated encryption. I think a reasonably consistent viewpoint is that it’s equivalent to no encryption: you don’t get to pick your adversary and say “no interception, only monitoring please”. On the other hand, if it’s transparent to the user and not advertised as a secure connection, it seems pretty harmless.
It’s true that adding encryption increases attack surface (more code, more surface, simple as that), but I think that’s mostly a danger because of overcomplicated exchange protocols. 2014 was a bad year to be running a TLS-enabled server, but that’s TLS to blame, not really the crypto.
If HTTP/2.0 threw away TLS and started over with a simple curve25519 handshake/exchange, I would really get into that.
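For a sense of how little machinery that handshake actually needs, here’s a minimal sketch using the third-party `cryptography` package. The structure (ephemeral keypairs, raw DH, a KDF with an `info` label) is illustrative, not any real protocol’s wire format:

```python
# Sketch of an unauthenticated curve25519 handshake: each side generates
# an ephemeral keypair, exchanges public keys, and derives a session key.
# Uses the third-party "cryptography" package; labels are made up here.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates an ephemeral keypair and sends its public key.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
client_pub = client_priv.public_key()
server_pub = server_priv.public_key()

# Both sides compute the same shared secret from the exchanged keys.
client_shared = client_priv.exchange(server_pub)
server_shared = server_priv.exchange(client_pub)

# Run the raw DH output through a KDF before using it as a session key.
def session_key(shared: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"handshake").derive(shared)
```

No certificates, no negotiation, no renegotiation; the entire exchange is two public keys on the wire.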
Interception is much harder and much dicier for any adversary than passive monitoring, though. It takes vastly more resources, and anybody cross-checking both ends of a connection you’ve attacked will almost certainly notice you. Making the NSA’s datacenter taps unviable is definitely a plus.
It’s not a panacea, of course, because it does nothing directly to prevent MitMing and does introduce additional attack surface (hi, OpenSSL!), and figuring out how to signal this to users is hard. (My solution would be to mandate encryption and signify site authentication to users. Then encrypted/unencrypted is irrelevant and we needn’t mention it, because the protocol only has one encryption state, and authentication can indicate more or less what HTTP vs. HTTPS indicates now.) But I definitely do think the benefits outweigh the costs.
I can think of a wide variety of potential improvements to HTTP/2, but I suspect “stop existing, because why are we pandering to the ridiculous browser-as-OS folks” is probably a non-starter with the people who matter. =)
I think the most damaging aspect of the NSA revelations is that now that’s the only adversary people can think about. I’d say you’re far more likely to be taken by a rogue hotspot in the local coffee shop (park, airport, etc.) and the person running that hotspot really doesn’t give two shits about getting caught.
Not an argument against encryption per se, but any argument that’s basically “because NSA” is incomplete, perhaps even dangerously misleading, and needs to consider more factors. In this case, “the NSA doesn’t want to get caught” doesn’t generalize to every adversary, so be sparing in your trust of a security model that assumes “doesn’t want to get caught”. Rogue hotspots don’t “want” to get caught either, but they care about detection far less than the NSA does.
Philosophically, politically, existentially, the NSA may be the top priority, but even in light of everything that’s happened, I think the common street criminal should remain the focus of our threat modelling. A good security model will work against both criminals and the NSA without making assumptions about what each is willing to do.
Mandatory encryption doesn’t just hit the NSA (or even just state-level attackers); it would have prevented Firesheep, for example. Encryption alone isn’t enough. But as long as it’s cheap (it is), effective (it is), and safe (god save us from TLS, but in principle, it is), it’s better than plaintext.
Regarding Firesheep, that’s only because the author didn’t bother to include ARP spoofing. If interception of traffic had been necessary, it could have been trivially added. In a world of unauthenticated encryption, Firesheep would have grown by a few hundred lines, but it would still have been possible.
Sorry to belabor the point. I’m a recent convert from the church of opportunistic encryption. Now I feel the need to share my enlightenment with everyone. :)
Now, to be nice, I’ll give you what I think is actually a more compelling reason for OE, at least as things stand today: protection from your ad-injecting ISP (Comcast, AT&T, Verizon). While they certainly can MITM your traffic, they have so far been reluctant to mess with HTTPS because they’re not quite set up to distinguish which HTTPS traffic is authenticated and which isn’t. That’s probably because 99.9% of HTTPS traffic is authenticated, so there’s little to be gained. However, I suspect the balance will change if all traffic becomes encrypted but only 50/50 authenticated. Suddenly, I think they will develop techniques to uncover which is which, start intercepting all the “self-signed” traffic, and we’re back where we started.
Man, no kidding. Spin up ephemeral keys, sign them with EdDSA and you’ve got PFS too.
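That combination (long-term EdDSA identity key signing a per-session ephemeral key) is small enough to sketch too. This is a hedged illustration with the third-party `cryptography` package, not a complete protocol:

```python
# Sketch: sign an ephemeral X25519 public key with a long-term Ed25519
# identity key. The ephemeral key gives forward secrecy (PFS); the
# signature ties the session to the identity. Illustrative only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

identity = Ed25519PrivateKey.generate()    # long-term signing key
ephemeral = X25519PrivateKey.generate()    # fresh key per session

eph_pub_bytes = ephemeral.public_key().public_bytes(
    Encoding.Raw, PublicFormat.Raw)

# The server sends (eph_pub_bytes, signature); the client verifies against
# the server's known identity key before doing the exchange.
signature = identity.sign(eph_pub_bytes)
identity.public_key().verify(signature, eph_pub_bytes)  # raises if invalid
```

Discard the ephemeral private key after the session and a recorded transcript can’t be decrypted later, even if the identity key leaks.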