These posts are starting to get physically painful for me.
First, these aren’t even remotely backdoors in any sense, and I don’t think people know what that word means anymore. A backdoor is something that is purposefully put into the code to compromise the security of the software. Dual_EC_DRBG was a backdoor.
Then they go on to talk about metadata through Gboard, Maps, etc. How is that relevant to Signal? This person is complaining about Google Play and Google, not about Signal. And none of his “solutions” actually solves any of these problems. The GCM problem (which is the main problem here) has been discussed again and again on GitHub and IRC, and no one who writes these posts seems to read those discussions. Maybe then the author would realize that GCM is the only way to receive messages under certain sleeping mechanisms Google added to save battery (that’s a different argument, and I agree it’s a bad idea). They also mention LibreSignal, again without bothering to look into why it’s probably going to be defunct.
The author also mentions that TLS has forward secrecy, so you could just use that! I almost threw my mouse. The Double Ratchet documentation and paper make it painfully clear why it exists and why it adds strength beyond the Diffie-Hellman exchange in TLS and most protocols, but apparently that was too much for the author to look at. TLS is also transport-layer crypto and has nothing to do with the end-to-end crypto for the messages themselves, which is supposed to work even if you DON’T have TLS.
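To make the distinction concrete, here is a minimal Python sketch of the symmetric-key half of the Double Ratchet idea (the labels and key names here are illustrative, not Signal’s actual KDF inputs; the real Double Ratchet also mixes fresh Diffie-Hellman outputs into the chain). Each message gets a one-time key derived from a chain key that is immediately replaced, so deleting used keys gives forward secrecy independent of whatever the transport provides:

```python
import hashlib
import hmac

def kdf(chain_key):
    """Derive the next chain key and a one-time message key from the current chain key."""
    next_chain = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    return next_chain, message_key

class Ratchet:
    def __init__(self, shared_secret):
        # shared_secret would come from an initial key agreement (e.g. X3DH in Signal)
        self.chain_key = shared_secret

    def next_message_key(self):
        # The old chain key is overwritten; once it is gone, past message
        # keys cannot be rederived even if the current state leaks.
        self.chain_key, message_key = kdf(self.chain_key)
        return message_key

# Both parties start from the same shared secret and stay in sync.
alice = Ratchet(b"shared secret from initial key agreement")
bob = Ratchet(b"shared secret from initial key agreement")
k1 = alice.next_message_key()
k2 = alice.next_message_key()
assert k1 == bob.next_message_key() and k2 == bob.next_message_key()
assert k1 != k2  # every message uses a fresh key
```

This is a sketch of why the ratchet exists at the message layer: the per-message keys live end to end, regardless of what the network link does.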
Posts like this are legitimately dangerous and are perfect examples of disinformation leading people to make hasty mistakes: “Oh, this blog post says that Signal has a backdoor, better not use it!”
EDIT: Somehow I missed this:
you can just use XMPP over TLS even without OTR/PGP/OMEMO (which all are crypto systems on top of XMPP) and reach the same level of secrecy as with Signal.
Apparently the author doesn’t even understand what end-to-end encryption is for…
Additionally, Google Play Services can do a ton of stuff that has nothing to do with Signal or any 3rd party app. If Google wanted to backdoor Signal, they wouldn’t have to bother, they could just send stuff through the Google Play Services connection that is already established between the phone and Google, silently install anything they want that would run with elevated privileges, and control the entire phone.
It’s correct that Google apps have some “elevated privileges”, but it is wrong that these are at a level where Google can just do everything. Without the mentioned backdoor, Google would not be able to read Signal’s private storage, because even Google’s apps are sandboxed properly on Android (the same way other apps are, just with some additional privileges, e.g. to install updates in the background). There is also the common misconception that Google can just push a system update and thus break out of the sandbox. This would work on Nexus and Pixel devices (where Google is the one providing updates), but not on other devices, because manufacturers build AOSP themselves, only adding some configuration, libraries and APK packages from Google to the build system. If Google wanted to plant a backdoor, it would require the cooperation of the manufacturer, or it would need to go into the AOSP source code, which is open source and thus very unlikely to happen.
He states that in the article even. So I don’t particularly understand what he’s trying to blame on Signal here.
You might have noticed that I intentionally put this term in quotation marks.
The term is not well-defined, so it’s up to you to argue whether silently allowing arbitrary code execution by a third party is a backdoor or not. Your example is also a good one for showing that the term is not well defined: while Dual_EC_DRBG is called a backdoor on the English Wikipedia, the definition on the German Wikipedia does not include it (and consequently it’s not mentioned in that article), because there a backdoor is said to require unauthorized access to (part of) a computer, which is not the case for crypto backdoors.
GCM is the only way to handle certain sleeping mechanisms
Just wrong. Conversations (which uses XMPP and has no hard dependency on GCM) is one example; Wire (which can disable GCM and use WebSockets instead) is another. Android allows applications to request the ability to run in the background the same way GCM does. GCM has no special privileges, it’s just a normal app. You can see that on current Android versions by going into system settings -> battery -> overflow menu (three dots, top right) -> battery optimization. Some of the apps listed there are excluded from battery optimizations, and you can decide to add others. It’s not a special feature only available to GCM.
TLS vs e2e
I do know the difference between e2e and transport-layer encryption. And I don’t agree that you always need both; there are even cases where you don’t need either. Transport-layer encryption was invented with the idea that only the network the data passes through is untrustworthy; with e2e encryption, you add the servers, or even the clients, to the list of possibly untrustworthy parties. However, if you can trust all servers enough to show them the data unencrypted, then you don’t need e2e. Easy example: assume you have a corporate e-mail server that is 100% secure (= it doesn’t exist) and set up properly. The company consists of exactly two people, both use secure TLS to connect to the server, and both have 100% secure clients. If one sends an e-mail to the other through that server, there is really no reason to use e2e in terms of secrecy or authenticity. You can even add another server to this scenario, as long as you trust that server enough. So if each client has its own server that is as trustworthy as the client itself, and you set up proper TLS between all parties, you have proper encryption even without e2e; this is the example I wrote with XMPP.
We still need e2e, however, because we have large untrustworthy players in federated networks (GMail) and even in unfederated networks (Signal). We need e2e because we can’t trust our service providers enough to give them access to our data. But instead of using e2e to hand those providers our metadata, we would be better off not using them at all, where possible.
(The comment on TLS was just to show how overrated the forward secrecy feature in Signal is, because the easiest target is no longer the network link, it’s the device itself.)
First, thanks for coming here to defend your stance. Even if the term is not well-defined (which is arguable itself), the “backdoor” is not in Signal itself but in the requirement to use GCM and Google Play Services. Maybe instead of specifically targeting and calling out Signal, you should be arguing that any Android phone with Google Play Services installed is vulnerable to manipulation by Google. I see absolutely no reason why you called out Signal specifically. I didn’t look at Wikipedia; I just have familiarity with auditing real-world crypto systems, and Dual_EC_DRBG is the one that comes up most often as an intentional, malicious backdoor planted by an organization.
I may be wrong about Doze and GCM. The last time I had a discussion about this was when the LibreSignal fiasco was happening and Android 6 had no way to deal with whitelisting. Google was very much trying to push developers to use GCM instead of opening a bunch of websockets, for battery reasons. It even came up in the issues section of Conversations. I’m glad they added that option.
there are even cases where you don’t need any
Oh boy. If you actually think there are cases where encryption isn’t needed, you are almost certainly going to shoot yourself in the foot in ways you don’t expect. I can’t tell you how many times I’ve been on a penetration test where a client with this thought process decided not to encrypt their network share, to use telnet, etc., because “it’s on an internal LAN”. That thought process is the primary reason Google’s infrastructure team encrypts literally every connection and treats every connection as potentially malicious.
Transport layer encryption was invented with the idea in mind that only the network the data is passed through is not trustworthy, with e2e encryption, you add the servers or even the clients to the list of possibly untrustworthy parties.
And with TLS, if you aren’t pinning every single cert and doing mutual authentication, you also have to add the endpoints, every single CA in your trust store, the assumption that your organization doesn’t intercept TLS, etc. TLS relies much more heavily on external trust models, and that makes it very poorly suited for this type of use case.
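The pinning idea mentioned above can be sketched in a few lines of Python. This is a toy illustration: the byte strings stand in for real DER-encoded certificates, and the names are made up for this example; in real code the peer certificate would come from something like `ssl.SSLSocket.getpeercert(binary_form=True)`, and you might pin the public key rather than the whole certificate.

```python
import hashlib

# Instead of trusting every CA in the system trust store, the client
# hard-codes the SHA-256 fingerprint of the one certificate it expects
# and rejects everything else, even if a trusted CA signed it.
EXPECTED_CERT_DER = b"<expected server certificate, DER bytes>"
PINNED_FINGERPRINT = hashlib.sha256(EXPECTED_CERT_DER).hexdigest()

def is_pinned(cert_der):
    """Accept the peer only if its certificate matches the pinned fingerprint."""
    return hashlib.sha256(cert_der).hexdigest() == PINNED_FINGERPRINT

assert is_pinned(EXPECTED_CERT_DER)
# A valid certificate from some *other* trusted CA is still rejected:
assert not is_pinned(b"<certificate signed by a different trusted CA>")
```

The point of the sketch: without pinning, the effective trust anchor set is the whole CA store; with pinning, it shrinks to one known key, at the cost of having to manage pin rotation yourself.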
Assuming you have a corporate e-Mail server, that is 100% secure (= it doesn’t exist) and set up properly. The company consists of exactly two persons, both use secure TLS to connect to the server and both have 100% secure clients. If one sends an e-mail to another using that server, there is really no reason to use e2e in terms of secrecy or authenticity.
This is the absolute perfect example you could have given me that you don’t understand the use case for e2e. In fact, I ran into this exact argument on a penetration test last week. In your example, if the mail server is compromised or there is a rogue administrator, then they have full access to plaintext e-mails and there is no more secrecy. If those people had also used GPG in conjunction with the TLS connection to the server, those adversaries would have nothing but encrypted messages. Further, if the connection wasn’t secure and they were still using GPG, they would have a much higher likelihood of detecting tampering, and the knowledge that their messages still weren’t readable. Your scenario is not “proper encryption” because there is zero mutual authentication between the clients; again, that is the entire point of e2e cryptography.
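The layering argument can be shown with a toy Python example. To be explicit: this throwaway XOR keystream is NOT real cryptography (a real deployment would use GPG or an AEAD cipher); it only illustrates that when the endpoints encrypt before handing data to the transport, even a fully compromised server sees nothing but ciphertext.

```python
import hashlib

def toy_encrypt(key, plaintext):
    """Toy stream cipher: XOR the plaintext with a SHA-256-derived keystream.
    For illustration only -- do not use for anything real."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(stream[-32:]).digest()
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XORing with the same keystream inverts itself

# The e2e key is shared only by the two endpoints, never by the server.
e2e_key = b"secret shared only by the two endpoints"
message = b"quarterly numbers attached"

# What the mail server stores/relays after TLS is stripped off at its end:
seen_by_server = toy_encrypt(e2e_key, message)
assert seen_by_server != message                         # no plaintext on the server
assert toy_decrypt(e2e_key, seen_by_server) == message   # the recipient recovers it
```

With transport encryption alone, `seen_by_server` would be the plaintext itself, which is exactly the rogue-administrator problem above.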
The comment on TLS was just to show how overrated the forward secrecy feature on Signal is, because the easiest to attack is not the network link anymore, it’s the device itself
I wish people would stop making this argument. Yes, it’s often easier to attack the client, but that doesn’t mean you aren’t being attacked at BOTH levels (e.g. sslstrip, which ironically was created by the author of Signal).
My biggest gripe with this is that the title feels very much like clickbait and fear-mongering: instead of simply postulating that using Google services makes you vulnerable to trusting Google with access to your phone, you decided to specifically call out Signal and jump on the “X service is backdoored” train without actually substantiating anything about Signal itself. If you have Google Play Services installed, Conversations has the exact same “backdoor”.
Generally I learnt a lot from writing this blog post (not directly related to your post):
1. do not write about multiple issues in one post, people will use everything they find to argue why the whole post is invalid.
2. do not assume people have any background in the system you are talking about, or that they are willing to use a search engine to gain basic knowledge before ranting about the post.
3. do not try to correct people stating wrong things on the internet. People don’t read your other posts on other sites, so you end up repeating yourself over and over again.
4. do not assume people actually read the whole post before answering that it is bullshit.
For the following, please ignore everything I wrote in the blog post related to GCM; I understand that people know about the GCM thing, and moxie is aware as well. The backdoor mentioned in the title and tl;dr is NOT the GCM issue (which could hardly be called a backdoor anyway, but I wrote this in the post as well).
I see absolutely no reason why you called out Signal specifically
The reason for me to mention Signal specifically is: people assume it is secure and properly audited, and I wanted to say: no, it’s not. If you know about the Android sandboxing/security system and read the post carefully, you would see that the issue is not the existence of Google on the device itself, but how the Google code is used by Signal through the Play Services client library. It is only about the Maps part: not the location picker, not GCM, only Maps. This means that Conversations, which has the optional feature to use GCM and thus includes some Play Services related code as well, is NOT affected (or at least I found no indication that it is).
I think the security standard for a crypto messenger (which Signal is advertised to be) should be higher than that of the usual app. A news app that includes Google Maps and thus has this backdoor as well is not worth the write-up. But a specific portion of code in Signal that allows Google to snoop into your messages, and that could be removed rather easily (if the responsible people actually cared), is worth it.
Don’t get me wrong: the Signal protocol (to be precise, the crypto part) is properly audited, but not the app as a whole. And as I outlined in the blog post, it is nearly impossible to do a proper audit, because some proprietary code is included. Nobody else ever spent time on it, and now that I did look into some (= very few) parts of it and found some issues I’d call serious, people start telling me that “if Google is the attacker, it’s a lost game, because they have root on the device”, which is complete bullshit (again, this is not directed at you).
Phew. I came up with an example that contained a 100% secure machine, and you come and tell me the system is insecure because you can attack that machine. Well, it’s the definition of 100% secure that you can’t attack it… Of course I know this example is purely theoretical, but that still does not make your point valid.
I’d also like to go the other way round. If you own a server (which might be the case, but we can also just assume it), do you use it as an IRC bouncer? Would you consider using XMPP MAM (a feature that syncs the history of your messages across your XMPP clients even when they were offline when a message was sent, which would be a requirement for Signal-based systems)? Would you consider using a usual IMAP server such as Dovecot, which does not encrypt your messages in a way that the server can’t access them (there is a plugin for that)? I bet you would. Because it really does not make a difference whether you a) have the data decrypted at your server and store it only there, or b) have the data decrypted at your client and store it only there. In both cases I need to hack into that one machine, and hacking into the other one won’t give me access to the data.
Of course, as an attacker you can try to strip TLS, you can own a CA and present wrong certificates, etc. But all these issues can be combated client-side; you just need to do it.
And finally, you can also use TLS for e2e crypto if you are able to somehow establish a p2p link. Both TLS and the Signal protocol are crypto protocols that can be used end to end or only on a single physical or logical network link.
Yes, there are. And if you say you encrypt all your communication, you are just lying. Or do you encrypt when writing a letter that you hand over personally? Do you encrypt the signal between your computer and your screen? Do you encrypt when accessing the configuration interface of your personal router at home? Do you encrypt when copying files from your mobile device to your computer via USB cable? …
I personally use TLS and e2e (mostly GPG), and often enough I use both at the same time. But that doesn’t mean it’s always necessary. Of course I have communication where I want to contact people with some untrustworthy third party in between, so I need e2e, and I need TLS nonetheless so as not to show the whole world who I am communicating with. When I am writing seriously important stuff, I might end up using crypto I would not use otherwise. But for my everyday communication with my colleague, TLS to a trusted server we both use is really enough. If it’s not, you should probably stop talking in public…
Also, the Guardian published a “backdoor” story about WhatsApp that is very much like your argument. It was so fear-mongering that there is an open letter about how this exact sort of post is actually dangerous, with quite the roster of signatures: Bruce Schneier, Matt Green, Matt Blaze, etc. I highly suggest you read it.