re: age - the author calls out in the post that there are two implementations (age and rage), which is cool. But for me, the most important aspect is that age actually has a specification: https://age-encryption.org/v1 - so technically anyone can implement age in any language where the crypto primitives are available. And once you are done, you can validate your implementation against: https://github.com/C2SP/CCTV/tree/main/age
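Fetching the shared test vectors is quick; a sketch, with the directory layout quoted from memory (check the repo before relying on it):

$ git clone --depth 1 https://github.com/C2SP/CCTV
$ ls CCTV/age/testdata    # one file per vector: a header with an "expect:" line describing the outcome, then the age file itself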
This is definitely a huge plus. I helped write a Kotlin/JVM implementation for use in Android Password Store, and we got great value out of the standardized test suite, which paired nicely with JUnit’s dynamic tests feature to give us robust coverage for very little code.
I am aware that you have retired from the Android Password Store project. (Thank you for your work on it. I use it every day.) But did age support make it into the app before that?
Unfortunately not.
This line caught me off guard 😂
The one thing @soatok does not address at all is the original use case PGP was designed for, as opposed to all the other ones people started using it for in the time between when it was designed and when better options came along: encrypted email.
The closest they get is saying to “use signal” for secure messaging. But I do not trust signal, and none of the reasons relate to the need to give them my phone number. And while @soatok correctly points out that my phone number need no longer be revealed to people who want to communicate with me, it still looks like signal-the-org demands that I give it to them.
If you want people to stop using PGP, you need to provide an alternative for its original purpose. And an alternative that requires me to send all my stuff through one org, even if I didn’t need to give them my phone number (and it looks like I do), is not adequate.
I know S/MIME addresses some of the concerns one might have about PGP email, but I wouldn’t say it’s better on balance, and I don’t think S/MIME as implemented in those mailers that still implement it addresses any of the criticisms from this article in any way.
Every time I see an article like this, I get optimistic that there will be a reasonable proposal for encrypted email. There never is.
The 2019 Latacora post handles this directly.
Aside from Signal, Soatok seems to address this use case (private messaging) from a different direction:
In the near future, I plan on developing end-to-end encryption for direct messages on the Fediverse (including Mastodon). This is what motivated my work on the Public Key Directory to begin with.
I see where they are coming from, but I find “don’t” an unacceptable answer to “how do I encrypt email?”
I am optimistic that work on the fediverse angle could yield an acceptable answer for email style asynchronous encrypted messaging, I’ll certainly be paying attention there.
But for now, I’m more comfortable trying to train a small number of collaborators on s/mime opsec than I am trusting signal. That’s because I know the pitfalls of s/mime better than I know the pitfalls of PGP. But I’d still be more comfortable trying to train a small number of collaborators on PGP opsec than trusting signal.
I see where they are coming from, but I find “don’t” an unacceptable answer to “how do I encrypt email?”
My understanding is that to get to the point of asking about whether you should encrypt email, you first have to make the fundamental mistake of trying to use email for secure messaging. And so the answer really is not “don’t encrypt your email”, it’s “don’t use email for secure messaging”.
All attempts to do encrypted “secure” email are basically just theater. There are so many weak links, technical and social, in the email infrastructure that you can personally set up the most bad-ass hard-ass ultra-mega-NSA-mil-spec XP++++ Pro grade encryption on your end, but any recipient who sees the plaintext can still quite easily quote-reply with the plaintext, or forward it, etc.
So PGP is more the cherry on top of the fecal sundae that is trying to make email into an acceptably secure messaging system. Which is why the answers always try to redirect you to an actual secure messaging system.
So PGP is more the cherry on top of the fecal sundae that is trying to make email into an acceptably secure messaging system. Which is why the answers always try to redirect you to an actual secure messaging system.
Should I say TLS is the cherry on top of the fecal sundae that is trying to make TCP an acceptably secure transmission system, just because there are a whole pile of ways that one party or another can terminate TLS in the wrong place and turn it into security theater? Back when I used to do pen tests, one particularly fun trick was to cause Firefox or IE to log its session keys to a place where I could grab them. It was at least as good as most of the email fun you could pull off, and got you a lot more data.
I opened the bug on the Evolution (GNOME email client) tracker about how it silently turned off encryption on encrypted email messages when you replied, even if you had a public key for everyone else on the thread. I am 99% certain it took over 10 years for that to get fixed. (I believe it was over 15, but I can’t find the notification from when it was finally closed on this laptop.)
I viscerally appreciate what you’re saying about email not being fit for this purpose.
But until we get a modern answer with the properties of email that doesn’t involve me trusting signal or whatsapp, it’s easier for me to work around the deficiencies of email in this regard.
I’ve tried twice now to set up a secure, closed messaging system, which is an easier version of what I’m saying I want here. I tried once using email infrastructure and once using XMPP infrastructure. Email was smoother. I’d like a better alternative.
But until we get a modern answer with the properties of email that doesn’t involve me trusting signal or whatsapp, it’s easier for me to work around the deficiencies of email in this regard.
Leaving your door wide open because you don’t like a particular brand of lock does not strike me as a great security stance.
But then I also don’t know why you have issues with “trust” with Signal, which is open source.
Signal is not meaningfully open source. A single entity controls the (not open source) server, and the production of all binaries anyone uses.
Isn’t the entire point of client-side end-to-end encryption that it shouldn’t matter what the server is doing?
And the whole binary thing is kind of endemic to the mobile ecosystem for regular users (i.e., people who have never heard of F-Droid outside of possibly some spicy fanfics).
Sure, but the key distribution is done by their server for example. Lots of important parts under their control despite the e2ee.
By “key distribution”, are you referring to SignedPreKey bundles, which are signed by the IdentityKey stored on the user’s device?
Because this is an X25519 public key and a signature.
Calling it “key distribution” is vague and makes it almost sound sinister.
Who cares if their server is doing key distribution? Surely you’re not trusting any server’s key distribution, federated or not, if you really care about message security?
You can get someone’s signal public key without trusting signal’s server? How?
You can’t.
Safety numbers exist, but that’s a sucky mitigation.
Key transparency is a much better mechanism, but only WhatsApp has it.
but if the server were to provide the wrong public key, it would still be detectable right? can’t you still compare your public key to the public key that your contact has stored for you, if you are with them in person? wouldn’t that relieve you of the need to trust signal’s servers?
short of manually comparing keys, if you trust the client software on both ends, then exchanging messages and verifying in person that they were received should confirm that you have the right public keys on file, right? so that’s another way that you would not have to trust the server.
How often do people actually do this?
probably never. I just think we should be careful about saying “you can’t” exchange public keys without trusting Signal’s server. besides, is WhatsApp’s “key transparency” any different?
Yes. The security model of “key transparency” is that you can build automation that detects when an errant public key has been issued, or if the public key received isn’t in the ledger. You don’t have to rely on squishy humans to do anything.
Additionally, it creates a disincentive for the kind of nation state actors that might perform such an active attack against a target of interest. Where mechanisms like National Security Letters come with a built-in gag order, key transparency prevents stealth: in order to attempt to attack the other person’s communications, you have to first announce yourself to the network.
(This guarantee is much stronger with open source software and reproducible builds, of course.)
is that basically the same as a PGP key server?
is the WhatsApp key directory / audit record list actually publicly available? I haven’t been able to find it.
Nope. The PGP key server can lie about a lot of things. There is no audit trail (i.e., Merkle Tree proof of inclusion) to guarantee a time period. There are no witness co-signatures. There’s nothing.
For a while, GnuPG wouldn’t even bother to check the fingerprint of the received key versus what was requested.
Interesting. So does the WhatsApp public key directory actually exist or did they just write a blog post about it?
I take it it doesn’t exist?
@caleb, I’m currently visiting family for a much-needed vacation from technical work. Whether something exists or not is something you’re free to explore on your own. As I have no financial stake in anyone believing anything, I’m going to prioritize spending time with my nephews and drinking enough eggnog to make my physician give me the angry teacher stare when I return in January. I wish you the best.
I’m replying to you here (not the user you’re replying to, but I was interested when I saw you bump it), because it turns out that’s actually the max reply depth and so no-one can reply to you.
I googled around, found the key transparency whitepaper, which includes a link (page 11) to the audit proofs.
Fwiw, it appears Cloudflare audits these as a matter of course now, and has a dashboard showing the status of their auditing.
I was surprised to learn there’s a max depth to replies but not surprised @caleb is the one who plumbed it.
that’s really helpful! you’re a better googler than I am.
I thought the server was open source? (AGPL-3.0 license): https://github.com/signalapp/Signal-Server It seems to be actively developed; at the time of this writing, the last commit was 5 hours ago.
I seem to remember some people have verified the client binaries for Android at least once. I can’t seem to find a link for it though, perhaps my memory is faulty.
The server is open source except for the spam filter, which is pretty reasonable. I believe all the other server-side components, particularly SGX enclave implementations, are also open source (though I didn’t double-check).
Leaving your door wide open because you don’t like a particular brand of lock does not strike me as a great security stance.
I’m not leaving the door wide open. Like I said, it’s a closed system that we’re building on email protocols. We use filtering to prevent the kinds of accidental leaks you’re discussing in other messages. We hand out specifically configured clients. And we use servers configured to block plaintext messaging. We tried something similar with XMPP, but email was easier on the people responsible for infrastructure, and the people using the system liked the email interface better.
But then I also don’t know why you have issues with “trust” with Signal, which is open source.
One of the reasons is because in order to sign up, I have to surrender a phone number, and they are not open enough for me to tell what they might be doing with that. There’s no way for me to see what they’re actually doing on the server side so I can infer it, either.
Email requires you to “surrender” huge amounts of useful information – enough that the secret police might not even need to decrypt the body of the messages to get enough evidence to convict you.
But sure, a phone-based tool that initially used phone numbers (disposable ones are easily available, or so I hear) to avoid the need for centralized storage of things like contact lists and personal data is the real risk, and email is just fine and peachy and dandy. Now let me just go see who’s knocking so loudly at my door…
But I can run my own email server. And it’s trivial to give an account on there to anyone whose phone number I have (which is how you found people on Signal, last time I tried it…)
I’m not worrying about surrendering info to myself.
Thanks for the thought exercise. I would probably consider signal a great replacement for email if I could stand up my own server and have my contacts connect their signal apps to it. I suspect that’s a smaller lift now than it used to be, since they’ve started publishing server code again; at this point it might even just be a skill issue.
So you can do your own, more complicated and more easily seized thing that still likely leaks plaintext metadata to the secret police, and you think that’s better than something that uses a phone number – which you can easily obtain as a disposable resource – to bootstrap itself without requiring a centralized account system.
which you can easily obtain as a disposable resource
I have admittedly not yet tried very hard, but every place around here that I’ve checked requires a government ID for this.
Looking through the documentation on signal’s site, I don’t see anything explaining why they need it, nor making me expect that it’s disposable. Don’t I need to keep that number as long as I have my signal account?
They do a great job covering the crypto, which is good. Their privacy documentation does not explain, at least in any place that I’ve yet found, what they’re doing with my phone number and how they prevent it from being metadata associated with my encrypted messages.
more easily seized thing that still likely leaks plaintext metadata to the secret police
Signal and their infrastructure are in the same physical jurisdiction I’m in. So police that are interested in me can go after things in Signal’s hosting facility. I can host my own stuff in a different physical jurisdiction, where it’d be less easily seized than Signal’s. That is part of the appeal of self-hosting.
And our police aren’t secret. There are just signs that they’re becoming more worrisome.
Our conversation has persuaded me to continue learning about signal. I know you’re a serious person, and your technical judgement often aligns with mine, so I want to see if it’d work for what I’m putting together.
A thing I’m finding missing from the signal software is the ability to use hardware tokens for crypto. I’d like to have my private keys stored on an external token, and I don’t see a way to do that even if I spend time learning to self-host signal. It looks like that just isn’t a feature they offer, even for their desktop software.
It’s starting to look like, even if my trust issues could be resolved, what I want to do is simply more complicated than what signal supports at the moment.
I’d love to contribute to building something that uses signal’s crypto and protocols with hardware keys at the endpoints, an interface and behavior that’s more like email than SMS (particularly as regards attachments), coordinated on a self-hostable server.
But I need to get this bootstrapped at an in-person meeting that will take place between mid-December and late January. It’ll be a “party” where we distribute tokens that I’ll have sourced, generate keys, configure wireguard and FDE on everyone’s PCs, connect via wireguard to an email server which only accepts traffic over that wireguard connection, doesn’t relay traffic to other email servers, and refuses all plaintext mail.
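For the mail server described above, a minimal sketch of the “wireguard-only, no relaying, no plaintext” posture, assuming Postfix (the addresses and subnet are hypothetical):

# /etc/postfix/main.cf (excerpt, not a complete configuration)
inet_interfaces = 10.8.0.1          # listen only on the wireguard address
mynetworks = 10.8.0.0/24            # trust only the wireguard subnet
smtpd_tls_security_level = encrypt  # refuse any session that won't negotiate TLS
relay_domains =                     # never accept mail destined for other servers
default_transport = error:this server does not deliver mail externally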
I don’t think the thing I’d like to contribute to can be ready in that timeframe, and I don’t see where it’s been built already for me. I do think I can get the server I’ve described hosted in an appropriate place and configured.
I don’t think we can have a productive conversation about this because I fundamentally don’t think your apparent preference for spending huge amounts of effort kinda-sorta securing a thing that’s fundamentally un-securable is something that I can engage with or talk you out of. The lengths you’re prepared to go to in order to avoid the possibility of “just use a burner phone” frankly astound me.
We probably can’t. Your insistence on dismissing features like “hardware tokens for private keys”, “richer messaging than SMS (or, we need real attachments)” and “hosted in another jurisdiction” as if it boils down to “not wanting to use a burner phone” means we’re talking past each other.
Your confidence that reasonably anonymous burner phones can be had here is also, curiously, much higher than mine.
But thank you (sincerely!) for discussing it this far.
what makes @hoistbypetard’s proposed system “fundamentally un-securable” and why don’t the same fundamentals apply to Signal? it sounds like his proposal can be more secure because keys and software are set up in person. aren’t the fundamental limitations more on the Signal side, because they are targeting remote key and software distribution, which is much harder to secure? I would be really curious if you could unpack this to any degree.
anyone whose phone number I have (which is how you found people on Signal, last time I tried it…)
This was covered in the article. Adding someone on signal only requires a username these days. A phone number is required to sign up, which is damn easy to buy with cash in most parts of the world.
That’s good. Do you know why signal still needs a phone number from you, though?
Last time I tried to buy a phone number with cash here, the person selling it was pretty insistent on having a copy of a government-issued ID. This was on the east coast of the US. Had I cared more in that case, I think I very likely could’ve got one somewhere else without showing ID, but the ID requirement and the cameras all around the store didn’t make being able to use cash feel like it improved anonymity in any meaningful way.
For my application, I don’t actually mind giving my phone number to the people I’m messaging, and vice versa. We’re all within 2 degrees of each other IRL. I am not yet comfortable with signal having mine and my contacts’ numbers associated with each other.
you can personally set up the most bad-ass hard-ass ultra-mega-NSA-mil-spec XP++++ Pro grade encryption on your end, but any recipient who sees the plaintext can still quite easily quote-reply with the plaintext, or forward it, etc.
Exactly the same is true of signal though. Someone could screenshot your message and share that. If “the recipient could choose to willingly leak the plaintext” is in your threat model, there’s not much you can do.
In the case of Signal, the recipient has to choose to leak it and take active measures to do so.
In the case of email, it’s far too easy for the recipient to accidentally leak it, because they might not have the same super-ultra-opsec config the sender does.
Signal pushes people onto phones, which are more likely to be used in public places where people can see your messages over your shoulder.
Signal also supports iOS, where the binaries cannot be verified against the published code, so E2EE doesn’t mean much there.
The “just use Signal” mantra also doesn’t help because people are given a false sense of security. With email at least there is a general awareness of the need to be careful and consult a nerd if you actually want to protect your communications.
In the case of Signal, the recipient has to choose to leak it and take active measures to do so.
Smartphone users take screenshots of text all the time without thinking twice, and default smartphone configurations increasingly upload every file to the cloud automatically.
Excuse me, but I find some of these arguments quite weak.
It is true that Signal pushes people onto phones, but if other people glancing at your phone poses an actual threat, the problem is not the smartphone. If you are in that situation you need a well-calibrated threat model and a considerate OPSEC regime. There’s no way around that. And if you need it, it’s not like it’s hard to find a spot where glancing is impossible. Sit alone in a corner or go to a bathroom.
I think Signal has good reasons for the “smartphone first” paradigm.
For most people these days, the smartphone is their primary computer. It’s more common to have a smartphone than a laptop or desktop. One of the primary goals of Signal is to be accessible and that means being available on the most popular devices.
Smartphones are also more locked-down and more secure than general-purpose computers. I personally prefer using a laptop for most tasks, but that’s not for security reasons – I just find it more convenient and ergonomic.
I don’t believe it’s anywhere near reasonable to claim that E2EE is pointless on iOS. I don’t find this argument serious. AFAIK it’s still possible to use IDA Pro or Ghidra to analyze an iPhone app. Reproducible builds would be nice, but it’s not reasonable to claim E2EE isn’t possible just because reproducible builds aren’t available on a certain platform.
Smartphone users take screenshots of text all the time without thinking twice, and default smartphone configurations increasingly upload every file to the cloud automatically.
I think this is a good point! Again, for people where security matters, OPSEC is crucial.
Smartphones are also more locked-down and more secure than general-purpose computers.
Basic security precautions for non-profits and journalists in the United States, early 2019.
[…] - Don’t use an Android phone, use an iPhone instead.
The link doesn’t really explain how or why that’s the case. It also has statements like the one above, which seem counter-intuitive without more context. Is that to generalize your digital fingerprint or something similar?
It is true that Signal pushes people onto phones, but if other people glancing at your phone poses an actual threat, the problem is not the smartphone. If you are in that situation you need a well-calibrated threat model and a considerate OPSEC regime. There’s no way around that. And if you need it, it’s not like it’s hard to find a spot where glancing is impossible. Sit alone in a corner or go to a bathroom.
@ubernostrum said the user has to take deliberate measures to leak data with Signal, but is liable to leak data by accident with PGP. so we are talking about users who don’t have a considerate OPSEC regime, and I’m pointing out that Signal is not foolproof for those users either. we’re saying the same thing.
I think Signal has good reasons for the “smartphone first” paradigm.
For most people these days, the smartphone is their primary computer. It’s more common to have a smartphone than a laptop or desktop. One of the primary goals of Signal is to be accessible and that means being available on the most popular devices.
it’s not just “smartphone first,” it’s “you absolutely need a smartphone to use Signal.” the organization is not starved for resources, and they have decided to implement useless things like sealed sender when they could have implemented a non-smartphone SMS workflow for account creation and authentication. it certainly doesn’t seem like accessibility is a primary goal.
Smartphones are also more locked-down and more secure than general-purpose computers.
not for the threat models that Signal claims to accommodate.
I don’t believe it’s anywhere near reasonable to claim that E2EE is pointless on iOS. I don’t find this argument serious. AFAIK it’s still possible to use IDA Pro or Ghidra to analyze an iPhone app.
are you saying backdoors are detectable in iOS apps? that is news to me.
Reproducible builds would be nice, but it’s not reasonable to claim E2EE isn’t possible just because reproducible builds aren’t available on a certain platform.
the benefit of E2EE is that you don’t have to trust the service not to log your communications. if the service is not trusted, then what reason is there to believe their claim that an app uses E2EE and is not backdoored? under what threat model would you not trust a service to scan your communications if they passed through their servers in plaintext, but you would trust them to faithfully provide binaries with E2EE that is not broken?
Smartphone users take screenshots of text all the time without thinking twice, and default smartphone configurations increasingly upload every file to the cloud automatically.
I think this is a good point! Again, for people where security matters, OPSEC is crucial.
and yet you still think smartphones are more secure?
Why do you feel Sealed Sender is useless? This is actually one of my primary examples of Signal’s centralized model having advantages - they were able to simply eliminate a swath of metadata unilaterally.
Sealed sender seems like a perfect example of the confusion around Signal’s security model. As far as I can tell, it doesn’t hide the sender in practice, because Signal can still see the sender’s IP address and can build a near-perfect mapping between users and their last used IP address. In theory maybe you can deliberately share IP addresses between a few accounts with the right VPN setup, providing a small anonymity set for the sender, but that’s still largely broken if there’s an ongoing conversation where Signal sees the recipient of every message.
Nobody seems willing to argue that this mostly useless security enhancement is worth the complexity. It seems like Signal added it mostly because it’s a fairly novel feature that they can write a blog post about and claim that they’re “raising the bar.” In other words, they’re willing to add complexity that provides little practical benefit for marketing purposes, which is awfully concerning for an app that claims to be suitable for security-critical applications.
I could easily be missing something; if you think I am I would love to hear it. I think the part about the VPN setup might actually be wrong and overly generous to Signal.
No, it’s not exactly the same. Screenshotting a Signal message and sharing that is a deliberate act. Replying or forwarding a decrypted PGP message in plaintext is done all the time by accident.
Email is insecure. Even with PGP, it’s default-plaintext, which means that even if you do everything right, some totally reasonable person you mail, doing totally reasonable things, will invariably CC the quoted plaintext of your encrypted message to someone else (we don’t know a PGP email user who hasn’t seen this happen).
it’s also perfectly feasible for a smartphone user to screenshot a message for reference and have it automatically uploaded to the cloud by their smartphone.
the evidence for the problem on the PGP side is “we don’t know a PGP email user who hasn’t seen this happen,” but in the smartphone case you wouldn’t see it happen, which is even worse. we don’t have the data to say that it’s an issue in one case and not the other.
sure, you can take screenshots on a computer, but you are not encouraged to by a horrible touch-based UI, and configurations that automatically suck your files into the cloud are much less common.
No. PGP email and S/MIME email clients attempt to solve problems that age does not attempt to solve. The interesting one, IMO, is associating public keys with recipient identities.
To use age for someone other than yourself, you need to get their key out-of-band and encrypt for it:
$ age -o example.jpg.age -r age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p \
-r age1lggyhqrw2nlhcxprm67z43rta597azn8gknawjehu9d9dl0jq3yqqvfafg example.jpg
That’s part of what PGP and S/MIME clients are doing for you.
I don’t know what you mean by “out of band” for PGP — or rather, I don’t know what you would mean by “in band”. Practically speaking, whenever I’ve used PGP mail, I got the public key from a webpage or other “out of band” (?) source. The “web of trust” has never really worked in a general way. So there doesn’t seem like much of a difference, except there’s a local key store, and it would be pretty easy to leverage systemwide keychains now that OSes have them.
“In-band” vs “out-of-band” wasn’t quite the correct term. I mean that I could connect a PGP client or an S/MIME client up to a directory server of some sort, or that I could configure them to trust certain key signers, and they can encrypt messages for a recipient without me having to paste a key for each recipient in.
Whereas what’s being described for age means that I’d need to have each person’s key somewhere, paste it into the command, etc.
The email clients are trying (or did try) to solve this and age clients are not (yet).
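A recipients file at least removes the per-recipient pasting, though it does nothing for the discovery problem; a sketch reusing the keys from the example above:

$ printf '%s\n' \
      age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p \
      age1lggyhqrw2nlhcxprm67z43rta597azn8gknawjehu9d9dl0jq3yqqvfafg > recipients.txt
$ age -R recipients.txt -o example.jpg.age example.jpg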
Understood. In an ideal world there is an integrated and secure experience between mail client, directory/keyserver, and local keychain. PGP clients do this in theory, but because there’s no trusted directory (hence soatok’s Public Key Directory project), the practice falls short of the theory.
In the spirit of age (just make the right choices and version the whole thing) perhaps there’s a combination of newer building blocks that would give a smooth experience with better security and practical interoperability. I mean, other than Signal.
It sounds like something they “plan” to work on. It doesn’t sound like something I can tell someone to use now instead of PGP. Which is what the title of the article promised.
Most people, 99% of the time, just care about things like backups, sending files, or signing things. They don’t care about binding the trust relationship between a public key and an “identity” (term left vague due to different social interpretations of the word).
The article delivers on the things people care about, and says “I’m working on it” for the nerd use cases.
Every time I see something written by someone in or adjacent to the cryptography community, I can’t help but wonder why these people are so toxic. What is the point of writing something like the following?
The part of the free and open source software community that thinks PGP is just dandy are the same kind of people that happily use XMPP+OMEMO, Matrix, or weird Signal forks that remove forward secrecy and think it’s fine.
It’s a shame, because the topic seems very interesting, but the community’s insistence on coming across as berating, arrogant know-it-alls is really off-putting for a beginner.
Reading the linked articles for the text you excerpted, it seems reasonable that they would hold disdain for those tools.
It’s frankly weird that tech people interpret criticism of a communications tool as throwing shade at the users of said tool. Nobody is being toxic to you.
I’m sure there’s some deeply interesting takeaways here, but I’m not a psych major.
It’s frankly weird that tech people interpret criticism of a communications tool as throwing shade at the users of said tool. Nobody is being toxic to you.
but:
The part of the free and open source software community that thinks PGP is just dandy are the same kind of people that happily use XMPP+OMEMO, Matrix, or weird Signal forks that remove forward secrecy and think it’s fine.
People speak strongly about these things because there are places in the world where listening to the wrong advice can get someone tortured to death by the secret police.
I suspect that it only feels harsh to non-experts because we don’t know enough to judge whether someone’s harshness is justified.
An astrophysicist being frustrated at flat-earthers is not “toxic” (well, unless you are one of the people who thinks it’s never okay to show any emotion or else your argument is invalid).
Eh? The horrible UX of PGP is a pretty big reason why people like Soatok rail against it.
And this is just my opinion, but Matrix has horrendous UX. Somehow worse than Discord, I cannot comprehend it. Matrix’s encrypted stuff especially loves to break, which is a bad user experience.
Matrix & most XMPP clients tend to drop once-trusted keys when a users hasn’t used a client again & it is a royal PITA to get the other side to retrust an expired key. They use the same double ratchet encryption as Signal, but you are allowed to host your own decentralized server, & use multiple non-standard clients. Part of the UX issue with Matrix those is using web tech for Element’s chat (alternatives have nowhere near the features, fallbacks suck, & users don’t seem to understand non-Element clients might exist), but also waiting, & waiting for servers to ‘sync’ if you don’t keep a client perpetually open—& this is by design from the protocol (apparently you don’t need a few of the latest messages to get back into a conversation, instead you need an entire clone of the history on your machine or it won’t operate).
I’m not sure I would say that PGP/GnuPG was ever very widespread, and it certainly hasn’t been gaining traction for the past 15 (20?) years or so. In 2024, I won’t hesitate to say that GnuPG is basically dead.
I have a lot of respect for what Phil Zimmermann did in the 90s, but that was a long time ago – cryptography has made huge leaps since, especially when we talk about applied cryptography. GnuPG is simply not very relevant these days. We have better tools for the problems that GnuPG attempted to solve. The exception being an e-mail-like system with good security properties – which is also discussed in the article.
I’m not an expert in this space, but my interpretation of this fact is not that “it takes nine separate tools to equal what GnuPG can do,” but rather “GnuPG performs nine different functions that—with the benefit of hindsight—did not need to be mashed together into one tool.”
Yes, and I agree with the interpretation. That’s the point that everybody has been making during all this time.
However, the initial question was:
When will the PGP-induced madness end?
And my observation was that we need to try to understand why it hasn’t.
The point being, we need to understand why all these features, “wrongly” mashed together into one tool, are still being widely used when theoretically better alternatives focused on a single problem exist – even after considerable effort to promote them and “educate” the public.
If we figure that out, we can probably move forward.
One explanation, covered in the original Latacora article, is that PGP “is also a social network, and a subculture.”
The real world security of PGP has not been publicly tested, because it’s not very widely used, the people using it are mostly people not targeted by law enforcement, and its UX is not amenable to easy use by people who are targeted by law enforcement in the developed world, namely criminals.
It’s interesting to me that they specifically call out SSH signatures as being a good choice for git commits, but then they point to “sigstore” (which isn’t packaged in apt) for signing software you distribute.
I am 100% on board with PGP being hot garbage, and I’d love to switch away from it, but I’m not going to switch to an alternative that’s not in apt. Signing your software with SSH keys is pretty easy to do, and it’s also really easy to distribute your SSH keys across multiple various code hosts to ensure availability. Last I checked the main downside is that verifying a signature generated by ssh-keygen kinda sucks; you have to construct an allowed_signers file and run a cumbersome command. Is this shitty PGP-like usability the main reason SSH isn’t recommended for this?
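For reference, the dance looks roughly like this (the identity string and file names are made up):

$ ssh-keygen -Y sign -f ~/.ssh/id_ed25519 -n file release.tar.gz    # writes release.tar.gz.sig
$ awk '{ print "alice@example.com", $1, $2 }' ~/.ssh/id_ed25519.pub > allowed_signers
$ ssh-keygen -Y verify -f allowed_signers -I alice@example.com -n file \
      -s release.tar.gz.sig < release.tar.gz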
In theory, a lot of these tools all support the same Ed25519 keys, right? Could you generate signatures using your SSH key and verify them with minisign or something?
It seems like this is possible. I was able to unpack SSH public & private ed25519 key files, and an SSH signature file, and reproduce both the signed content and verification in nacl in python. (Rust code to unpack these SSH files ended up in bitbottle if you like sample code.)
There isn’t a recommendation for encrypted email because that’s not a thing people should be doing.
Now, there exists a minority of extremely technical computer user for which Signal is a nonstarter (because you need a smartphone and valid phone number to enroll in the first place).
This is a misconception. The author seems to imply that only “extremely technical computer users” use e-mail, or at least that only “extremely technical computer users” would want to encrypt their e-mail, because if they weren’t “extremely technical”, they would be using Signal.
There are absolutely valid reasons to not be “extremely technical” and still write an e-mail, the prime one probably being a preference for asynchronous communication over synchronous communication, which has nothing to do with being “extremely technical” or not. Otherwise, writing paper letters via snail mail would be something only “extremely technical computer users” do. In business communication, e-mail is also still relevant.
These days, e-mail goes mostly unencrypted (ignoring transport encryption for the moment), but that was the case with instant messaging as well before WhatsApp started to encrypt everything by default. If the author’s logic were correct, only “extremely technical computer users” would have wanted to encrypt their WhatsApp messages, and by that measure, WhatsApp should never have introduced end-to-end encryption.
Bottom line: synchronous and asynchronous communication are both valid approaches to human messaging in their own right and cannot be substituted for one another, and none of this has anything to do with “extremely technical computer users“.
I mean, I agree with that article if we are talking about long-term private communication. As noted before, for that we have much better channels. However, for “short-lived-private-communication-that-will-be-public-soon-enough-anyway”, aka security reports and CVEs that will be published after a patch is available, email is still one of the easiest and most available solutions. It doesn’t require creating an account anywhere (you probably already have an email address), it is widely supported, it is accessible to anyone, it can be disposable, and so on.
I’d say that’s a misinterpretation of the position taken. It’s not about “want”, it’s about what exists. The position is that however much people might want it, there is no good way of doing it, and no path to a good way of doing it, and thus it can’t be recommended as something you do. And even if they would prefer secure email if it existed, because it doesn’t they should pick something else if the goal is security, even if it involves going against a preference (like being chat-like systems).
I think the position was not that it’s impossible, just that it doesn’t exist because cryptographers are focused on the needs of poor people rather than users whom their employers can derive revenue from, and therefore they haven’t gotten around to implementing a good solution yet.
Signal as a recommended replacement for email (or xmpp, or matrix, or whatever other messaging system) always baffles me. Signal has an abysmal UX, especially on desktop, uniquely so among its competitors.
Signal has an abysmal UX, especially on desktop, uniquely so among its competitors.
What’s bad about the Signal desktop UX? I like it myself; I’m wondering what you’re looking for that I’m evidently not. (My only complaint is that it makes it hard to copy someone else’s emoji reaction.)
To be honest, I usually like posts from this author. And this content lists a few interesting pieces of software that I’ve never heard of.
But the complete dismissal of encrypted emails by the author with (I’m paraphrasing) “Nobody needs that, see the XY problem. Only a few stubborn nerds want encrypted emails”, makes them almost sound like an arrogant jerk… :(
I find it disappointing, I would have loved a proposed alternative for secure, asynchronous, archive-able, federated, long-form communication. Not “just use signal, bro…”
The author is a cryptographer. What issues are you thinking of that PGP is designed to avoid that age does not? I haven’t heard of any in the 5+ years that age has been around. (I think your comment is bad-faith FUD.)
The author is a cryptographer? Alright. Isn’t everyone who wrote ANY more or less popular encryption tool or software that has to do with cryptography a cryptographer as well? Nonetheless, bugs in them have been getting discovered years and decades afterwards.
I’ve been playing with age and passage for a couple of hours tonight, and then I discovered that I can actually use my ssh ed25519 keypair with age as well. Also, given that these keys are usually published on github.com/[username].keys, there is a directory of many of these for users I would like to interact with :).
So far I’ve been generating temporary keys for testing, but I now have keys for most of my colleagues, and can send encrypted files to them using age/passage.
Would be nice if age would also talk to my ssh-agent, so I don’t need to re-enter my passphrase a million times, but hopefully something like that will be built and merged soon.
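A minimal sketch of that workflow (the GitHub username is hypothetical, and it assumes the published keys are types age supports, i.e. ed25519 or RSA):

$ curl -s https://github.com/alice.keys > alice.pub            # their published ssh public keys
$ age -R alice.pub -o report.pdf.age report.pdf                # encrypt to every key in the file
$ age -d -i ~/.ssh/id_ed25519 -o report.pdf report.pdf.age     # they decrypt with their ssh private key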
Sadly the ssh-agent protocol only supports signing, not decryption, so age cannot use it. I get the impression that if you want low-interaction decryption, age requires a plugin. (Most of the difficult parts of encrypting and decrypting files, namely anything to do with managing keys, are not in scope for age: age does not do safe storage of and controlled access to private keys; age does not do authentication of public keys.)
The ssh-agent protocol is annoyingly restricted for good reasons: it only supports authentication, i.e. signing, i.e. proof of possession of the private key. And it is a bad idea to re-use a key between signing and decryption, because if you aren’t careful an attacker can fool you or your software into decrypting something when you thought you were signing something else. This is why modern cryptographic protocols include context strings, as protection against confused deputies. (There’s also some cryptographic judo involved in adapting an ed25519 signing key to an x25519 key for key exchange and, from there, encryption.)
By contrast, the gpg-agent protocol is weirdly trusting. Whereas the ssh-agent holds your keys and acts as a signing oracle (like an HSM, but software), the gpg-agent stores your passphrase and hands it out to any software that asks. It also, weirdly, provides secure random numbers. When I have worked with gpg, I thought again and again that it drew its API boundaries in awkward places that made things more difficult than they should be.
It is of my opinion as a security engineer that specializes in applied cryptography that nobody should use PGP, because there’s always a better tool for the job you want to use PGP for.
…
Now, there exists a minority of extremely technical computer user for which Signal is a nonstarter (because you need a smartphone and valid phone number to enroll in the first place).
Because those people are generally not the highest priority of cryptographers (who are commonly focused on the privacy of people in poor countries where smartphones are more common than desktop computers), there presently isn’t really a good recommendation for private messaging that meets their constraints.
So there’s always a better tool for the job, except when there isn’t. Also, cryptographers are commonly focused on the privacy of people in poor countries. You heard it here first!
glad to see this. more awareness about the pgp issue even if it’s just mainly “here’s the latacora article again” is sorely needed. switching to the alternatives that applied to my use cases (ssh for commit signing, age for file encryption) has made it so much nicer for me
It seems like every post I see from this person makes me respect them less. They seem to have no care for what people want or need or what any actual problems or solutions might be and instead just like to list their favourite tech and say “use this”.
I wish the security industry would stop holding onto sacred cows, and stop having so many of them.
Choosing security tools is a lot about trust and it’s hard to trust new things, especially when the market is so saturated with garbage nowadays.
Aside from everyone becoming security and cryptography experts and auditing every line of security-sensitive code we use, the best we can do is find people we trust who do that kind of thing.
This (and the linked PGP problem article) sounds like the moving-the-goalposts fallacy. Of course, if you redefine the goals, you get different solutions: a different answer to a different question.
One thing to note, too: while this article makes no argument against OpenPGP directly, the article it links to is mostly about issues with v4, which are fixed in v6.
Is it practical to run OpenPGP in a “v6-only mode” that does not expose you to any of the v4 faults? I’m under the impression that a large part of the problem with PGP is due to its emphasis on backward compatibility. Coming out with new and better crypto is great, but it may not be much practical help if your tools all still support the old stuff.
Very much depends on the tool, of course. But if you’re building something new, I’d say an OpenPGP v6-only subset is a great choice that will still have broader compatibility than the various single-implementation options.
What’s the status of the disagreement between GnuPG and OpenPGP? It sounded like there was going to be an interop failure which would make it difficult to reliably use better ciphers.
re: age - the author calls out in the post that there are two implementations - (age and rage) which is cool. But for me, the most important aspect is that age actually has a specification: https://age-encryption.org/v1 - so technically anyone can implement age in any language in which the crypto primitives are available. And once you are done, you can validate your implementation with: https://github.com/C2SP/CCTV/tree/main/age
This is definitely a huge plus, I helped write a Kotlin/JVM implementation for use in Android Password Store and we got great value out of the standardized test suite which paired nicely with JUnit’s dynamic tests feature to give us robust coverage for very little code.
I am aware that you have retired from the Android Password Store project. (Thank you for your work on it. I use it everyday.) But did age support make it into the app before that?
Unfortunately not
This line caught me off guard 😂
The one thing @soatok does not address at all is the original use case that PGP was designed for, as opposed to all the other ones people started using it for in time between when it was designed and when better options came along: encrypted email.
The closest they get is saying to “use signal” for secure messaging. But I do not trust signal and none of the reasons relate to the need to give them my phone number. And while @soatok correctly points out that my phone number need no longer be revealed to people who want to communicate with me, it still looks like signal-the-org demands that I give it to them.
If you want people to stop using PGP, you need to provide an alternative for its original purpose. And an alternative that requires me to send all my stuff through one org, even if I didn’t need to give them my phone number (and it looks like I do) is not adequate.
I know S/MIME addresses some of the concerns one might have about PGP email, but I wouldn’t say it’s better on balance, and I don’t think S/MIME as implemented in those mailers that still implement it addresses any of the criticisms from this article in any way.
Every time I see an article like this, I get optimistic that there will be a reasonable proposal for encrypted email. There never is.
The 2019 Latacora post handles this directly.
Aside from Signal, Soatok seems to address this use case (private messaging) from a different direction:
I see where they are coming from, but I find “don’t” an unacceptable answer to “how do I encrypt email?”
I am optimistic that work on the fediverse angle could yield an acceptable answer for email style asynchronous encrypted messaging, I’ll certainly be paying attention there.
But for now, I’m more comfortable trying to train a small number of collaborators on s/mime opsec than I am trusting signal. That’s because I know the pitfalls of s/mime better than I know the pitfalls of PGP. But I’d still be more comfortable trying to train a small number of collaborators on PGP opsec than trusting signal.
My understanding is that to get to the point of asking about whether you should encrypt email, you first have to make the fundamental mistake of trying to use email for secure messaging. And so the answer really is not “don’t encrypt your email”, it’s “don’t use email for secure messaging”.
All attempts to do encrypted “secure” email are basically just theater. There are so many weak links, technical and social, in the email infrastructure that you can personally set up the most bad-ass hard-ass ultra-mega-NSA-mil-spec XP++++ Pro grade encryption on your end, but any recipient who sees the plaintext can still quite easily quote-reply with the plaintext, or forward it, etc.
So PGP is more the cherry on top of the fecal sundae that is trying to make email into an acceptably secure messaging system. Which is why the answers always try to redirect you to an actual secure messaging system.
Should I say TLS is the cherry on top of the fecal sundae that is trying to make TCP an acceptably secure transmission system, just because there are a whole pile of ways that one party or another can terminate TLS in the wrong place and turn it into security theater? Back when I used to do pen tests, one particularly fun trick was to cause Firefox or IE to log its session keys to a place where I could grab them. It was at least as good as most of the email fun you could pull off, and got you a lot more data.
I opened the bug on the Evolution (GNOME email client) tracker about how it silently turned off encryption on encrypted email messages when you replied, even if you had a public key for everyone else on the thread. I am 99% certain it took over 10 years for that to get fixed. (I believe it was over 15, but I can’t find the notification from when it was finally closed on this laptop.)
I viscerally appreciate what you’re saying about email not being fit for this purpose.
But until we get a modern answer with the properties of email that doesn’t involve me trusting signal or whatsapp, it’s easier for me to work around the deficiencies of email in this regard.
I’ve tried twice now to set up a secure, closed messaging system, which is an easier version of what I’m saying I want here. I tried once using email infrastructure and once using XMPP infrastructure. Email was smoother. I’d like a better alternative.
Leaving your door wide open because you don’t like a particular brand of lock does not strike me as a great security stance.
But then I also don’t know why you have issues with “trust” with Signal, which is open source.
Signal is not meaningfully open source. A single entity controls the (not open source) server, and the production of all binaries anyone uses.
Isn’t the entire point of client-side end-to-end encryption that it shouldn’t matter what the server is doing?
And the whole binary thing is kind of endemic to the mobile ecosystem for regular users (i.e., people who have never heard of F-Droid outside of possibly some spicy fanfics).
Sure, but the key distribution is done by their server for example. Lots of important parts under their control despite the e2ee
By “key disytibution”, are you referring to SignedPreKey bundles, which are signed by the IdentityKey stored on the user’s device?
Because this is an X25519 public key and a signature.
Calling it “key distribution” is vague and makes it almost sound sinister.
Who cares if their server is doing key distribution? Surely you’re not trusting any server’s key distribution, federated or not, if you really care about message security?
You can get someone’s signal public key without trusting signal’s server? How?
You can’t.
Safety numbers exist, but that’s a sucky mitigation.
Key transparency is a much better mechanism, but only WhatsApp has it.
but if the server were to provide the wrong public key, it would still be detectable right? can’t you still compare your public key to the public key that your contact has stored for you, if you are with them in person? wouldn’t that relieve you of the need to trust signal’s servers?
short of manually comparing keys, if you trust the client software on both ends, then exchanging messages and verifying in person that they were received should confirm that you have the right public keys on file, right? so that’s another way that you would not have to trust the server.
How often do people actually do this?
probably never. I just think we should be careful about saying “you can’t” exchange public keys without trusting Signal’s server. besides, is WhatsApp’s “key transparency” any different?
Yes. The security model of “key transparency’ is that you can build automation that detects when an errant public key has been issued, or if the public key received isn’t in the ledger. You don’t have to rely on squishy humans to do anything.
Additionally, it creates a disincentive for the kind of nation state actors that might perform such an active attack against a target of interest. Whereas we see mechanisms like National Security Letters with a built-in gag order, key transparency prevents stealth. In order to attempt to attack the other person’s communications, you have to first announce yourself to the network.
(This guarantee is much stronger with open source software and reproducible builds, of course.)
is that basically the same as a PGP key server?
is the WhatsApp key directory / audit record list actually publicly available? I haven’t been able to find it.
Nope. The PGP key server can lie about a lot of things. There is no audit trail (i.e., Merkle Tree proof of inclusion) to guarantee a time period. There are no witness co-signatures. There’s nothing.
For a while, GnuPG wouldn’t even bother to check the fingerprint of the received key versus what was requested.
Interesting. So does the WhatsApp public key directory actually exist or did they just write a blog post about it?
I take it it doesn’t exist?
@caleb, I’m currently visiting family for a much-needed vacation from technical work. Whether something exists or not is something you’re free to explore on your own. As I have no financial stake in anyone believing anything, I’m going to prioritize spending time with my nephews and drinking enough eggnog to make my physician give me the angry teacher stare when I return in January. I wish you the best.
I’m replying to you here (not the user you’re replying to, but I was interested when I saw you bump it), because it turns out that’s actually the max reply depth and so no-one can reply to you.
I googled around, found the key transparency whitepaper, which includes a link (page 11) to the audit proofs.
Fwiw, it appears Cloudflare audits these as a matter of course, now, and have a dashboard showing the status of their auditing.
I was surprised to learn there’s a max depth to replies but not surprised @caleb is the one who plumbed it.
that’s really helpful! you’re a better googler than I am.
I thought the server was open source? (AGPL-3.0 license): https://github.com/signalapp/Signal-Server Seems to get actively published to, at the time of this writing the last commit was 5 hours ago.
I seem to remember some people have verified the client binaries for Android at least once. I can’t seem to find a link for it though, perhaps my memory is faulty.
The server is open source except for the spam filter, which is pretty reasonable. I believe all the other server-side components, particularly SGX enclave implementations, are also open source (though I didn’t double-check).
I’m not leaving the door wide open. Like I said, it’s a closed system that we’re building on email protocols. We use filtering to prevent the kinds of accidental leaks you’re discussing in other messages. We hand out specifically configured clients. And we use servers configured to block plaintext messaging. We tried something similar with XMPP, but it was easier on the people who were responsible for infrastructure using email things, and people using the system liked the email interface better.
One of the reasons is because in order to sign up, I have to surrender a phone number, and they are not open enough for me to tell what they might be doing with that. There’s no way for me to see what they’re actually doing on the server side so I can infer it, either.
Email requires you to “surrender” huge amounts of useful information – enough that the secret police might not even need to decrypt the body of the messages to get enough evidence to convict you.
But sure, a phone-based tool that initially used phone numbers (disposable ones are easily available, or so I hear) to avoid the need for centralized storage of things like contact lists and personal data is the real risk, and email is just fine an peachy and dandy. Now let me just go see who’s knocking so loudly at mydoor…
But I can run my own email server. And it’s trivial to give an account on there to anyone whose phone number I have (which is how you found people on Signal, last time I tried it…)
I’m not worrying about surrendering info to myself.
Thanks for the thought exercise. I would probably consider signal a great replacement for email if I could stand up my own server and have my contacts connect their signal apps to it. I suspect that’s a smaller lift now than it used to be, since they’ve started publishing server code again, and might hopefully even simply be a skill issue at this point.
So you can do your own, more complicated and more easily seized thing that still likely leaks plaintext metadata to the secret police, and you think that’s better than something that uses a phone number – which you can easily obtain as a disposable resource – to bootstrap itself without requiring a centralized account system.
I have admittedly not yet tried very hard, but every place around here that I’ve checked requires a government ID for this.
Looking through the documentation on signal’s site, I don’t see anything explaining why they need it, nor making me expect that it’s disposable. Don’t I need to keep that number as long as I have my signal account?
They do a great job covering the crypto, which is good. Their privacy documentation does not explain, at least in any place that I’ve yet found, what they’re doing with my phone number and how they prevent it from being metadata associated with my encrypted messages.
Signal and its infrastructure are in the same physical jurisdiction I’m in. So police that are interested in me can go after things in Signal’s hosting facility. I can host my own stuff in a different physical jurisdiction, where it’d be less easily seized than Signal’s. That is part of the appeal of self-hosting.
And our police aren’t secret. There are just signs that they’re becoming more worrisome.
Our conversation has persuaded me to continue learning about signal. I know you’re a serious person, and your technical judgement often aligns with mine, so I want to see if it’d work for what I’m putting together.
A thing I’m finding missing from the signal software is the ability to use hardware tokens for crypto. I’d like to have my private keys stored on an external token, and I don’t see a way to do that even if I spend time learning to self-host signal. It looks like that just isn’t a feature they offer, even for their desktop software.
It’s starting to look like, even if my trust issues could be resolved, what I want to do is simply more complicated than what signal supports at the moment.
I’d love to contribute to building something that uses signal’s crypto and protocols with hardware keys at the endpoints, an interface and behavior that’s more like email than SMS (particularly as regards attachments), coordinated on a self-hostable server.
But I need to get this bootstrapped at an in-person meeting that will take place between mid-December and late January. It’ll be a “party” where we distribute tokens that I’ll have sourced, generate keys, configure wireguard and FDE on everyone’s PCs, connect via wireguard to an email server which only accepts traffic over that wireguard connection, doesn’t relay traffic to other email servers, and refuses all plaintext mail.
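For the mail server piece, a minimal sketch of what I have in mind, assuming Postfix (the WireGuard address and subnet here are made up, and this only covers the transport side; rejecting unencrypted message bodies would need a content filter on top):

  # listen for SMTP only on the wireguard interface
  postconf -e "inet_interfaces = 10.8.0.1"
  # trust only the wireguard subnet
  postconf -e "mynetworks = 10.8.0.0/24"
  # refuse any SMTP session that won't negotiate TLS
  postconf -e "smtpd_tls_security_level = encrypt"
  # never relay mail outward to other servers
  postconf -e "default_transport = error:outbound mail disabled"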
I don’t think the thing I’d like to contribute to can be ready in that timeframe, and I don’t see where it’s been built already for me. I do think I can get the server I’ve described hosted in an appropriate place and configured.
I don’t think we can have a productive conversation about this because I fundamentally don’t think your apparent preference for spending huge amounts of effort kinda-sorta securing a thing that’s fundamentally un-securable is something that I can engage with or talk you out of. The lengths you’re prepared to go to in order to avoid the possibility of “just use a burner phone” frankly astound me.
We probably can’t. Your insistence on dismissing features like “hardware tokens for private keys”, “richer messaging than SMS (or, we need real attachments)” and “hosted in another jurisdiction” as if it boils down to “not wanting to use a burner phone” means we’re talking past each other.
Your confidence that reasonably anonymous burner phones can be had here is also, curiously, much higher than mine.
But thank you (sincerely!) for discussing it this far.
what makes @hoistbypetard’s proposed system “fundamentally un-securable” and why don’t the same fundamentals apply to Signal? it sounds like his proposal can be more secure because keys and software are set up in person. aren’t the fundamental limitations more on the Signal side, because they are targeting remote key and software distribution, which is much harder to secure? I would be really curious if you could unpack this to any degree.
This was covered in the article. Adding someone on signal only requires a username these days. A phone number is required to sign up, which is damn easy to buy with cash in most parts of the world.
That’s good. Do you know why signal still needs a phone number from you, though?
Last time I tried to buy a phone number with cash here, the person selling it was pretty insistent on having a copy of a government-issued ID. This was on the east coast of the US. Had I cared more in that case, I think I very likely could’ve got one somewhere else without showing ID, but the ID requirement and the cameras all around the store didn’t make being able to use cash feel like it improved anonymity in any meaningful way.
For my application, I don’t actually mind giving my phone number to the people I’m messaging, and vice versa. We’re all within 2 degrees of each other IRL. I am not yet comfortable with signal having mine and my contacts’ numbers associated with each other.
Many places I have been require a copy of your passport to get a SIM card.
Exactly the same is true of signal though. Someone could screenshot your message and share that. If “the recipient could choose to willingly leak the plaintext” is in your threat model, there’s not much you can do.
In the case of Signal, the recipient has to choose to leak it and take active measures to do so.
In the case of email, it’s far too easy for the recipient to accidentally leak it, because they might not have the same super-ultra-opsec config the sender does.
Signal pushes people onto phones, which are more likely to be used in public places where people can see your messages over your shoulder.
Signal also supports iOS, where the binaries cannot be verified against the published code, so E2EE doesn’t mean much there.
The “just use Signal” mantra also doesn’t help because people are given a false sense of security. With email at least there is a general awareness of the need to be careful and consult a nerd if you actually want to protect your communications.
Smartphone users take screenshots of text all the time without thinking twice, and default smartphone configurations increasingly upload every file to the cloud automatically.
Excuse me, but I find some of these arguments quite weak.
It is true that Signal pushes people onto phones, but if other people glancing at your phone poses an actual threat, the problem is not the smartphone. If you are in that situation you need a well-calibrated threat model and a considerate OPSEC regime. There’s no way around that. And if you need it, it’s not like it’s hard to find a spot where glancing is impossible. Sit alone in a corner or go to a bathroom.
I think Signal has good reasons for the “smartphone first” paradigm.
For most people these days, the smartphone is their primary computer. It’s more common to have a smartphone than a laptop or desktop. One of the primary goals of Signal is to be accessible and that means being available on the most popular devices.
Smartphones are also more locked-down and more secure than general-purpose computers. I personally prefer using a laptop for most tasks, but that’s not for security reasons – I just find it more convenient and ergonomic.
I don’t believe it’s anywhere near reasonable to claim that E2EE is pointless on iOS. I don’t find this argument serious. AFAIK it’s still possible to use IDA Pro or Ghidra to analyze an iPhone app. Reproducible builds would be nice, but it’s not reasonable to claim E2EE isn’t possible just because reproducible builds aren’t available on a certain platform.
I think this is a good point! Again, for people where security matters, OPSEC is crucial.
The link doesn’t really explain how or why that’s the case. It also has statements like the one above, which seem counter-intuitive without enough context. Is that to generalize your digital fingerprint or something similar?
@ubernostrum said the user has to take deliberate measures to leak data with Signal, but is liable to leak data by accident with PGP. so we are talking about users who don’t have a considerate OPSEC regime, and I’m pointing out that Signal is not foolproof for those users either. we’re saying the same thing.
it’s not just “smartphone first,” it’s “you absolutely need a smartphone to use Signal.” the organization is not starved for resources, and they have decided to implement useless things like sealed sender when they could have implemented a non-smartphone SMS workflow for account creation and authentication. it certainly doesn’t seem like accessibility is a primary goal.
not for the threat models that Signal claims to accommodate.
are you saying backdoors are detectable in iOS apps? that is news to me.
the benefit of E2EE is that you don’t have to trust the service not to log your communications. if the service is not trusted, then what reason is there to believe their claim that an app uses E2EE and is not backdoored? under what threat model would you not trust a service to scan your communications if they passed through their servers in plaintext, but you would trust them to faithfully provide binaries with E2EE that is not broken?
and yet you still think smartphones are more secure?
Why do you feel Sealed Sender is useless? This is actually one of my primary examples of Signal’s centralized model having advantages - they were able to simply eliminate a swath of metadata unilaterally.
I wrote a comment about it here.
I could easily be missing something; if you think I am I would love to hear it. I think the part about the VPN setup might actually be wrong and overly generous to Signal.
No, it’s not exactly the same. Screenshotting a Signal message and sharing that is a deliberate act. Replying or forwarding a decrypted PGP message in plaintext is done all the time by accident.
Quoting from the Latacora piece https://latacora.github.io/blog/2019/07/16/the-pgp-problem/#encrypting-email
it’s also perfectly feasible for a smartphone user to screenshot a message for reference and have it automatically uploaded to the cloud by their smartphone.
the evidence for the problem on the PGP side is “we don’t know a PGP email user who hasn’t seen this happen,” but in the smartphone case you wouldn’t see it happen which is even worse. we don’t have the data to say that it’s an issue in one case and not the other.
sure, you can take screenshots on a computer, but you are not encouraged to by a horrible touch-based UI, and configurations that automatically suck your files into the cloud are much less common.
Isn’t simply sending attachments in age format functionally equivalent to PGP email?
No. PGP email and S/MIME email clients attempt to solve problems that age does not attempt to solve. The interesting one, IMO, is associating public keys to recipient identities.
To use age for someone other than yourself, you need to get their key out-of-band and encrypt for it:
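Something like this, say (the public key shown is the example one from age’s own README):

  # the recipient generates a keypair and sends you the public half
  $ age-keygen -o key.txt
  Public key: age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p
  # you encrypt to that public key, having received it out-of-band
  $ age -r age1ql3z7hjy54pw3hyww5ayyfg7zqgvc7w3j2elw8zmrj2kg5sfn9aqmcac8p -o doc.txt.age doc.txt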
That’s part of what PGP and S/MIME clients are doing for you.
I don’t know what you mean by “out of band” for PGP — or rather, I don’t know what you would mean by “in band”. Practically speaking, whenever I’ve used PGP mail, I got the public key from a webpage or other “out of band” (?) source. The “web of trust” has never really worked in a general way. So there doesn’t seem like much of a difference, except there’s a local key store, and it would be pretty easy to leverage systemwide keychains now that OSes have them.
“In-band” vs “out-of-band” wasn’t quite the correct term. I mean that I could connect a PGP client or an S/MIME client up to a directory server of some sort, or that I could configure them to trust certain key signers, and they can encrypt messages for a recipient without me having to paste a key for each recipient in.
Whereas what’s being described for age means that I’d need to have each person’s key somewhere, paste it into the command, etc.
The email clients are trying (or did try) to solve this and age clients are not (yet).
That’s all I meant.
Understood. In an ideal world there is an integrated and secure experience between mail client, directory/keyserver, and local keychain. PGP clients do this in theory, but because there’s no trusted directory (hence soatok’s Public Key Directory project), the practice falls short of the theory.
In the spirit of age (just make the right choices and version the whole thing), perhaps there’s a combination of newer building blocks that would give a smooth experience with better security and practical interoperability. I mean, other than Signal.
From the article:
I dunno, seems like this is something the author is working on solving to me?
It sounds like something they “plan” to work on. It doesn’t sound like something I can tell someone to use now instead of PGP. Which is what the title of the article promised.
Right, but this isn’t a common use-case.
Most people, 99% of the time, just care about things like backups, sending files, or signing things. They don’t care about binding the trust relationship between a public key and an “identity” (term left vague due to different social interpretations of the word).
The article delivers on the things people care about, and says “I’m working on it” for the nerd use cases.
Every time I see something written by someone in or adjacent to the cryptography community, I can’t help but wonder why these people are so toxic. What is the point of writing something like the below?
It’s a shame, because the topic seems very interesting, but the community’s insistence on coming across as berating, arrogant know-it-alls is really off-putting for a beginner.
Right? Like, “can you imagine? they use Matrix? *scoffs*”. Who needs this?
Reading the linked articles for the text you excerpted, it seems reasonable that they would hold disdain for those tools.
It’s frankly weird that tech people interpret criticism of a communications tool as throwing shade at the users of said tool. Nobody is being toxic to you.
I’m sure there’s some deeply interesting takeaways here, but I’m not a psych major.
but:
If you look at the author’s recent writings, it’s pretty clear to me that he’s discussing evangelists not regular users.
that’s a stretch but it’s beside the point anyway. evangelists are users, not tools. at least not literally.
Pretty sure we could leave this sentence out.
Isn’t that the discipline concerned with studying human behavior?
People speak strongly about these things because there are places in the world where listening to the wrong advice can get someone tortured to death by the secret police.
In such countries you can be punished just for “doing cryptography” or “having the tool”, or simply without any real reason.
That’s not a reason to use insecure tools.
This is not the threat model of most of the people who speak this way.
I suspect that it only feels harsh to non-experts because we don’t know enough to judge whether someone’s harshness is justified.
An astrophysicist being frustrated at flat-earthers is not “toxic” (well, unless you are one of the people who thinks it’s never okay to show any emotion or else your argument is invalid).
[Comment removed by author]
Eh? The horrible UX of PGP is a pretty big reason why people like Soatok rail against it.
And this is just my opinion, but Matrix has horrendous UX. Somehow worse than Discord, I cannot comprehend it. Matrix’s encrypted stuff especially loves to break, which is a bad user experience.
Matrix & most XMPP clients tend to drop once-trusted keys when a user hasn’t used a client in a while, & it is a royal PITA to get the other side to re-trust an expired key. They use the same double-ratchet encryption as Signal, but you are allowed to host your own decentralized server & use multiple non-standard clients. Part of the UX issue with Matrix, though, is Element’s use of web tech for its chat (alternatives have nowhere near the features, fallbacks suck, & users don’t seem to understand non-Element clients might exist), but also waiting, & waiting, for servers to ‘sync’ if you don’t keep a client perpetually open, & this is by design in the protocol (apparently you don’t just need a few of the latest messages to get back into a conversation; instead you need an entire clone of the history on your machine or it won’t operate).
Probably it won’t. Now we can try to understand why, before creating yet another tool/system to “kill” PGP.
I’m not sure I would say that PGP/GnuPG was ever very widespread, and it certainly hasn’t been gaining traction for the past 15 (20?) years or so. In 2024, I won’t hesitate to say that GnuPG is basically dead.
I have a lot of respect for what Phil Zimmermann did in the 90s, but that was a long time ago – cryptography has made huge leaps since, especially when we talk about applied cryptography. GnuPG is simply not very relevant these days. We have better tools for the problems that GnuPG attempted to solve. The exception being an e-mail-like system with good security properties – which is also discussed in the article.
If PGP were effectively dead, I guess we wouldn’t read articles like this every couple of months.
Yes, the article mentions 9 of them.
I’m not an expert in this space, but my interpretation of this fact is not that “it takes nine separate tools to equal what GnuPG can do,” but rather “GnuPG performs nine different functions that—with the benefit of hindsight—did not need to be mashed together into one tool.”
Yours is the correct interpretation.
Yes, and I agree with the interpretation. That’s the point that everybody has been making during all this time.
However, the initial question was:
And my observation was that we need to try to understand why it hasn’t.
The point being, we need to understand why all these features, “wrongly” mashed together into one tool, are still being widely used when theoretically better alternatives focused on a single problem exist, even after considerable effort to promote them and “educate” the public.
If we figure that out, we can probably move forward.
One explanation, covered in the original Latacora article, is that PGP “is also a social network, and a subculture.”
The real-world security of PGP has not been publicly tested, because it’s not very widely used, the people using it are mostly people not targeted by law enforcement, and its UX is not amenable to easy use by the people who are targeted by law enforcement in the developed world, namely criminals.
It’s interesting to me that they specifically call out SSH signatures as being a good choice for git commits, but then they point to “sigstore” (which isn’t packaged in apt) for signing software you distribute.
I am 100% on board with PGP being hot garbage, and I’d love to switch away from it, but I’m not going to switch to an alternative that’s not in apt. Signing your software with SSH keys is pretty easy to do, and it’s also really easy to distribute your SSH keys across multiple code hosts to ensure availability. Last I checked, the main downside is that verifying a signature generated by ssh-keygen kinda sucks; you have to construct an allowed_signers file and run a cumbersome command. Is this shitty PGP-like usability the main reason SSH isn’t recommended for this?
In theory, a lot of these tools all support the same Ed25519 keys, right? Could you generate signatures using your SSH key and verify them with minisign or something?
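For anyone who hasn’t seen it, the flow looks roughly like this (paths, filenames, and the email used as an identity are all hypothetical):

  # sign a file with your ssh key; this writes release.tar.gz.sig
  $ ssh-keygen -Y sign -f ~/.ssh/id_ed25519 -n file release.tar.gz
  # the verifier builds an allowed_signers file mapping an identity to a key
  $ echo "dev@example.com $(cat id_ed25519.pub)" > allowed_signers
  # then verifies the detached signature against that file
  $ ssh-keygen -Y verify -f allowed_signers -I dev@example.com -n file \
      -s release.tar.gz.sig < release.tar.gz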
It seems like this is possible. I was able to unpack SSH public & private ed25519 key files, and an SSH signature file, and reproduce both the signed content and verification in nacl in python. (Rust code to unpack these SSH files ended up in bitbottle if you like sample code.)
This is a misconception. The author seems to imply that only “extremely technical computer users” use e-mail, or at least that only “extremely technical computer users” would want to encrypt their e-mail, because if they weren’t “extremely technical”, they would be using Signal. There are absolutely valid reasons to not be “extremely technical” and still write an e-mail, probably the prime one being a preference for asynchronous communication over synchronous communication, which has nothing to do with being “extremely technical” or not. Otherwise, writing paper letters via snail mail would be something only “extremely technical computer users” do. In business communication, e-mail is also still relevant.
These days, e-mail goes mostly unencrypted (ignoring transport encryption for the moment), but that was the case with instant messaging as well before WhatsApp started to encrypt everything by default. If the author’s logic were correct, only “extremely technical computer users” would have wanted to encrypt their WhatsApp messages, and by that measure, WhatsApp should never have introduced end-to-end encryption.
Bottom line: Synchronous and asynchronous communication are both valid approaches to human messaging in their own right, cannot be substituted with one another, and this has nothing to do with “extremely technical computer users“.
Everyone is free to use email. No-one should be under the impression that email can be secured in a fool-proof way with encryption.
https://www.latacora.com/blog/2020/02/19/stop-using-encrypted/
I mean, I agree with that article if we are talking about long-term private communication. As noted before, for that we have much better channels. However, for “short-lived private communication that will be public soon enough anyway”, aka security reports and CVEs that get published once a patch is available, email is still one of the easiest and most available solutions. It doesn’t require creating an account anywhere (you probably already have an email address), it is widely supported, it is accessible to anyone, it can be disposable, and so on.
I’d say that’s a misinterpretation of the position taken. It’s not about “want”, it’s about what exists. The position is that however much people might want it, there is no good way of doing it, and no path to a good way of doing it, and thus it can’t be recommended as something you do. And even if people would prefer secure email if it existed, because it doesn’t, they should pick something else if the goal is security, even if that involves going against a preference (like having to use chat-like systems).
I think the position was not that it’s impossible, just that it doesn’t exist because cryptographers are focused on the needs of poor people rather than users whom their employers can derive revenue from, and therefore they haven’t gotten around to implementing a good solution yet.
Signal as a recommended replacement for email (or xmpp, or matrix, or whatever other messaging system) always baffles me. Signal has an abysmal UX, especially on desktop, uniquely so among its competitors.
What’s bad about the Signal desktop UX? I like it myself; I’m wondering what you’re looking for that I’m evidently not. (My only complaint is that it makes it hard to copy someone else’s emoji reaction.)
To be honest, I usually like posts from this author. And this content lists a few interesting pieces of software that I’ve never heard of.
But the complete dismissal of encrypted emails by the author with (I’m paraphrasing) “Nobody needs that, see the XY problem. Only a few stubborn nerds want encrypted emails”, makes them almost sound like an arrogant jerk… :(
Add to this the strong recommendation of the highly objectionable Signal, which used to be (still is?) an alleged cryptocurrency pump-and-dump, and the author’s credibility has now taken a hit in my eyes…
I find it disappointing, I would have loved a proposed alternative for secure, asynchronous, archive-able, federated, long-form communication. Not “just use signal, bro…”
Yes, you are.
Since “age” has been around for only a short while, how can the author know that it isn’t subject to some issues that PGP, by design, is not?
The author is a cryptographer. What issues are you thinking of that PGP is designed to avoid that age does not? I haven’t heard of any in the 5+ years that age has been around. (I think your comment is bad-faith FUD.)
The author is a cryptographer? Alright. Isn’t everyone who wrote ANY more or less popular encryption tool or software that has to do with cryptography a cryptographer as well? Nonetheless, bugs in them have been getting discovered years and decades afterwards.
I’ve been playing with age and passage for a couple of hours tonight, and then discovered that I can actually use my ssh ed25519 keypair with age as well. Also, given that these keys are usually published at github.com/[username].keys, there is already a directory of them for many of the users I’d like to interact with :).
So far I’ve been generating temporary keys for testing, but I now have keys for most of my colleagues, and can send encrypted files to them using age/passage.
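e.g., with a made-up username:

  # fetch a colleague's published ssh keys and encrypt a file to them
  $ curl -s https://github.com/alice.keys > alice.keys
  $ age -R alice.keys -o notes.txt.age notes.txt
  # on their end, the matching ssh private key decrypts it
  $ age -d -i ~/.ssh/id_ed25519 notes.txt.age > notes.txt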
Would be nice if age would also talk to my ssh-agent, so I don’t need to re-enter my passphrase a million times, but hopefully something like that will be built and merged soon.
Sadly the ssh-agent protocol only supports signing, not decryption, so age cannot use it. I get the impression that if you want low-interaction decryption, age requires a plugin. (Most of the difficult parts of encrypting and decrypting files, namely anything to do with managing keys, are not in scope for age: age does not do safe storage of and controlled access to private keys, and age does not do authentication of public keys.)
The ssh-agent protocol is annoyingly restricted for good reasons: it only supports authentication, i.e. signing, i.e. proof of possession of the private key. And it is a bad idea to re-use a key between signing and decryption, because if you aren’t careful an attacker can fool you or your software into decrypting something when you thought you were signing something else. This is why modern cryptographic protocols include context strings, as protection against confused deputies. (There’s also some cryptographic judo involved in adapting an ed25519 signing key to an x25519 key for key exchange and subsequently encryption.)
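ssh-keygen’s signature namespaces are a small concrete example of those context strings: the namespace passed to -n is bound into the signature, so a signature made for one purpose won’t verify under another. A hypothetical session (filenames made up; assumes a signature created earlier with -n file):

  # the signature was created with: ssh-keygen -Y sign ... -n file
  # verifying under a different namespace should fail, because the
  # namespace is part of what was signed
  $ ssh-keygen -Y verify -f allowed_signers -I dev@example.com -n email \
      -s release.tar.gz.sig < release.tar.gz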
By contrast, the gpg-agent protocol is weirdly trusting. Whereas the ssh-agent holds your keys and acts as a signing oracle (like an HSM, but in software), the gpg-agent stores your passphrase and hands it out to any software that asks. It also, weirdly, provides secure random numbers. When I have worked with gpg, I thought again and again that it drew its API boundaries in awkward places that made things more difficult than they should be.
So there’s always a better tool for the job, except when there isn’t. Also, cryptographers are commonly focused on the privacy of people in poor countries. You heard it here first!
Have you never read the Signal blog?
are you directing my attention to something in particular?
Yes, that you certainly haven’t “heard it here first” because the attitude expressed here is a common refrain from the Signal blog.
my mistake heh heh
glad to see this. more awareness about the pgp issue even if it’s just mainly “here’s the latacora article again” is sorely needed. switching to the alternatives that applied to my use cases (ssh for commit signing, age for file encryption) has made it so much nicer for me
It seems like every post I see from this person makes me respect them less. They seem to have no care for what people want or need or what any actual problems or solutions might be and instead just like to list their favourite tech and say “use this”.
Soatok’s niche is applied cryptography. They are going to have a lot of opinions on tech tools because that is where cryptography gets applied.
I was looking forward to a discussion about S/MIME vs PGP and also a discussion about signing email, but the author did not discuss that.
I wish the security industry would stop holding onto sacred cows, and stop having so many of them.
Choosing security tools is a lot about trust and it’s hard to trust new things, especially when the market is so saturated with garbage nowadays.
Aside from everyone becoming security and cryptography experts and auditing every line of security-sensitive code we use, the best we can do is find people we trust who do that kind of thing.
TIL about Proton’s (go)penpgp library (previously).
More re: Proton’s PGP crypto refresh.
I’m kinda unconvinced one way or the other: they clearly dislike PGP, yet basically admit it works fine. So …
This (and the linked PGP-problem article) sounds like the moving-the-goalposts fallacy. Of course, if you redefine the goals, you get different solutions. A different answer to a different question.
One thing to note, too: while this article makes no argument against OpenPGP directly, the article it links to is mostly about issues with v4 that are fixed in v6.
Is it practical to run OpenPGP in a “v6-only mode” that does not expose you to any of the v4 faults? I’m under the impression that a large part of the problem with PGP is due to its emphasis on backward compatibility. Coming out with new and better crypto is great, but it may not be much practical help if your tools all still support the old stuff.
Very much depends on the tool, of course. But if you’re building something new, I’d say an OpenPGP v6-only subset is a great choice that will still have broader compatibility than the various single-implementation options.
What’s the status of the disagreement between GnuPG and OpenPGP? It sounded like there was going to be an interop failure which would make it difficult to reliably use better ciphers.
Sequoia has developed a drop-in replacement for gpg, so you can keep using software that relies on it while complying with the standard.