1. 14

First line of the article: “TL;DR: No, it isn’t. If that’s all you wanted to know, you can stop reading.”

  2. 5

    Detailed and well-written explanation. It really irritates me that ignorant people can throw out FUD comments like the tweet this blog post responds to, influence the debate, and then force others to spend disproportionate effort (i.e., the research behind this post) disputing them.

    1. 4

      This article is weak in so many different ways. Let’s just go through a few points that jump out.

      “This leaves us with basically one option: a user password. ”

      “So Apple finds itself in a situation where they can’t trust the user to pick a strong password.”

      There have been many strategies with various tradeoffs; giving users the password isn’t the only one. Let’s stick with that one, though, since I can already see the objections coming: people are too lazy, they’ll lose other stuff, etc. I know Apple will say that at least, although they likely want this centralized and push-button for reasons that aren’t about privacy.
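      To make the weak-password concern concrete, here is a minimal Python sketch (my own illustration, not Apple’s actual scheme) of deriving an escrow key from a user passcode with PBKDF2. With a low-entropy passcode like a 4-digit PIN, offline brute force is trivial no matter how good the KDF is, which is why the whole design hinges on the HSM’s guess counter.

```python
import hashlib
import itertools
import os
import string

# Toy illustration, not Apple's real scheme: derive an escrow encryption
# key from a user passcode with PBKDF2. The iteration count is kept small
# so the demo runs fast; real deployments use far more.
def derive_key(passcode: str, salt: bytes, iterations: int = 1000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)
key = derive_key("1234", salt)  # a 4-digit PIN: only 10,000 possibilities

# Offline brute force over the whole PIN space recovers the passcode
# quickly. Only an online guess limiter changes this picture.
recovered = next(
    guess
    for guess in ("".join(d) for d in itertools.product(string.digits, repeat=4))
    if derive_key(guess, salt) == key
)
assert recovered == "1234"
```

      The point: when the key material is derived from a guessable secret, security reduces entirely to rate limiting, which is exactly the job Apple assigns to the HSM’s attempt counter.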

      “Rather than trusting Apple, your phone encrypts its secrets under a hardcoded 2048-bit RSA public key that belongs to Apple’s HSM. ”

      Rather than trusting Apple, you trust all the developers, admins, hardware engineers, suppliers, etc. involved in the HSM. Unlike most products, the nature of this one and the small number of suppliers mean high-strength attackers will all consider targeting them or their employees. The theory is that there’s more scrutiny on them and more incentive for the companies to play honest. The truth was shown with Crypto AG, RSA, U.S. telecoms, and others: many will risk their entire business for short-term earnings (bribes) if they think the chances of getting caught are low. Double that if they think it helps them dominate markets long-term, with PR teams handling the bad press.

      “your phone encrypts its secrets under a hardcoded 2048-bit RSA public key that belongs to Apple’s HSM”

      “Critically, only the HSM has a copy of the corresponding RSA decryption key,”

      My studies of HSMs showed that many could import or export private keys, which is especially useful for backup and recovery. It also means the keys could be shared with parties doing surveillance. If that’s possible here, how do you know your password only went to an HSM when it was encrypted with those keys?
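      Here is a toy model of that export-policy issue (assumed semantics, not any vendor’s real API): if a key object is created extractable, the way PKCS#11’s CKA_EXTRACTABLE attribute allows, an administrator can pull it out for “backup”, and the claim that only the HSM holds the key becomes an operational promise rather than a cryptographic fact.

```python
import os
from dataclasses import dataclass

# Toy model of an HSM key store, illustrating the export-policy concern.
# This is not a real HSM API; real devices express the policy with
# attributes such as PKCS#11's CKA_EXTRACTABLE.
@dataclass
class HsmKey:
    key_bytes: bytes
    extractable: bool  # fixed at creation; strict modes forbid True

class ToyHsm:
    def __init__(self) -> None:
        self._keys = {}

    def generate(self, label: str, extractable: bool) -> None:
        self._keys[label] = HsmKey(os.urandom(32), extractable)

    def export_key(self, label: str) -> bytes:
        # A real HSM would wrap the key rather than return it raw, but an
        # extractable key still leaves the device either way.
        k = self._keys[label]
        if not k.extractable:
            raise PermissionError("key is non-extractable")
        return k.key_bytes

hsm = ToyHsm()
hsm.generate("escrow", extractable=False)
try:
    hsm.export_key("escrow")
    exported = True
except PermissionError:
    exported = False
assert exported is False  # non-extractable keys stay inside the device
```

      If Apple’s configuration marks the escrow keys non-extractable, as the strictest FIPS 140-2 modes require, the backup path disappears; the question is whether outsiders can verify that setting.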

      “provided that the HSM works as designed”

      HSMs, especially their supporting software, have all kinds of problems. People who work on them tell me about them, and they all say they rarely talk publicly due to the threat of losing their jobs or getting sued. I won’t post further on this point except to say it’s a matter of faith until proven with the kind of rigorous evaluation only a few products have received. Like any security claim.

      “This rules out both malicious insiders and government access”

      No it doesn’t. See “rather than trusting Apple” above. The government’s own standards, EAL6-7 and the NISPOM, show in theory, and in practice with prior evaluations, that it takes a lot to stop all these threats most of the time. Not everything or always, just mostly. There’s no way these money-grubbing companies did all that; they certainly didn’t claim it in evaluation. I also didn’t see any Snowden slides where the U.S. government griped about targets using HSMs, despite a bunch of targets doing so. That might be due to pervasive insecurity elsewhere: they just hit something else. Yet I’m still concerned, given that most HSMs come from defense contractors that only care about the bottom line, in jurisdictions known to bribe or secretly compel parties for surveillance boosts. Assume they have your stuff if you rely solely on an HSM to protect it.

      “You see, the HSMs Apple uses are programmable. ”

      Apple says they’re worried about high-strength attackers. They take a stand on privacy. They can choose between non-programmable and programmable solutions; one dramatically increases the risk of subversion and hacking, and they go with that one. Now he has to explain why it’s still trustworthy.

      “programming the HSM to output decrypted escrow keys.”

      “Or disabling the maximum login attempt counting mechanism. Or even inserting a program that runs a brute-force dictionary attack on the HSM itself.”

      Wait, didn’t I guess that escrow one immediately? It’s not a problem, he (they) will tell us. Let’s see why.
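      For readers unfamiliar with the mechanism being quoted, here is a minimal sketch (assumed behavior, not Apple’s code) of an HSM-side attempt counter that wipes the escrow record after too many wrong guesses. “Disabling the maximum login attempt counting mechanism” means reprogramming away exactly this check.

```python
# Toy sketch of an HSM-side guess counter: after a fixed number of wrong
# passcode guesses, the escrow record is wiped irreversibly. The counts
# and behavior here are assumptions for illustration.
class EscrowRecord:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str, secret: bytes):
        self._passcode = passcode
        self._secret = secret
        self._failures = 0

    def try_unlock(self, guess: str):
        if self._secret is None:
            raise RuntimeError("record wiped")
        if guess == self._passcode:
            self._failures = 0
            return self._secret
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._secret = None  # irreversible wipe
        return None

rec = EscrowRecord("7391", b"escrowed key material")
for _ in range(10):
    rec.try_unlock("0000")       # ten wrong guesses...
try:
    rec.try_unlock("7391")       # ...and even the right one is too late
    wiped = False
except RuntimeError:
    wiped = True
assert wiped
```

      Reprogram the device so `_failures` never increments, and the same record becomes brute-forceable at leisure, which is the attack the article concedes is possible for whoever holds the signing cards.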

      “Note that on HSMs like the one Apple is using, the code signing keys live on a special set of admin smartcards.”

      What about the code they signed? Did they publish it? If not, we’re just trusting that their developers are virtuous people working for a virtuous company. Next concern.

      “We run the cards through a blender.”

      This will not comfort anyone who’s read Anderson’s Security Engineering on the difficulty of destroying nuclear secrets, or who has heard of data-recovery services pulling evidence off shredded or burned disks. At least they were smartcards (see the escrow and organizational risks above). So, if they were secure smartcards (oh no) and not subverted, their design will probably prevent recovery. There are a lot of smartcards that aren’t secure, though. Still, recovery is probably hard if they’re ground up. The cards’ contents were probably destroyed, they assure me. Moving on.

      “Pretty much the only action Apple can take is to wipe the HSM”

      Assuming they didn’t export the keys on the HSM or smartcards. Assuming the software they loaded doesn’t have an exploitable backdoor, covert channel, etc. Assuming they aren’t cooperating with anyone possessing vulnerabilities in any of the products. Assuming they can’t contract someone to beat the HSM’s tamper mechanisms, as researchers have done for many devices (especially smartcards). Assuming a classic EMSEC attack isn’t possible to pick up the internal secrets, like the NSA, Russia, Israel, etc. have been doing for decades now. If those assumptions hold, some weaker than others, then his statement is true.

      “The downside for Apple, of course, is that there had better not be a bug in any of their programming. ”

      Look at Apple’s security track record as you assess that. Come to your own conclusions.

      “To be sure, Apple’s reliance on a Hardware Security Module indicates a great deal of faith in a single hardware/software solution for storing many keys. Only time will tell if that faith is really justified. ”

      There are plenty of people I hacked who never knew it happened. There are tons of government secrets I’ve tried to figure out but still don’t know. Unless a Snowden 2.0 happens, you may never know that operators in a Special Access Program obtained and analyzed your secrets. You almost didn’t know about what has been published: hundreds to thousands of people kept it secret, with only one man deciding to publicize it. Other leaks were extremely limited in comparison. The claim that government can’t keep secrets long enough to be effective is a lie. Highly-vetted people at the TS/SCI level not telling secrets to the media, with some of them subverting INFOSEC tech, is the norm. Snowdens are the exceptions.

      “But the argument that Apple has enabled a law enforcement backdoor seems to miss what Apple has actually done”

      The argument is partly sound, because Apple just increased the risk for their users in a way that might facilitate government eavesdropping. Nobody should argue that they did it intentionally for that purpose, or that law enforcement is using it; we have no evidence of that. We do know a ton of keys are now concentrated in one IT solution. INFOSEC history teaches us that bad things usually happen in such situations. Even lay people know this; there’s a saying about baskets.

      “Apple has devoted enormous resources to locking themselves out. ”

      “Enormous resources” is an enormous overstatement; it’s the equivalent of a few IT projects. It’s a better step than many others took, though. I also trust the HSM’s mechanisms more than I trust Apple’s developers. It might also filter out many types of attackers because of the difficulty or cost of breaking them. Those are its good points.

      “That’s radically different from what would be required to build a mandatory key escrow system for law enforcement.”

      No, this is exactly one of the things discussed under the Clinton Administration. The declassified CIA report below (on the 1996 policy) indicates that the main proposal was for a third party to hold, store, and share keys with LEOs. One option was having the companies do it themselves. The standard way to do that in the high-security commercial sector is HSMs; smartcards too, these days. Keys are often moved for disaster planning. So what Apple is doing, and the prime risk, is as old as the Crypto Wars. Actually older, if you count military COMSEC gear with similar practices. Apple’s proposal is essentially one of the government’s, but with the admin keys destroyed. ;)

      https://www.cia.gov/library/readingroom/docs/DOC_0006231614.pdf
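      For context on what “a third party holds the keys” meant in those proposals: the classic construction splits the key among escrow agents so that recovery needs a quorum. A short Shamir secret sharing sketch (illustrative only, not Apple’s design):

```python
import random

# Illustration of split key escrow via Shamir secret sharing: no single
# escrow agent can recover the key, but any k of n together can. The
# parameters and key value below are made up for the demo.
P = 2**127 - 1  # prime modulus; all polynomial arithmetic is mod P

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares, any k of which recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation of the polynomial at x = 0."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = 0xC0FFEE
shares = split(key, n=5, k=3)
assert recover(shares[:3]) == key   # any 3 of the 5 shares suffice
assert recover(shares[2:]) == key
```

      The Clipper-era plans distributed shares like these to escrow agents; Apple’s variant keeps everything in one device and destroys the admin capability instead.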

      “That’s a good question. Maybe you should ask them? ”

      Risk of loss if their crypto screws up, and (most important) the extra costs. These are the same reasons many companies avoid such things when they know about them; the same goes for encrypted email despite usable solutions. I knew of a case where one person forgot their password, which led the company to ditch the whole thing for fear that more critical info would disappear. People’s photos, music collections, etc. are critical to them, and Apple probably also values integrity and availability over confidentiality. I encourage more development of near-zero-cost, high-availability solutions to these problems so mature solutions eventually get uptake. Quite a few academics and small businesses are doing good work here.

      1. 3

        Reply for EDIT:

        Thoth, who was knee-deep in high-security HSMs, gave a quick review of my post on Schneier’s blog. He points out that FIPS 140-2 Level 3 and up are probably immune to the escrow risk on Apple’s end, with two of those levels not allowing plaintext key export even for backups. That’s nice, and it clears up the smartcard PIN angle a bit. He likes the Apple solution if it uses one of those in strict mode.

        https://www.schneier.com/blog/archives/2016/08/friday_squid_bl_539.html#c6731270

        1. 3

          But does the possibility that, e.g., somebody miraculously reconstitutes blended smart cards into private keys add up to “yes, Apple built a backdoor”?

          1. 2

            No. Hence the other stuff in my post about vulnerabilities, malicious admins, key imports, subversion of HSMs by nation-states in the threat model, and so on. ;)