1. 62
  1.  

  2. 15

    It’s a clickbait title, but there’s a real cryptographic bug in there: hashing the phone numbers for privacy accomplishes nothing, because such hashes are vulnerable to a dictionary attack. So effectively, AirDrop and Wi-Fi password sharing are broadcasting the user’s phone number.
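    To make the dictionary attack concrete, a minimal sketch (illustrative only, not Apple’s actual wire format; it assumes Dutch mobile numbers of the form +316 plus 8 digits): capture a hash, then hash every candidate number until one matches.

    ```java
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.HexFormat;

    // Sketch of the dictionary attack: the phone-number keyspace is so small that
    // an observer can hash every candidate and compare it to a captured hash.
    // (HexFormat requires Java 17+.)
    public class PhoneHashDictionary {
        static String sha256(String s) throws Exception {
            byte[] d = MessageDigest.getInstance("SHA-256")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(d);
        }

        public static void main(String[] args) throws Exception {
            String captured = sha256("+31612345678"); // hash seen over the air (hypothetical format)
            for (long n = 0; n < 100_000_000L; n++) {  // every Dutch mobile number: +316 + 8 digits
                String candidate = String.format("+316%08d", n);
                if (sha256(candidate).equals(captured)) {
                    System.out.println("Recovered: " + candidate);
                    break;
                }
            }
        }
    }
    ```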

    1. 2

      I think the most serious one there is getting Wifi passwords from iPhones by spoofing a friend.

      1. 5

        > I think the most serious one there is getting Wifi passwords from iPhones by spoofing a friend.

        No, the phone numbers are the worse one. If you are sharing Wi-Fi passwords with your friends, you know when you are doing it and you will probably click no when you are not. Furthermore, impersonating a friend requires a lot of reconnaissance.

        The phone numbers, however, are sent out continuously, which means that every tracking company in stores or subways will set up a SHA256(phonenumber):phonenumber database.

        This differs from the previous situation, where they only had MAC addresses, in that they now know with high probability that this MAC address belongs to that phone number, which can be linked to any other data the user enters somewhere else.

        1. 5

          Update: As a test, I’ve written a quick and dirty Java program which calculates the SHA256 hash of every cell-phone number in the Netherlands, simply by running through all 8-digit suffixes of the Dutch mobile numbers. The resulting CSV file is 7.3 GB in size, and it took my personal laptop about 162 seconds to generate. A simple grep search takes about 17 seconds. Note that I haven’t optimized the code at all and that my laptop runs an i7-6500U (@2.5 GHz). This means that my private laptop can run through the SHA256 hashes of virtually all phone numbers in the world in about 4 days, and that the resulting database file would only be about 700 GB in size. More beefy machines will certainly be faster.
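          A stripped-down sketch of what the generator boils down to (illustrative, not the exact code; it assumes Dutch mobile numbers of the form 06 followed by 8 digits):

          ```java
          import java.io.BufferedWriter;
          import java.io.FileWriter;
          import java.nio.charset.StandardCharsets;
          import java.security.MessageDigest;
          import java.util.HexFormat;

          // Sketch: write "sha256(number),number" for every Dutch mobile number (06 + 8 digits).
          // Unoptimized and single-threaded; the point is only that the keyspace is tiny.
          public class PhoneTableGenerator {
              public static void main(String[] args) throws Exception {
                  MessageDigest md = MessageDigest.getInstance("SHA-256");
                  try (BufferedWriter out = new BufferedWriter(new FileWriter("nl_numbers.csv"))) {
                      for (long n = 0; n < 100_000_000L; n++) {
                          String number = String.format("06%08d", n);
                          byte[] hash = md.digest(number.getBytes(StandardCharsets.UTF_8));
                          out.write(HexFormat.of().formatHex(hash));
                          out.write(',');
                          out.write(number);
                          out.newLine();
                      }
                  }
              }
          }
          ```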

          This should be the final nail in the coffin for everyone who thinks that hashing a phone number is adequate for privacy protection, or that sending out a hashed version of the phone number is acceptable. It simply isn’t, and the whole world is capable of tracking individuals based on this. It doesn’t matter which hash function you use; there simply aren’t enough phone numbers.

          1. 2

            I put Troy Hunt’s Pwned Passwords dataset (500 million SHA1 hashes) into SQLite, and by storing them as primary keys, lookups only take 0–1 milliseconds. Presumably you’d get similar results with however many SHA256 hashes. :-)
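            A rough sketch of the idea (using the org.xerial sqlite-jdbc driver; illustrative schema, not the exact one, and the same shape works for a hash-to-phone-number table):

            ```java
            import java.sql.Connection;
            import java.sql.DriverManager;
            import java.sql.PreparedStatement;
            import java.sql.ResultSet;

            // Sketch: store each hash as the PRIMARY KEY so a lookup is a single index probe.
            // Assumes the org.xerial:sqlite-jdbc driver is on the classpath; the schema is illustrative.
            public class HashLookup {
                public static void main(String[] args) throws Exception {
                    try (Connection db = DriverManager.getConnection("jdbc:sqlite:hashes.db")) {
                        db.createStatement().execute(
                            "CREATE TABLE IF NOT EXISTS hashes (hash TEXT PRIMARY KEY, phone TEXT)");
                        try (PreparedStatement q = db.prepareStatement(
                                "SELECT phone FROM hashes WHERE hash = ?")) {
                            q.setString(1, args[0]); // hex hash captured over the air
                            try (ResultSet r = q.executeQuery()) {
                                System.out.println(r.next() ? r.getString(1) : "no match");
                            }
                        }
                    }
                }
            }
            ```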

            1. 2

              > Presumably you’d get similar results with however many SHA256 hashes. :-)

              Nice!

              I’ve imported the phone numbers into SQLite, but importing and indexing them took much longer than the actual generation. :-) Lookups are also instantaneous, which is as expected.

            2. 1

              Switching to something like scrypt should solve this issue, right?

              1. 1

                If you used a persistent salt per-device, yeah.

                1. 1

                  Yes, this would solve the issue for just about any hash function. But then you don’t need the phone number in there at all. If you go down this path you are essentially using device IDs or something similar, but then you might just as well use some random ID that you refresh after a certain period.
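                  To make that concrete, a sketch (hypothetical, not Apple’s protocol): once a device-local secret is in the mix, the broadcast value is unlinkable to the phone number anyway, so it is really just a device identifier, and a plain random, periodically rotated token gives you the same thing without the number.

                  ```java
                  import java.nio.charset.StandardCharsets;
                  import java.security.SecureRandom;
                  import javax.crypto.Mac;
                  import javax.crypto.spec.SecretKeySpec;

                  // Hypothetical sketch, not Apple's protocol.
                  public class BroadcastToken {
                      // Variant A: keyed hash of the phone number with a per-device secret.
                      // Unlinkable to the number by outsiders, but then it is just a device ID.
                      static byte[] saltedToken(byte[] deviceSecret, String phoneNumber) throws Exception {
                          Mac mac = Mac.getInstance("HmacSHA256");
                          mac.init(new SecretKeySpec(deviceSecret, "HmacSHA256"));
                          return mac.doFinal(phoneNumber.getBytes(StandardCharsets.UTF_8));
                      }

                      // Variant B: a random token, rotated periodically, achieves the same
                      // unlinkability without involving the phone number at all.
                      static byte[] randomToken() {
                          byte[] token = new byte[16];
                          new SecureRandom().nextBytes(token);
                          return token;
                      }
                  }
                  ```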

                2. 1

                  No, switching to a different hash function won’t solve this issue. The fact that you are using phone numbers is the cause of this problem and it does not matter which hash function you use.

                  I’ve demonstrated this with SHA256 because that is what Apple uses. You will get similar results with SHA512, scrypt, Whirlpool or whatever other fancy hash function you want to use.

                  1. 3

                    Scrypt uses salting, so precomputing one set of hashes won’t work. Furthermore, you can select scrypt parameters so that computing one hash takes ~1 s on modern (iOS) devices. Shouldn’t this be good enough to prevent this kind of attack?

                    1. 1

                      Well, even if a scrypt hash takes about one second, there is nothing stopping me from running it 10 million times. Sure, it would take me 115 days to build the table on a single core, but it would take one day on 115 cores for an entire country. Given that I already have about 40 cores at my disposal at home, using scrypt would only be a means of delaying the inevitable and is therefore kind of useless in this use case.
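                      The back-of-the-envelope version of that estimate (assuming ~10 million numbers and scrypt tuned to ~1 s per hash):

                      ```java
                      // Rough cost of precomputing scrypt over one country's phone numbers,
                      // assuming ~10 million numbers and parameters tuned to ~1 s per hash.
                      public class ScryptCost {
                          public static void main(String[] args) {
                              long numbers = 10_000_000L;
                              double secondsPerHash = 1.0;
                              for (int cores : new int[] {1, 115}) {
                                  double days = numbers * secondsPerHash / cores / 86_400.0;
                                  System.out.printf("%d core(s): ~%.0f days%n", cores, days);
                              }
                          }
                      }
                      ```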

                      The only way to solve this is by using a device-specific seed, but if you have something like that, then why would you even put the phone number in there or bother with scrypt at all?

                      I’ll say it again: the phone numbers are the problem, and as long as you are using those, the hash function simply won’t matter.

        2. 9

          The title is a huge stretch based on the content. Still an interesting post.

          I’m not worried about having my phone number lifted by passers-by; I’m sure even a mildly dedicated person could recover my phone number with only my full name. And that doesn’t really bother me.

          I’m more curious if these BLE pings could be used to reliably track people en masse. As an uninteresting individual, I’m much more worried about abuses of mass surveillance than targeted attacks.

          1. 4

            > I’m more curious if these BLE pings could be used to reliably track people en masse.

            Yeah, probably: https://www.theverge.com/2019/6/3/18647146/apple-find-my-app-tracker-friends-iphone-wwdc-2019

            > Apple is going to help people track their things and their loved ones with a new macOS and iOS app called Find My. At its annual Worldwide Developers Conference today, the company announced that the new app will combine Find My Friends and Find My iPhone. The idea is that Find My will be a single place to track everything, including people and Macs. It’ll be available on both iOS and macOS.

            > The tracking works even if a device isn’t connected to Wi-Fi or a network, the company says. Macs will send a secure Bluetooth signal occasionally, which will then be used to create a mesh network of other Apple devices, so people can track their products. A map will populate of where the device is located.

            1. 1

              Whether it’s actually private in practice or not is up for debate (and will be seen when iOS 13 enters widespread use), but Apple certainly didn’t ignore the privacy concerns when it came to the Find My implementation: https://www.wired.com/story/apple-find-my-cryptography-bluetooth/

            2. 3

              You might care more if your phone number got ported out and then used to reset all your online service account passwords, which might then be held for ransom or otherwise cause you to have a bad day. It happens.

              1. 5

                With my provider my phone number can’t be ported unless I’m physically present with photo ID. Again, since my phone number is trivially available, I don’t consider its secrecy valuable for security.

                1. 6

                  > With my provider my phone number can’t be ported unless I’m physically present with photo ID.

                  You put way too much faith in your provider.

                  https://www.youtube.com/watch?v=caVEiitI2vg

                  Social engineering attacks are the single most common attack vector.

                  I say this as a former pentester.

                  1. 6

                    Other than ID and paperwork, in my country porting requires time (more than a week) and sends notifications. The SIM from the carrier I was porting from received SMS notifications after signing the paperwork. And I got a call from that carrier begging me to stay and asking for confirmation.

                    Nothing is perfect but not everything is as terrible as the horror stories involving U.S. carriers where a number was ported quickly and suddenly with zero communication via the old SIM prior to the switch.

                    1. 4

                      And you make way too many assumptions. You have absolutely no idea who I am, what I know, or what my situation is like. Yet you presume to lecture me about social engineering.

                      1. 3

                        In that case, do the rest of the world a favor and share with us the name of the one company on earth that is demonstrably absolutely immune to social engineering attacks. I imagine a lot of people would like to become their customers.

                        1. 2

                          If I have a SIM card from, say, Sudan, where customer service workers aren’t likely to speak any language other than Sudanese Arabic, while I live in, say, Germany and conduct most of my life online in German, then my attackers are most likely going to be German as well, and for them it’s almost impossible to social-engineer their way into porting my Sudanese SIM card.

                          We shouldn’t turn legitimate security advice into platitudes by stripping it of context and by being too generous in our assumptions about others.

                2. 3

                  I think the most serious one there is getting Wifi passwords from iPhones by spoofing a friend.

                  1. 4

                    I agree, but that notably requires their phone to be unlocked. Unfortunately, I can easily imagine people blindly accepting without paying attention to, or understanding, what they’re doing.

                3. 2

                  Reading through airdrop_leak.py in their repo, they don’t release the iMessage URL or what I assume is a homebrew Python SDK for the Truecaller API. The comments say it’s because they used reverse engineering.

                  You can presumably get the iMessage URL from a packet capture (albeit with an HTTPS MitM). For the Truecaller API, I assume they only reimplemented an existing JavaScript SDK in Python. If that’s the case, how could this, and not the reverse engineering of Apple’s BLE protocols, be considered RE?

                  Does anyone know where I can find documentation on what’s considered reverse engineering and what isn’t in the eyes of the law? I’m wondering if the publisher is just being careful or if the line is really that low.

                  1. 1

                    What happens if one disables AirDrop from the Settings > Restrictions menu? They don’t even mention that as a possibility.