1. 49
  1. 8

    I think this is not correct.

    A lot of technical messages are released in tongue-in-cheek, non-professional, friendly ways.

    This makes technical content accessible to more people. It shares the culture that matters to us, including a willingness to take down establishments when they no longer serve a purpose, referring to history where applicable, and not choosing a new technology just because it’s new.

    When we get more professional, we take a lot of that culture out of it. We take the Unix Philosophy (not a checklist but guiding principles) out of our messaging, out of our designs, and thus out of popular tooling.

    The culture that brought us end to end encryption, and the culture that popularized it, needs to be preserved and shared. If that means sometimes we provide ammunition for those who find us scary, then I’m OK with that. It’ll be better to win on our realities than to adopt a tamer reality to avoid rocking the boat.

    1. 11

      When we get more professional, we take a lot of that culture out of it. We take the Unix Philosophy (not a checklist but guiding principles) out of our messaging, out of our designs, and thus out of popular tooling.

      I’m sorry, but this makes no sense. How does the “Unix philosophy” have anything to do with professionalism?

      1. 3

        They are both us conveying messages.

      2. 3

        I think it’s better to make a case that can convince people who don’t already agree with you. Cheerleading only gets you so far.

        I was super pleased with the announcement at first, but this author makes a lot of really good points. Especially the one about legality - it’s a point I constantly bring up in conversations about cryptocurrency, that exploits and “cleverness” can be shut down by a reasonable judge or lawyer, and that’s a feature. I seriously hope Signal’s only action is as @gcupc said, because I like Signal.

        1. 3

          I think Cyberlaw’s main grump is that it will not play well in court.

          If Moxie ever ends up in court over this, he has played his cards badly wrong, and the blog post is the least of his problems.

          The aim of the blog post is to make them sleep less well at night, no more, no less.

          If he is sensible he will never trigger this in any jurisdiction where he can be dragged into court.

          In the nasty jurisdictions he is unhappy about, they’d already, right now, quite happily disappear him and the legal process doesn’t even enter the discussion.

          Of course, proving it was Moxie and not Cellebrite that injected a backdoor into the state-security apparatus of one of these places might be a bit hard… and such goons tend to, ahh, verify the claims or disclaimers of whoever is in reach with a rubber hose.

          So if a backdoor did appear in a nasty state’s security apparatus, Cellebrite’s salesman can blame Moxie until he is blue in the face… they might even believe him, but the waterboarding will just keep on coming.

          Just to be sure.

        2. 7

          The whole thing feels weirdly like an attack on Signal, with this as the excuse.

          It’s not like Cellebrite are the good guys here and their software is certainly in the greyest zone.

          Also, I find it weird that the author refers to the creator of the app as “Signal”, the same name as the app itself; that makes things confusing.

          1. 5

            To pick up a thread from other discussion forums: Is it legal in general to equip a personal phone with software which resists data exfiltration? It is currently legal in the USA to encrypt files on a personal phone, although it’s not yet clear whether the 5th Amendment protects encryption keys. Meanwhile, booby traps can be criminal if they harm the offender; quoting the author:

            They seem to be implying — or at least they seem to intend for the reader, and more importantly Cellebrite and its customers, to infer — that Signal will add “innocuous” code to their app that might, maybe, alter the data on a Cellebrite machine if the phone gets plugged into it. If they’re saying what they’re hinting they’re saying, Signal basically announced that they plan to update their app to hack law enforcement computers and also tamper with and spoliate evidence in criminal cases. When you put it that way, it becomes clear why they were using such coy language and why I bet they’re bluffing: Those things are illegal.

            This seems reasonable and well-supported by case law. Cellebrite’s technology has two stages; the first stage copies data from the target phone and the second stage analyzes that data. Since the analysis doesn’t take place on the target phone, the data ought not simply exploit whoever attempts to decrypt it. When the offender is a law enforcement officer, then any exploit I can think of – even the harmless ones – would be quite criminal indeed.

            So, this rules out the idea that we can use anything other than mere encryption to protect ourselves.

            1. 4

              But encryption is not ever mere, meek, or humble, is it?

              Suppose that we have data which we want to protect from exfiltration. Our main protection will be encryption: The data will be stored as pseudorandom ciphertext. Before encryption, though, let’s presume that the data is compressed, so that the plaintext is pseudorandom as well.

              Since the ciphertext and plaintext have similar entropy, the offender is obliged to try not just decrypting the ciphertext, but also parsing it. Since the offender desires many different formats of evidence (video, audio, photo), they are also obliged to try every parser on every ciphertext. A fault in a parser can cause many sorts of behavior; given that we have a survey of vulnerability counts already on just one analyzed third-party component, and knowing that half of those vulnerabilities are denial-of-service, we may safely assume that there is at least one fault which will deny the offender any further access to our data.
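              The compress-then-encrypt argument above can be sketched with stdlib tools. This is a toy, with a SHA-256 counter-mode keystream standing in for a real cipher (use a vetted AEAD in practice); the point is only that both the compressed plaintext and the ciphertext look pseudorandom, so an analyst can’t tell them apart by entropy alone:

              ```python
              import math
              import zlib
              import hashlib
              from collections import Counter

              def entropy_bits_per_byte(data: bytes) -> float:
                  """Shannon entropy of the byte distribution (maximum 8.0)."""
                  n = len(data)
                  counts = Counter(data)
                  return -sum(c / n * math.log2(c / n) for c in counts.values())

              def toy_keystream(key: bytes, n: int) -> bytes:
                  """SHA-256 in counter mode -- an illustration only, NOT a real cipher."""
                  out = bytearray()
                  block = 0
                  while len(out) < n:
                      out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
                      block += 1
                  return bytes(out[:n])

              # Structured plaintext: low entropy per byte.
              plaintext = b"".join(b"message %06d: the quick brown fox\n" % i
                                   for i in range(2000))
              compressed = zlib.compress(plaintext, 9)  # already pseudorandom-looking
              ciphertext = bytes(a ^ b for a, b in
                                 zip(compressed, toy_keystream(b"demo-key", len(compressed))))

              print(round(entropy_bits_per_byte(plaintext), 2))   # well below 8
              print(round(entropy_bits_per_byte(compressed), 2))  # close to 8
              print(round(entropy_bits_per_byte(ciphertext), 2))  # close to 8
              ```

              Since compression already pushes the byte distribution toward uniform, nothing short of decrypting and then successfully parsing the result distinguishes ciphertext from a merely compressed file.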

              What’s the likelihood that such a faulty input shows up in a pseudorandom file? It’s improbable, sure, but not impossible, and this grants a modicum of plausible deniability, especially if there are so many faults that such parser crashes are a common user experience. At that point, as the author suggests, XKCD 538 is more relevant than any theory. (Similarly, I split this from my first comment in case this comment might be moderated.)

              But I would (optimistically) imagine that this is neither interference with law enforcement activity, nor spoliation of evidence, simply to have an encrypted file which happens to crash MPEG parsers; the ciphertext is password-protected and no amount of MPEG parsing was ever going to reveal that password, and the password is (in my imagination) securely behind the 5th Amendment.

            2. 3

              The author points out that the language about “aesthetically pleasing” files being injected is coy, and that OWS are bluffing about putting exploit packages into the Signal cache. My guess? They will not inject exploit packages (because of liability), but will inject something that will make police analysts’ lives worse purely through aesthetic means. I’m thinking goatse, tubgirl, and, at a minimum, pigpoopballs.

              1. 3

                that the Signal app is going to be updated so as to hack police computers

                It’s not illegal to buy a dog. It sure is illegal to order it to attack a police officer, but having it guard your property is perfectly legal, isn’t it?

                Also, exploiting Cellebrite does not equal exploiting the police, because even the Signal authors were able to get their hands on Cellebrite’s software. That means they are protecting us from the bad guys too, for example the Israeli secret service.

                1. 2

                  I think publishing that blog post was more like putting up a “beware of dog” sign than actually buying and training a dog.