1. 22
  1.  

  2. 12

    One of the most infuriating things about this is that it took more source code than a ChaCha20 implementation. Heck, even Dolbeau’s optimised implementation might be a bit smaller. And come on, it’s not that hard to read or watch introductory material on the subject and get a feel for what we’re getting into.

    This also reminds me of what I wrote a few years back:

    The running theme seems to be that cryptography is a kind of Dark Magic, best left to anointed High Priests. Us mere Mortals cannot hope to wield it safely without first becoming one of those vaunted Experts — a futile endeavour for those of us who know their place.

    While a good first-order approximation, that kind of gatekeeping is problematic on a number of levels. First, some people back in the real world have needs that current crypto systems don’t always fill. Insulting them with “you don’t know what you are doing” does not help. Second, it has several perverse effects:

    • It stops many reasonable people from studying the subject.
    • It causes less reasonable people to charge ahead anyway.
    • If cryptography is an Art, we could make things up as we go along. And invent “unbreakable” ciphers that are anything but.

    If you blindly keep the gates, people will climb the walls.

    1. 5

      You’re right that there’s a lot of gatekeeping, but claiming that something amateurish is “probably the most secure PRNG available” is arrogant and dangerous. Unfortunately, it’s probably also a better way of getting your code tested than asking people to review it.

      1. 2

        I agree with you on all counts. Here I believe this is a case of “If cryptography is an Art, we could make things up as we go along”.

        Speaking of which, I re-listened to The Great Roll Your Own Crypto Debate linked at the end of this post, and noticed Thomas Ptacek saying something eerily similar:

        The thing that I’m very worried about is the no-path thing where we need to democratize cryptography to the point where anybody can sit down, just, “I have a problem, it needs cryptography, here’s a shelf full of crypto potions, I’m just going to pour a couple of them into the cauldron and, you know, see what happens”

        And I think that that’s not an abstract or speculative concern. I think it’s the norm. I think it’s what really happens in real systems. And it gets people into trouble.

        I believe he’s right about the potions & cauldron. But I disagree about the cause: in their desperate efforts to keep the reckless and the incompetent out, advocates like him paint cryptography as a kind of black magic reserved for a higher caste. That’s how you get some Sorcerer’s Apprentice trying to put stuff into the cauldron, with catastrophic results: they think they have some idea of what they’re doing, in part because no one explained why putting things into the cauldron is dangerous.

        I don’t know for sure that my approach is better, but I do believe it avoids the reactance induced by a blanket “don’t roll your own crypto”.

        1. 2

          The potion analogy reminds me of the carefree days of children’s chemistry sets, when we gave kids a bunch of hazardous chemicals and an instruction booklet and let them have at it. This seemed like a good idea, then it started seeming like a bad idea, and educational toys became much more tame.

          On one hand, I’m happy that toys are less dangerous now. Some kids are reckless! Whether it’s because they can’t see danger or because they think they’re immortal, there are some people you don’t want goofing around with chemicals. We did a good thing by steering them away from chemistry.

          On the other hand, I mourn the potential chemists that we lost. There were kids that had the aptitude and the right temperament for chemistry, but who never discovered their talent for it. Maybe that chemistry set was their only opportunity to get hands-on with it, or maybe our cautionary language scared them away from seeking out other opportunities. We’ll never know for sure, but maybe one of those kids would have cured cancer when they grew up, and cracking down on chemistry sets caused a net loss of life.

          (to be fair, we might have also gotten another Thomas Midgley Jr., so maybe it’s a wash)

          Getting back to cryptography, the challenge is how do you keep the reckless from doing damage, without discouraging people who could make the world a better place? You can’t really gate access to one group but not the other. Perhaps the best you can do is teach humility: you are as capable of making mistakes as anyone else – especially when you think you’re correct – and cryptography is about defending against that, just as much as it is about defending against external attackers.

          1. 2

            (to be fair, we might have also gotten another Thomas Midgley Jr., so maybe it’s a wash)

            Thanks for the link to Midgley’s Wikipedia article.

            I find it fascinating that the summary of his work pins him as solely responsible for the negative effects of lead as a fuel additive and Freon. Surely some blame must be fixed on the economic systems that made these products desirable and distributed?

            1. 2

              Surely some blame must be fixed on the economic systems that made these products desirable and distributed?

              That would veer dangerously close to blaming capitalism itself. I personally wouldn’t disagree, but seriously discussing alternatives to the economic system we live in is way outside the Overton window.

              1. 2

                Eh, kinda. I think it’s more the pernicious attempt to shift blame from economics to individuals (see also “carbon footprints”).

                I also don’t think it’s something Wikipedia can be blamed for. It’s obvious that this accomplished chemist was involved in the development of two major products that now happen to be seen as really bad for the environment, and pattern matching in hindsight affixes the blame uniquely on him as an individual. Wikipedia just reflects this pop-history view.

    2. 11

      Then added “probably the most secure PRNG available” to the readme. OK, let’s go.

      Ahhhh, Cunningham’s Law combined with the Dunning-Kruger Effect, glorious. This is why my PRNG lib has in the readme “if you use this for crypto you will get what you deserve”.

      Results are actually pretty cool! I wish more people showed how to break PRNGs. Maybe I should make more outrageous claims in my readme files?

      1. 5

        TLDR: someone decided that by merging two xorshift RNGs - both explicitly not cryptographically secure - they would magically become cryptographically secure. When told that didn’t work, they doubled down despite clearly having no actual understanding of cryptography.
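        For anyone who hasn’t looked at this family of generators, here’s a minimal C sketch of a plain xorshift32 step (Marsaglia’s classic constants, not the exact code from that project). The point is that every operation is linear over GF(2), so the state can be recovered from observed outputs, and XORing two such generators together just gives you another linear generator.

        #include <stdint.h>

        /* One step of Marsaglia's xorshift32: fast and statistically decent,
         * but shifts and XORs are linear over GF(2). Here the returned value
         * IS the new state; even for larger xorshift variants, a few outputs
         * are enough to solve for the full state by linear algebra. */
        uint32_t xorshift32(uint32_t *state) {
            uint32_t x = *state;
            x ^= x << 13;
            x ^= x >> 17;
            x ^= x << 5;
            return *state = x;
        }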

        There’s a lot of talk here about gatekeeping cryptography, but there’s a very real reason for that. You absolutely can come up with your own secure cryptographic scheme, but doing so means you have to have an incredibly solid understanding of every primitive involved - not just “don’t use X” or a crypto 101 intro to RSA: you have to know the math, and you have to know the practical implementation details.

        The thing is that generally once people have that level of knowledge, they then deliberately choose not to design cryptographic protocols themselves, because by that point they know how hard it is.

        So you get this absurd case where people who do not know what they’re doing claim that saying “don’t do it yourself” is gatekeeping, just because they did a university crypto course or read a blogpost once and therefore “know” crypto.

        Gatekeeping implies some in-group exists trying to keep everyone else out. What you actually have is a group of people who are the experts, who would be the gatekeepers, saying “we also would not do this ourselves”. Any time I have worked in a situation that necessitated developing a new cryptographic protocol, it required significant numbers of people: mathematicians and cryptographers in addition to the engineers, and it generally included external auditing as well.

        If you fuck up crypto you can cause incredible harm to people, and fucking up crypto is incredibly easy. If even the experts say “we don’t do it ourselves”, maybe stop accusing them of gatekeeping and ask if you fully understand the issues.

        1. 6

          Since I started seriously dabbling with cryptography in winter 2016 (ultimately producing Monocypher and documenting Elligator), I identified 3 main groups in those discussion forums:

          • Cryptographers: Folks like Mike Hamburg, who know the maths quite deeply and maybe even contributed something significant like Curve448 or Ristretto. It’s a small, mostly quiet group.
          • Guardians: Folks like Thomas Ptacek, who spend much of their time and clout keeping the barbarians out of the gates. They don’t hold their criticisms back, and they don’t seem to mind the occasional false positive.
          • Acolytes: Folks who follow the guardians, blindly agree with everything they say, parrot their wisdom without nuance, and often argue from ignorance (they don’t know stuff, and tend to assume it’s so complex you don’t either).

          Me, I’d say I’m a Maverick. I learned a bit, realised I could actually push the envelope, and pushed through despite all the warnings telling me not to. One reason I ignored those warnings is that they were so obviously exaggerated.

          This netted me some scathing and unjustified criticism from Guardians who clearly didn’t bother to even look at my work. One of them even remembered me as having made “a lot of mistakes”, and politely suggested I shut up forever because of those alleged mistakes (for the record, I made one significant mistake, and one of the many reasons for this particular one was that when I asked around how to do the thing, I got crickets, with nobody to tell me the thing was dangerous).

          If even the experts say “we don’t do it ourselves”, maybe stop accusing them of gatekeeping and ask if you fully understand the issues.

          Personally I accuse people of gatekeeping because they wouldn’t let me in, even as I clearly demonstrated enough competence to design and write an entire cryptographic library that later held up to a formal audit. I’m past asking myself whether I understand the issues. I do, and I proved it. (Edit: OK, that was unnecessarily confrontational. I wrote it as if this remark was directed at me, while in fact it’s directed at the “Xor Shit” authors of the world.)

          Now what’s a beginner to do? I would say first read my piece and seek out introductory material. By then the beginner should have an idea of what they’re actually capable of, what they’re actually interested in, and where they want to go from there. And if some of them turn Maverick, they’ll likely be the kind of Maverick we want, instead of the kind of Nuclear Boy Scout I was once compared to.

          1. 2

            Acolytes

            Indeed, but that’s not unique to crypto. It’s a fairly generic “appeal to authority” and people will do that for far more than cryptography (appeal to Linus is another classic). That said, it’s also true that people who just reflexively say “don’t do X” are annoying, especially when they don’t offer an alternative.

            Now what’s a beginner to do? I would say first read my piece and seek out introductory material.

            You work with experts, maybe even becoming an expert, and even when you are an “expert” you continue to work with other experts whenever you need to create anything new.

            The point isn’t “you aren’t the chosen few, therefore you cannot write cryptographic code”, it is that no one can.

            Writing cryptographic code is uniquely challenging, and uniquely dangerous: it’s full of footguns that can be exceedingly subtle, and fucking up crypto can cause extreme harm to users. Developing any new cryptographic system (new primitives, new schemes, new protocols, or even just a new implementation of something that exists) needs to be done with numerous people who have experience working with cryptography.

            I want to be clear here: I may not be the most expert of all experts, but I have done a lot of cryptographic work (cryptographic engineering?) professionally and I will still always choose an existing protocol unless there’s an unavoidable obstacle that requires a new system. If a new system absolutely is needed, I would never consider designing or implementing it on my own.

            1. 1

              Writing cryptographic code is uniquely challenging, and uniquely dangerous

              I’m not sure I agree about “uniquely”. What is unique about cryptography is the culture. Which I attribute mostly to its roots in military applications. At the same time, there are so many ways to screw things up besides the cryptography. Buggy compilers can have utterly unpredictable consequences, and the vulnerabilities they can potentially induce could reach pretty much everywhere. What about parsing untrusted input? PDF readers are dangerous too. There’s also the use of unsafe languages, which often cannot be avoided for reasons of compatibility, performance, or available competence — at least short term. Something similar could be said of core infrastructure code that talks to the hardware.

              Cryptography has challenges for sure. It even has specific challenges (side channels, the fact that security reductions are so damn delicate, the fact that it’s so easy to have a working happy path yet no real security at all). But I’m not sure it’s that much harder than any other specialised programming field, or that screwing it up has more dire consequences than in all other fields.

              Would I actually write cryptographic code? Yes and no. I can write primitives. I’ve been taught how to test them. I can make sure there’s no timing attack on a given platform. But when it comes to inventing new protocols I’m not there yet. I think I can do it eventually, but there’s no way I can commit to a deadline here. But I do know the competence threshold I must cross: the ability to write proper security reductions, at least in the symbolic model.

              1. 2

                I’m not sure I agree about “uniquely”.

                If all of the code in a compiler is correct, the output is correct. If all of the code in a cryptographic application is correct, the encryption can still be fundamentally broken.

                What is unique about cryptography is the culture. Which I attribute mostly to its roots in military applications.

                Cryptography has decidedly not been driven by the military: governments have fairly universally fought public access to cryptography. Modern (post-1970s) cryptography has been mostly driven by commerce.

                Buggy compilers …. What about parsing untrusted input? PDF readers are dangerous too….

                Yup, and sufficient testing could in principle catch those errors. No amount of testing will identify flaws in cryptographic design.

                The core issue is that “correctness” in more or less every field other than cryptography is “code correctness”, whereas cryptography is “design” correctness. No amount of safe language, or careful development, will save you if you have put the primitives together in the wrong way. And unlike other complex systems where combining things incorrectly results in broken output, combining things incorrectly in crypto can easily produce a result that successfully encrypts, successfully decrypts, passes all tests for “does the output appear random”, and yet is still broken.
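                A concrete toy example (a C sketch with a hypothetical keystream() function standing in for a flawless primitive): the code below encrypts and decrypts perfectly and its output looks random, yet reusing a (key, nonce) pair - a pure design mistake - silently leaks the XOR of the two plaintexts.

                #include <stddef.h>
                #include <stdint.h>

                /* Hypothetical primitive, assumed 100% correct and leak-free:
                 * fills `ks` with `len` keystream bytes derived from key and nonce. */
                void keystream(uint8_t *ks, size_t len,
                               const uint8_t key[32], const uint8_t nonce[12]);

                /* XOR is its own inverse, so this both encrypts and decrypts. */
                void stream_xor(uint8_t *msg, size_t len,
                                const uint8_t key[32], const uint8_t nonce[12]) {
                    uint8_t ks[256];               /* toy sketch: assume len <= 256 */
                    keystream(ks, len, key, nonce);
                    for (size_t i = 0; i < len; i++)
                        msg[i] ^= ks[i];
                }

                /* The design flaw is invisible to happy-path testing: encrypt two
                 * messages with the same (key, nonce) and c1 ^ c2 == p1 ^ p2. */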

                1. 4

                  I think the thing that makes crypto unique is that it combines two requirements that are often at opposite ends of skill sets: mathematical rigour and low level systems understanding. Making a tiny mistake in the maths can fundamentally break the crypto system. Not understanding the details of the target machine can introduce side channels.

                  Things like RSA made this much worse, because it was easy to understand the basic mathematics, but very hard to write a side-channel-free implementation of the big integer arithmetic that it needed. Worse, picking primes to use as keys had a bunch of pitfalls, where knowing just enough maths to understand the basic system could still lead you to pick weak keys.
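                  To make the side-channel point concrete, here is a toy square-and-multiply sketch in C (word-sized integers rather than real RSA bignums, purely for illustration): the branch taken on each secret exponent bit makes the running time depend on the key, which is exactly the kind of leak a functionally correct implementation can still have.

                  #include <stdint.h>

                  /* Toy modular exponentiation: computes (base^exp) mod mod.
                   * Functionally fine for small numbers, but the secret-dependent
                   * branch below leaks exponent bits through timing. */
                  uint64_t modexp_naive(uint64_t base, uint64_t exp, uint64_t mod) {
                      uint64_t result = 1 % mod;
                      base %= mod;
                      while (exp > 0) {
                          if (exp & 1)                 /* branch on a secret bit */
                              result = (result * base) % mod;
                          base = (base * base) % mod;
                          exp >>= 1;
                      }
                      return result;
                  }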

                  I think this has changed somewhat with newer algorithms. A lot of newer algorithms are designed specifically to operate on random data as keys, so that an implementer doesn’t have to understand how to go from some seeded input to something the crypto system understands. Avoiding side channels is still hard though and is the reason that most modern CPUs have things like AES instructions: they may be faster, but they definitely will execute in constant time (which is much easier to guarantee with a circuit than with a sequence of instructions).

                  1. 1

                    I think the thing that makes crypto unique is that it combines two requirements that are often at opposite ends of skill sets: mathematical rigour and low level systems understanding.

                    Hey, I didn’t think of that. For each one I’m like “duh of course we need that”, but I kind of forgot how rare the combination actually is.

                    Avoiding side channels is still hard though and is the reason that most modern CPUs have things like AES instructions: they may be faster, but they definitely will execute in constant time (which is much easier to guarantee with a circuit than with a sequence of instructions).

                    I don’t know about other side channels, but timing attacks at least are not too hard to deal with if you stick to the easy primitives (not AES). ARX designs like SHA-2, ChaCha20, or BLAKE2 for instance are naturally immune to timing attacks. Bignum arithmetic is still delicate, but if you stick to pseudo-Mersenne primes that’s fairly manageable (the real Unspeakably Awful Indescribable Horror there is partial carry propagation).

                    Multiplication is enough of a bear though that I chose to ignore it for now.
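                    Back to the ARX point, for illustration here’s the ChaCha20 quarter round in C (written from memory, so check it against RFC 8439 before trusting it): nothing but additions, fixed-distance rotations, and XORs, with no secret-dependent branches or table lookups, which is why timing attacks are not much of a concern for this family.

                    #include <stdint.h>

                    #define ROTL32(x, n) (((x) << (n)) | ((x) >> (32 - (n))))

                    /* ChaCha quarter round: add, rotate, xor only. */
                    static void quarter_round(uint32_t *a, uint32_t *b,
                                              uint32_t *c, uint32_t *d) {
                        *a += *b;  *d ^= *a;  *d = ROTL32(*d, 16);
                        *c += *d;  *b ^= *c;  *b = ROTL32(*b, 12);
                        *a += *b;  *d ^= *a;  *d = ROTL32(*d, 8);
                        *c += *d;  *b ^= *c;  *b = ROTL32(*b, 7);
                    }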

                  2. 1

                    If all of the code in a compiler is correct, the output is correct. If all of the code in a cryptographic application is correct, the encryption can still be fundamentally broken.

                    Not sure what you’re getting at here. In my book correct code is code that fulfils its requirements, and as such cannot be broken. And obviously (to me), code that correctly implements an incorrect design is still incorrect.

                    And unlike other complex systems where combining things incorrectly results in broken output, combining things incorrectly in crypto can easily produce a result that successfully encrypts, successfully decrypts, passes all tests for “does the output appear random”, and yet is still broken.

                    Indeed. That’s why we need proofs and security reductions. You don’t need to twist the definition of “correct” to explain that, though.

                    1. 1

                      Not sure what you’re getting at here. In my book correct code is code that fulfils its requirements, and as such cannot be broken. And obviously (to me), code that correctly implements an incorrect design is still incorrect.

                      The whole point here is that you cannot tell if a design is incorrect, and therefore broken, if you don’t have a lot of additional understanding and knowledge of cryptographic systems. Coupled with an incorrect design resulting in correct appearing behaviour.

                      But I’m done going through this, you’ve already stated that you don’t accept that designing correct cryptographic systems is more difficult than other systems, or that it requires more people to do so safely.

                      1. 1

                        you’ve already stated that you don’t accept that designing correct cryptographic systems is more difficult than other systems,

                        Not quite. I fully admit that cryptographic protocol design ranks quite high in the difficulty contest. I’m pretty sure, however, that some (possibly not many) other programming jobs are even more difficult. Now designing a cryptographic primitive like a stream cipher or a hash, I put in the “flat out impossible” category. That stuff requires an entire research community and years or even decades of work.

                        Your wording (“cryptographic systems”) could conflate the two categories, and I do insist they are nothing alike.

                        1. 2

                          Your wording (“cryptographic systems”) could conflate the two categories, and I do insist they are nothing alike.

                          Which is a problem, because the whole point of the “don’t roll crypto yourself” argument is that you can take a bunch of primitives, implemented hypothetically 100% correctly with no leaks of any kind, but if you put them together wrong, it’s broken. A huge amount of the “entire research community and years or even decades of work” is spent on the protocols using the primitives, not just the primitives themselves. The reason we use TLS today is because the SSL 1 and 2 protocols resulted in attackable encryption, despite using correct primitives, and despite huge communities working for years to develop them. TLS took years to design and develop despite the entire set of primitives already existing.

                          The exact problem people are concerned about when saying “don’t roll your own crypto” is exactly what you’re demonstrating here: “The maths for primitives is hard so I won’t do that; designing a protocol just means saying how to paste the output of those existing operations together, so that’s easy”. It’s an incorrect belief, and it’s what leads to the simpler mantra of “don’t roll your own crypto” rather than “use an existing crypto protocol if possible, and try to avoid using primitives directly” - the latter might be “less gatekeepery”, but it’s wordier, and it would likely make people less wary of using a bunch of primitives together directly, since no one is just saying “don’t do this”. Just because something seems easier to understand doesn’t mean it is easy or simple. I get that human brains like to make that leap - it’s why people keep creating new systems using RSA: they “understand RSA”, so they choose it over the more complicated ECC, despite ECC being what you want nearly 100% of the time - but you have to fight that kind of internal bias.

                          1. 1

                            Which is a problem, because the whole point of the “don’t roll crypto yourself” argument is that you can take a bunch of primitives, implemented hypothetically 100% correctly with no leaks of any kind, but if you put them together wrong, it’s broken.

                            Not historically. The saying used to refer to the invention of new primitives: “don’t roll your own crypto, use AES instead”. And then people used ECB, forgot to authenticate, fell prey to length extension attacks, used variable time comparisons… so eventually the saying extended to designing your own protocols, and we started saying “use authenticated encryption”, “use authenticated key exchange”, or “use TLS” (not sure that last one is such a good idea, considering it basically means grappling with OpenSSL).
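                            The variable time comparison one at least has a well-known fix. Here’s a sketch of the usual constant-time comparison (assuming both buffers have the same length), as opposed to memcmp, which typically returns as soon as it finds a difference:

                            #include <stddef.h>
                            #include <stdint.h>

                            /* Returns 1 if equal, 0 otherwise. The running time does not
                             * depend on where (or whether) the buffers differ. */
                            int ct_equal(const uint8_t *a, const uint8_t *b, size_t n) {
                                uint8_t diff = 0;
                                for (size_t i = 0; i < n; i++)
                                    diff |= a[i] ^ b[i];
                                return diff == 0;
                            }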

                            Recently I saw the trend go even further. See this post from @cendyne where @soatok writes:

                            Tech bloggers, whether or not they are also cryptographers, are not your cryptographer. At minimum, they don’t know your threat model or systems designs. This is an incredibly specialized domain that’s easy to get wrong. Hire a cryptographer.

                            So:

                            • Don’t invent your own primitives. Okay.
                            • Don’t invent your own protocols or constructions. Why not.
                            • Don’t implement primitives & protocols yourself. Many have to anyway.
                            • Don’t use cryptography without a specialist. Seriously?
                            1. 1

                              The saying used to refer to the invention of new primitives: “don’t roll your own crypto, use AES instead”

                              AES is a primitive. The protocol is the thing that defines how you use said primitives.

                              And then people used ECB, forgot to authenticate, fell prey to length extension attacks

                              e.g. protocol issues

                              • Don’t invent your own protocols or constructions. Why not.

                              Because an incorrect protocol or construction completely undermines any protection you get from primitives

                              Don’t implement primitives & protocols yourself. Many have to anyway.

                              Less sold on this being a no-go - to me this would be a “use an existing implementation if possible”, not a “don’t ever do this”.

                              Don’t use cryptography without a specialist. Seriously?

                              I don’t see anyone saying that.

                              1. 1

                                Looks like we pretty much agree there. One last nitpick:

                                Don’t use cryptography without a specialist. Seriously?

                                I don’t see anyone saying that.

                                That’s how I interpreted the quote I linked to: “This is an incredibly specialized domain that’s easy to get wrong. Hire a cryptographer.” Also confirmed (and nuanced) by the author himself in a sibling comment: “If you’re doing anything more sophisticated than “use HTTPS” or “enabling disk encryption”, you’re better off having a specialist review your use case, threat model, and implementation than not.”

                                And sadly there is a complexity threshold above which a security specialist indeed is needed. So it’s not like I completely disagree with it. But the majority of use cases (client-server security, peer-to-peer security, registration/login, volume encryption, file encryption, secure messaging…) should all have an easy-enough-to-use solution that can be picked up by any competent programmer. And that solution should be easy enough to find too. I’m afraid we may not be quite there yet however, and that’s a problem. But I’m confident that with minimal training (a couple of days, or even hours), anyone could handle the main use cases by themselves.

                              2. 1

                                If you’re doing anything more sophisticated than “use HTTPS” or “enabling disk encryption”, you’re better off having a specialist review your use case, threat model, and implementation than not.

                                If someone wants to, I dunno, encrypt shit willy nilly in a SQL database? Fine, but don’t blame us if you do it wrong. We didn’t give you permission to do naive and easy things in our writing.

                                Can’t hire a specialist? Don’t have time to become one? You’re going to do what you want to at that point.

                                1. 1

                                  I recall something like that applying to pretty much all domains. Off the top of my head, the quote went something like this:

                                  When a project has special demands on databases, we call in database specialists. When it has special demands on the operating system, we call in OS specialists. But one thing pretty much every project places special demands on is programming languages, and yet we don’t call in programming language specialists.

                                  I would agree that encrypting an SQL database is definitely “putting special demands on cryptography”. But I do hope we can lay out a relatively small number of standard practices that would cover most of the use cases.

                                  Fine, but don’t blame us if you do it wrong.

                                  Whoever does that seriously needs to rethink their life choices.

                                  1. 2

                                    The trouble is, you kind of have to become somewhat of a specialist to encrypt stuff in a database well. I wrote this to make the topic more accessible, but it ended up becoming like 40 pages of Print to PDF in length.

                                    https://soatok.blog/2023/03/01/database-cryptography-fur-the-rest-of-us

                                    1. 1

                                      Yes, I’ve read it, it’s excellent — and quite terrifying. Unfortunately I have to agree with you there.

                                      Still, I get the feeling cryptography is getting kind of a special treatment: it’s not the only high-stakes field where subtle, hard-to-catch mistakes can have catastrophic consequences. Other stuff is just as problematic, starting with anything that parses untrusted input: PDF readers, XML parsers, video players… even if you use a memory-safe language there is still room for error.

                                      And some errors in cryptographic systems have nothing to do with cryptography. Say for instance a programming student tries to implement SHA-256 on the newest fancy micro-controller. The thing’s naturally constant time, so it should be easy, right? Well it is, if you know how to test. And in my experience few programmers do.
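                                      Just to show what I mean by testing, here’s a minimal check against an official test vector (the sha256() prototype here is hypothetical, standing in for the student’s own implementation, whatever its actual signature):

                                      #include <stdint.h>
                                      #include <stdio.h>
                                      #include <string.h>

                                      /* Hypothetical prototype for the implementation under test. */
                                      void sha256(uint8_t hash[32], const uint8_t *msg, size_t len);

                                      int main(void) {
                                          /* NIST test vector: SHA-256("abc") */
                                          static const uint8_t expected[32] = {
                                              0xba,0x78,0x16,0xbf,0x8f,0x01,0xcf,0xea,
                                              0x41,0x41,0x40,0xde,0x5d,0xae,0x22,0x23,
                                              0xb0,0x03,0x61,0xa3,0x96,0x17,0x7a,0x9c,
                                              0xb4,0x10,0xff,0x61,0xf2,0x00,0x15,0xad,
                                          };
                                          uint8_t hash[32];
                                          sha256(hash, (const uint8_t *)"abc", 3);
                                          puts(memcmp(hash, expected, 32) == 0 ? "OK" : "FAIL");
                                          return 0;
                                      }

                                      One vector isn’t enough, of course: message lengths around the 55/56 and 64-byte block boundaries are where padding bugs love to hide.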

                                      So here we have a cryptographic task where the biggest difficulty is not cryptographic. I suspect one reason this difficulty even exists is because we didn’t take everything else seriously enough. If we did people would know how to test. Heck, I even suspect that if we took the other programming fields seriously enough, much of cryptography (not all) would be a lot easier.

            2. 6

              Schneier’s Law strikes again:

              Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break.