1. 38

Note: I’m the author of Monocypher, not the audit. Now I feel like I entered the big league. :-)

  1. 11

    Oh cool, I opened Lobste.rs to find something to read and saw this headlining the site! It was a pleasure to work on this audit.

    1. 3

      I couldn’t let your work go unseen after all. :-)

      1. 3

        But now for the burning question on everyone’s mind: how much did it cost?

        1. 3

          Looking at my application, it cost $7,000, all paid by the OTF.

    2. 2


      Aside from it being good practice (which I guess implies a higher level of trust from security-conscious people?), are there any other benefits to being audited? Does this make it eligible to be run in some environments? Or make it a viable option for some standards?

      1. 9

        Thanks :-)

        As far as I know, eligibility tends to increase with the user base: the more people use it, the more others feel safe using it. An audit certainly increases confidence, and with it, eligibility. Personally, the audit is a big reason why I now feel confident recommending Monocypher in a corporate setting. Before the audit, my own conflict of interest (choosing the best option vs choosing my option) always gave me pause.

        Standards are different. Monocypher only implements existing standards & specifications. It could be used as a reference implementation, but that’s about it.

        1. 6

          Does this make it eligible to be run in some environments?

          The audit is unfortunately relatively meaningless in that context. Highly regulated environments tend to insist on either ISO or NIST standards and require specific certification for them. Monocypher does not implement any of them (though Ed25519 may become part of FIPS 186-5 the way things have been going).

          1. 3

            Yeah, there tends to be a fairly long delay between “good” and “standard”. I get the vibe that standardisation bodies don’t trust themselves to assess cryptographic primitives and constructions. Being overly conservative is the only rational choice in this circumstance.

        2. 2

          Congratulations to them! I hope this isn’t the last one!

          1. 3

            For Cure53? I sure hope they continue auditing lots of stuff.

            For Monocypher? As much as I’d like more audits, I believe the library has stabilised enough that I won’t need to initiate another audit for the next 10 years.

            1. 2

              Ah! Well if it’s not going to change a lot in the future, then indeed it’s nice that the community will be able to rely on those results for a significant period of time!

          2. 1

            A question I may well need debunked, about post-quantum cryptography: is it something to worry about this early? I feel like this is for crypto people to tell non-crypto people, rather than the other way around.

            I have the impression that it is more about carefully studying how ciphers hold up against the threat than about finding a silver bullet.

            1. 3

              Should we be bothering with research and serious implementations? Yes. Quantum computers are an inevitability, and it’d be nice to be ready when they arrive.

              Should we be putting them in production? Probably not. Many NIST post-quantum cryptography candidates are still getting attacked left and right. And there’s a non-zero chance that the result will still either be impractical, patent-encumbered or both.

              1. 2

                Being able to build large enough quantum computers to break current asymmetric cryptography is definitely not inevitable. There are many issues that may end up making it physically impossible to make such a computer that runs long enough to do such a computation. Of course, it is prudent to assume it will happen and develop resistant cryptography in the meantime.

            2. 1

              Would there be benefits to using it in existing projects, such as the classics (TLS, SSH, PGP…)? Or is the benefit only noticeable for new projects, which don’t yet have a (too) large crypto code base in use?

              1. 2

                Monocypher is focused on a small number of modern primitives. That makes it incompatible with most of what those old standards need. No AES, no RSA… So I’d say new projects only.

                In addition, Monocypher is a low-level crypto library: a toolkit with which you can build higher-level protocols. For instance, I’m currently working on authenticated key exchange with Monokex. Or you could build Noise.
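                A toy sketch of what “building on a low-level lock/unlock toolkit” looks like, using only Python’s stdlib hashes in place of Monocypher’s primitives. The mac-plus-ciphertext shape mirrors Monocypher’s crypto_lock/crypto_unlock outputs, but the construction itself (a Blake2b counter-mode keystream plus HMAC) is purely illustrative, not a secure or compatible design:

```python
import hashlib
import hmac

def _keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: Blake2b in counter mode, XORed into the data."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.blake2b(nonce + counter.to_bytes(8, "little"), key=key).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def lock(key: bytes, nonce: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt-then-MAC; returns (mac, ciphertext), echoing crypto_lock's shape."""
    enc_key = hashlib.blake2b(b"enc", key=key).digest()
    mac_key = hashlib.blake2b(b"mac", key=key).digest()
    ciphertext = _keystream_xor(enc_key, nonce, plaintext)
    mac = hmac.new(mac_key, nonce + ciphertext, hashlib.blake2b).digest()[:16]
    return mac, ciphertext

def unlock(key: bytes, nonce: bytes, mac: bytes, ciphertext: bytes):
    """Verify the MAC first; return the plaintext, or None on forgery."""
    enc_key = hashlib.blake2b(b"enc", key=key).digest()
    mac_key = hashlib.blake2b(b"mac", key=key).digest()
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.blake2b).digest()[:16]
    if not hmac.compare_digest(mac, expected):
        return None
    return _keystream_xor(enc_key, nonce, ciphertext)

key, nonce = bytes(32), bytes(24)
mac, ct = lock(key, nonce, b"attack at dawn")
assert unlock(key, nonce, mac, ct) == b"attack at dawn"
assert unlock(key, nonce, bytes(16), ct) is None  # forged MAC is rejected
```

                The point is the layering, not the crypto: a protocol like a Noise pattern is built by composing exactly this kind of small lock/unlock and key-exchange pieces.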

                1. 2

                  Forgive the possibly ignorant question, but would Monocypher be useful for encrypting traffic between two servers? I’m in need of encryption in a distributed system where SSL certificates would be unreasonably expensive and self-signed is not acceptable.

                  1. 3

                    It would be, but you’d need to implement an existing protocol (such as a suitable Noise pattern) that provides the security guarantees you want.

                  2. 1

                    I like the idea of small, strongly built, loosely coupled building blocks on top of which higher-level parts can be implemented.

                2. 1

                  As a non-cryptographer, I am curious about the promises Blake3 offers, and whether it is worth considering it instead of Blake2. I saw Blake2 in Monocypher?

                  Given that it is a crypto primitive, it may not work as simply as bumping a dependency from version 2 to 3 (different output lengths, variants…), or it may! Any major change like this one could also require a new audit (no idea).

                  1. 3

                    Blake3 is essentially Blake2s, with two differences:

                    • The core loop can be run in parallel. That enables vector instructions, making it much faster on modern processors.
                    • The number of rounds is reduced. This reduces the security margin, but it is also faster.

                    Personally, the reduced rounds make me a little nervous. The parallelisation, however, is a killer feature: it allows Blake3 to stay 32-bit and still be fast on big processors. That makes it a one-size-fits-all algorithm, much like Chacha20.

                    Bumping from Blake2b to Blake3 would not require a new audit in my opinion. Blake3 is a symmetric primitive based on an ARX design (Add, Rotate, Xor), which makes it easy to implement, and very easy to test: just try every possible input length from zero to a couple of block lengths, then compare the results with a reference implementation.
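                    The length-sweep test described above can be sketched with the stdlib’s hashlib.blake2b standing in for both sides of the comparison; here the “implementation under test” is just incremental (chunked) hashing, checked against one-shot hashing. Swapping in a real second implementation is the only change needed:

```python
import hashlib

BLOCK = hashlib.blake2b().block_size  # 128 bytes for Blake2b

def blake2b_incremental(data: bytes, chunk: int = 7) -> bytes:
    """Stand-in for the implementation under test: hash in awkward 7-byte chunks."""
    h = hashlib.blake2b()
    for i in range(0, len(data), chunk):
        h.update(data[i:i + chunk])
    return h.digest()

# Try every input length from zero to a couple of block lengths,
# and compare each result against the one-shot reference.
for size in range(2 * BLOCK + 1):
    msg = bytes(i % 256 for i in range(size))
    reference = hashlib.blake2b(msg).digest()
    assert blake2b_incremental(msg) == reference, f"mismatch at length {size}"
```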

                    Now, if I were to redo Monocypher today, I would consider Blake3. There’s just one problem: Argon2i, which is based on Blake2b. I could repeat the shenanigans I did for EdDSA (allow a custom hash interface, provide Blake3 by default, as well as a Blake2b fallback), but that would mean more optional code and a more complex API, all for a limited benefit. I believe Blake2b makes a slightly better trade-off for now, even though many of my users are embedded programmers.

                    1. 4

                      and very easy to test: just try every possible input length from zero to a couple of block lengths, then compare the results with a reference implementation.

                      There are machine-parseable test vectors that test various edge cases as well.

                      The existing BLAKE2b API in Monocypher would need to be broken anyway because of the mandated API design with a context string.

                      Edit: Also, why is Argon2i an issue? As far as I’m aware, Monocypher implements it from the spec, which is notoriously incompatible with the reference implementation. So if Monocypher is already incompatible with every other implementation under the sun (which are all just derivatives of the reference implementation), why would you bother caring about the hash function used in Argon2i?

                      1. 2

                        Monocypher is compatible with both the reference implementation and Libsodium. The divergence from the spec is explicitly noted in the source code.

                        Also, one of the authors said the specs “will be fixed soon”, so that’s a clear sign that everyone should conform to the reference implementation’s error rather than to the specification. (And yeah, he made that promise over 3 years ago, and the specs still have not been fixed.)

                      2. 1

                        Thank you for the overview. I understand the balance better now.

                        And obviously, thank you for Monocypher!