1. 59
    1. 14

      But even so, systems using the LUKS2 header format used to default to argon2i, again not a memory expensive KDF. New versions default to argon2id, which is. You want to be using argon2id.

      Please forgive me, but what the fuck is this?

      I’ve implemented all versions of Argon2, and the claim that any one variant is somehow not memory hard while the others are is… well, new. Here’s what I’m aware of:

      • Argon2d is not constant time, and an attacker observing timings could potentially reduce its strength to a couple of Blake2b calls (not memory-hard at all). Without timing leaks, however, it’s the strongest of the three.
      • Argon2i is constant time, and immune to timing attacks. It achieves this with publicly known memory access patterns, which effectively weakens its memory hardness and in practice gives attackers a bigger advantage. This is why, for strong protection, we use 3 passes instead of just one.
      • Argon2id is a blend of the two. Part of it (the first half of the first pass) is constant time just like Argon2i, and the rest is not (just like Argon2d). As far as I’m aware this gives it roughly the mathematical strength of Argon2d, but it could be reduced to the strength of the first half of its first pass if attackers have timing information. Simply put, it provides some side-channel resistance, but not as much as Argon2i.

      Now I’m assuming perfect timing attacks. I’m not aware of any having been performed, or even being possible at all, so I don’t know the best trade-off there; as far as I know it is highly threat model dependent, and it is not clear to me which would be better. I’m pretty sure about one thing though: Argon2i is not a bad default. Just use 3 passes with as much memory as is tolerable (I personally set my password manager to 1 second), and you’re done. Argon2id is better when you don’t fear timing attacks too much, but unless there’s a new attack I’m not aware of, Argon2i remains memory hard.

      Summoning @soatok in case I missed something, and here’s a link to /r/crypto as well.

      1. 4

        So the takeaway is “general usage: pick argon2i; for disk encryption specifically, argon2id is stronger”, because timing attacks are not a concern for disk encryption.

        1. 3

          I could advise something like that. I do have to concede however that lately Argon2id tends to be recommended as the default. Libsodium changed its default from Argon2i to Argon2id a few years ago (likely 2017), and RFC 9106 does not recommend Argon2i at all. (In fact the RFC considers Argon2id to be the primary variant. Funny that historically it was the last to be introduced.)

          For some time there was this idea that Argon2id was immune to timing attacks. If that were true it would utterly dominate Argon2i. Unfortunately it’s not that simple, so I need a deeper explanation. I don’t know, maybe there currently is no good side-channel attack on Argon2id? Or maybe there is, but the chances of timing attacks are low enough that it’s worth the trade-off?

      2. 3

        Note: The author has kindly retracted their error; the claim is now struck through.

        Not sure about their new claim about GPU attacks though. The only thing I can say about that is that for a fixed time budget Argon2i uses 3 times less memory in practice (because of its 3 passes), and that could indeed make it a little more vulnerable. But if we’re talking strong password hashing as used for full disk encryption this is still a crapton of memory (at least 300MB), so I’m not sure it matters that much.

        Personally I prefer to talk about the better, more theoretical attacks, which give us a better idea of what we’re up against in the long term (or with state-level attackers, which is very much the case with our anarchist friend). Per the RFC (published July 2021), the best reported attacks when using 1GB of memory give attackers the following advantages (smaller is better):

        • Argon2d: 1.33
        • Argon2i (1 pass): 5
        • Argon2i (3 passes): 3
        • Argon2id (1 pass): 2.1
        • Argon2id (2+ passes): 1.33

        For constant-time defenders (those who can spend a fixed amount of time on each hash), the strongest options are:

        • 1 pass Argon2d: 1.33
        • 1 pass Argon2id: 2.1
        • 3 pass Argon2i: 3 (likely less)

        Now, if an attacker can mount a magical timing attack that reveals all secret-dependent access patterns, I think we get the following:

        • 1 pass Argon2d: ~infinite
        • 1 pass Argon2id: 10
        • 3 pass Argon2i: 3

        Argon2d reduces to a fast hash, which destroys its purpose as a password hash. That’s why it was never recommended for regular password hashing where timing attacks are a concern (like a PC where untrusted programs may be running). Argon2id gets its initial advantage multiplied by 5 (ouch). Argon2i is unaffected, and is now the winner. Still, in relative terms the differences aren’t that big:

        • Without side channels, Argon2id wins by a factor of 1.4.
        • With side channels, Argon2i wins by a factor of 3.3.

        If side channels are a concern but not a certainty I’d be hard pressed to determine which is the better candidate.

        One caveat: I may have painted Argon2i in a better light than is warranted: because it uses 3 passes, it also uses a third of the memory for a given time budget, and that makes it weaker in practice. I expect however that the effects are subtler than what I’ve just outlined.
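
        Those relative factors are plain arithmetic on the RFC 9106 numbers quoted above; a quick sanity check (the dictionary labels and the ~5x timing-leak multiplier for Argon2id are my assumptions from the discussion, not figures from the RFC itself):

```python
# Best reported attacker advantages at 1GB, as quoted above from
# RFC 9106 (smaller is better for the defender).
ADVANTAGE = {
    "argon2i_3pass": 3.0,
    "argon2id_1pass": 2.1,
}

# Assumption: a perfect timing attack multiplies Argon2id's advantage
# by ~5 (2.1 -> 10) while leaving Argon2i untouched.
ARGON2ID_WITH_TIMING = 10.0

no_side_channels = ADVANTAGE["argon2i_3pass"] / ADVANTAGE["argon2id_1pass"]
with_side_channels = ARGON2ID_WITH_TIMING / ADVANTAGE["argon2i_3pass"]

print(f"Without side channels, Argon2id wins by ~{no_side_channels:.1f}x")
print(f"With side channels, Argon2i wins by ~{with_side_channels:.1f}x")
```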

    2. 5

      Let’s do the back-of-the-napkin math. The password was longer than 20 characters. So, as a worst-case analysis, suppose the password length was 21 characters, and suppose he only used the English alphabet, a-zA-Z, so that’s 52 symbols (case sensitive). Let’s ignore punctuation and numbers. That’s 1.08*10^36 possibilities, an enormous number. If we take the binary logarithm of this number, we get about 119.7 bits. That is not realistically brute-forceable: even rooms and rooms full of better-than-state-of-the-art machines would need far longer than the age of the universe to search that keyspace.

      Now, of course you could say that it’s a reduction in complexity if he chose a password like CorrectHorseBatteryStaple (and so on), because then you just need to try dictionary words. But he mentioned that he used numbers and punctuation, so I guess that would at least partially make up for it. Either way, at this length, that’s a tough nut to crack.
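
      The napkin math above can be checked in a couple of lines (a sketch; 21 and 52 come straight from the worst-case assumptions in that paragraph):

```python
import math

# Worst-case assumptions from above: 21 characters, each drawn
# uniformly from a-zA-Z (52 case-sensitive symbols).
length, symbols = 21, 52

keyspace = symbols ** length                 # roughly 1.08 * 10**36
entropy_bits = length * math.log2(symbols)   # about 119.7 bits

print(f"keyspace ~ {keyspace:.2e}")
print(f"entropy  ~ {entropy_bits:.1f} bits")
```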

      So either:

      • The password leaked somehow. He wrote it down, he had it saved on an unencrypted machine, he had a government trojan, etc.
      • He chose a weak, predictable password containing familiar words, names, dates of birth, etc.

      These options are much more likely than the government magically being able to crack almost 120 bits of key material.

      1. 2

        Now, of course you could say that it’s a reduction in complexity if he chose a password like CorrectHorseBatteryStaple (and so on), because then you just need to try dictionary words.

        Well, yeah. I think that’s clear. Anyone who’s using a 20+ character password is obviously not choosing it at random.

        he mentioned that he used numbers and punctuation, so I guess that would at least partially make up for it.

        These can be predictable too. In the US you’re more likely to find ! and ?, typically towards the end of passwords. In some countries it’s more common to see *. Any non-random password is going to be biased by patterns.

        Smart crackers take advantage of this (hence why I even know about these patterns, having spoken to professional crackers).

        He chose a weak, predictable password containing familiar words, names, dates of birth, etc.

        I guess it depends on what “weak” and “predictable” are. We don’t know how much compute time was spent on this. All we know is that it was practical for a government attacker.

        1. 8

          Anyone who’s using a 20+ character password is obviously not choosing it at random

          I have to disappoint you here. I do.

          1. 2

            Well, I hope you can at least understand that that is extremely uncommon.

    3. 4

      The obvious fix would be to upgrade the key derivation function on the first boot after upgrading LUKS. But that’s so obvious that I feel there must be something deeply wrong about it. I wonder what.

      Perhaps that if it’s a removable device, the user may want to retain the old bad function for compatibility with older devices… but if that’s it, then it’s a small tail wagging a big dog.

      1. 3

        Seems like you could come up with a scheme that prompts for the upgrade on boot right after entering your passphrase, but with a timeout defaulting to no. It would at least raise awareness, and could be done without booting from external media for a LUKS 1->2 upgrade.

        1. 2

          yeah this is one of those ecosystem health things that probably just hasn’t happened because nobody is taking ownership of it

          this one at least falls clearly into the Linux distro’s area of responsibility, although it would be helpful for the LUKS people to issue a recommendation or something

          in general though LUKS tends to navigate conflicts among their highly opinionated userbase by pushing every important choice onto the end user. this is what a lot of older free software projects have traditionally done, so it’s understandable, and certainly I prefer it over turning everything into a tightly integrated single vision that can’t be customized, which is the contrasting approach.

          anyway! you’ve inspired me to think about what can be done to improve this situation in NixOS, since that’s the distro I’m involved with. every distro handles this sort of thing in its own way; on Nix there’s a bunch of shell scripts that run in the stage1 boot environment for LUKS stuff.

        2. 2

          You should be able to add a key slot with the new KDF, verify it, and then erase the old slot on the fly. So yeah, your proposal makes a lot of sense. Not so much for a LUKS 1->2 upgrade, but even that could happen within the initrd.
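
          A sketch of that add-verify-erase flow with cryptsetup on LUKS2 (/dev/sdXn and the slot numbers are placeholders, not your actual device; check luksDump output on your own system first):

```shell
# See which keyslot holds your passphrase and what PBKDF it uses.
cryptsetup luksDump /dev/sdXn

# Add a new keyslot (the same passphrase is fine) using argon2id.
cryptsetup luksAddKey --pbkdf argon2id /dev/sdXn

# Verify the passphrase unlocks via the new slot (assumed here to be 1)...
cryptsetup open --test-passphrase --key-slot 1 /dev/sdXn

# ...then erase the old slot (assumed here to be 0).
cryptsetup luksKillSlot /dev/sdXn 0
```

          Recent cryptsetup versions also offer luksConvertKey to change a keyslot’s PBKDF in one step.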

          1. 1

            I think that’s the best approach. I haven’t tried it myself yet, however.

    4. [Comment removed by author]