Based on my understanding of the subject, key commitment is only necessary for those AE (Authenticated Encryption) or AEAD (Authenticated Encryption with Additional Data) algorithms that use Poly1305 or a similar polynomial MAC (like GCM's GHASH).

However, if one uses an HMAC (or something like BLAKE3) for the authentication tag, then this commitment is unnecessary. (Or so I understand at the moment…)

The reason – and this is something the article failed to introduce – is that for some AE/AEAD algorithms (like those based on Poly1305 mentioned above), there is an easy way to compute other decryption keys that still produce a correct authentication tag, so decrypting under such a key yields garbage data that nonetheless authenticates.

Thus the whole issue boils down to this: given a message (with encrypted data and an authentication tag) and a decryption key, we can't be "sure" (in the cryptographic sense) that this particular key was actually used to encrypt the message.
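To make this concrete, here is a minimal sketch (my own illustration, not a vetted scheme) of the usual mitigation: attach a hash-based commitment to the (key, nonce) pair next to the ciphertext, and verify it before trusting the AEAD tag. A second key can't pass this check without a SHA-256 collision, regardless of how malleable the Poly1305/GHASH tag is.

```python
import hashlib
import hmac

def commit_to_key(key: bytes, nonce: bytes) -> bytes:
    # Derive a commitment that binds this (key, nonce) pair. The domain
    # separation label keeps this hash distinct from other uses of the key.
    return hashlib.sha256(b"key-commitment|" + key + b"|" + nonce).digest()

def verify_commitment(key: bytes, nonce: bytes, commitment: bytes) -> bool:
    # Constant-time comparison, so the check itself leaks nothing.
    return hmac.compare_digest(commit_to_key(key, nonce), commitment)

# The sender ships the commitment alongside ciphertext + tag; the receiver
# checks it *before* acting on the AEAD's own authentication result.
key_a, key_b = b"A" * 32, b"B" * 32
nonce = b"N" * 24
c = commit_to_key(key_a, nonce)
assert verify_commitment(key_a, nonce, c)
assert not verify_commitment(key_b, nonce, c)  # a different key fails
```

This costs one extra hash per message plus 32 bytes on the wire, which is why it is attractive as a bolt-on fix for schemes you can't modify internally.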

You are right that using something like HMAC (which is also a PRF suitable for the constructions mentioned in the post) would alleviate this problem. However, then you either 1) have a non-affected AE(AD) and don't worry about this, or 2) make a new Encrypt-then-MAC construction (or other AE) which doesn't suffer from this.
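A sketch of option 2, Encrypt-then-MAC with HMAC. The stream cipher below is a deliberately toy SHA-256-counter keystream just to keep the example self-contained and runnable; a real design would use (X)ChaCha20 there. The point is in the tag: because HMAC is a PRF of the whole MAC key, producing a valid tag under a second key would mean breaking HMAC itself, which is why this shape is key-committing where Poly1305/GHASH tags are not.

```python
import hashlib
import hmac

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256-in-counter-mode keystream, purely for illustration.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, nonce: bytes, plaintext: bytes):
    ciphertext = bytes(p ^ k for p, k in
                       zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    # MAC covers the nonce and the ciphertext (Encrypt-then-MAC order).
    tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ciphertext: bytes, tag: bytes):
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("bad tag")  # reject before touching the plaintext
    return bytes(c ^ k for c, k in
                 zip(ciphertext, keystream(enc_key, nonce, len(ciphertext))))
```

Note the independent `enc_key` / `mac_key`: in practice both would be derived from one master key with a KDF, which is also where the key commitment comes from.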

Thanks for the feedback though! I probably should've added a small intro on the problem. I sort of assumed that anyone using information from this post already knew the problem and the fact that they needed to solve it.

So then, I'm asking out of curiosity as I'm not a cryptographer: why do we bother in general with constructs that have so many pitfalls?

If we already know that we can construct something like XChaCha-BLAKE3, why don't we just create that, validate it, implement it in a few languages, and push people towards it?

I understand that BLAKE3 might be overkill in terms of performance compared with Poly1305, but in 99% of cases performance isn't the issue, because 99% of the people using these are non-cryptographers who will most likely just fall into those pitfalls…

One example is when you can't choose which AE/AEAD scheme you use. In some contexts, such as .NET, you're relatively restricted in which AEADs are available (I think only AES-GCM and ChaCha20-Poly1305). With these, you can't simply go ahead and modify the underlying implementation directly, so something that can be put on top is easier to deal with than reimplementing the whole scheme.

Yes, you can make Encrypt-then-MAC XChaCha20-BLAKE3 work as an AE, but if, for example, you want to use it as an AEAD, then you also need a canonical way to feed the ciphertext and AD into the MAC (as one of the comments at the bottom of the article you linked also mentions).
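The canonical-encoding issue is easy to show. If you just concatenate AD and ciphertext before MACing, the boundary between them is ambiguous; the standard fix (used here as an illustrative sketch, with fixed-width 8-byte big-endian length prefixes as my arbitrary choice) makes every (AD, ciphertext) pair map to a unique byte string:

```python
import hashlib
import hmac

def canonical_mac_input(ad: bytes, ciphertext: bytes) -> bytes:
    # Length-prefix each field so no (ad, ciphertext) pair can be re-sliced
    # into a different pair that produces the same concatenated bytes.
    return (len(ad).to_bytes(8, "big") + ad +
            len(ciphertext).to_bytes(8, "big") + ciphertext)

def aead_tag(mac_key: bytes, ad: bytes, ciphertext: bytes) -> bytes:
    return hmac.new(mac_key, canonical_mac_input(ad, ciphertext),
                    hashlib.sha256).digest()

# Naive concatenation collides: b"ab" + b"c" == b"a" + b"bc".
# With length prefixes, the two encodings (and hence the tags) differ.
assert canonical_mac_input(b"ab", b"c") != canonical_mac_input(b"a", b"bc")
```

Without this step, an attacker could shift bytes between the AD and the ciphertext while keeping the tag valid, which defeats the point of authenticating the AD at all.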
