1. 12
    1. 7

      This article makes a common mistake: the American term PII has a much more restricted meaning than the European term personal data, so mixing them up is a warning sign. I think this article might be working in the intersection, but at one point it worries about data that is “personally identifying”, whereas the relevant test in European privacy law (for decades before the GDPR; I was taught this at university) is whether the data can be linked to an individual.

      But regardless of the law, how good is this setup at actually deleting data? The plaintext commitment uses HMAC-SHA512 to identify the plaintext without revealing it, and Argon2id to make it hard to guess by brute force. But if someone already knows the plaintext (it used to be public), then even after the encryption key has been shredded they can still prove it was recorded in the public key directory. I dunno how much of a problem that might be.
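      The issue above can be sketched in a few lines of Python. This is a hypothetical reconstruction of the commitment described in the article, not its actual code: the names, parameters, and salt handling are assumptions, and scrypt stands in for Argon2id because it ships in the Python standard library.

```python
import hashlib
import hmac
import os

def commitment(plaintext: bytes, salt: bytes) -> bytes:
    # Stretch the plaintext with a slow, memory-hard KDF so low-entropy
    # plaintexts can't be cheaply brute-forced, then HMAC the result so
    # the published value reveals nothing on its own.
    # scrypt is a stand-in for the article's Argon2id; parameters are
    # illustrative only.
    stretched = hashlib.scrypt(plaintext, salt=salt,
                               n=2**14, r=8, p=1, maxmem=2**26)
    return hmac.new(salt, stretched, hashlib.sha512).digest()

# The directory publishes (salt, commitment); the record itself is
# encrypted under a separate key that is later shredded to "delete" it.
salt = os.urandom(16)
published = commitment(b"alice@example.com", salt)

# The problem: shredding the encryption key destroys the ciphertext, but
# anyone who already knows the plaintext (it used to be public) can
# recompute the commitment from the public salt and prove the record was
# once in the directory.
assert commitment(b"alice@example.com", salt) == published
```

      The commitment itself is never deleted, so "knows the plaintext" is exactly the attacker capability the shredding scheme cannot defend against.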

      1. 2

        There’s no clear legal precedent for this, so we’re left to speculate.

        I don’t think the correct test is “it used to be public”. Google search results used to be public too.

        I think it’s: “Can you, with no knowledge of the plaintext or key, deduce it?” The HMAC alone isn’t unique enough for a definitive confirmation; you’d need the full Argon2 hash to match too.
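        That test can be illustrated with a toy dictionary attack. This is a hedged sketch under the same assumptions as the scheme described upthread (scrypt standing in for Argon2id, illustrative parameters and names): with no prior knowledge of the plaintext, every candidate guess costs one full pass of the slow KDF, which is what makes deduction expensive.

```python
import hashlib
import hmac
import os

def commitment(plaintext: bytes, salt: bytes) -> bytes:
    # Slow KDF (scrypt as an Argon2id stand-in) followed by HMAC-SHA512.
    stretched = hashlib.scrypt(plaintext, salt=salt,
                               n=2**14, r=8, p=1, maxmem=2**26)
    return hmac.new(salt, stretched, hashlib.sha512).digest()

salt = os.urandom(16)
published = commitment(b"alice@example.com", salt)

def brute_force(candidates, salt, published):
    # Each guess pays the full KDF cost before the comparison. If the
    # commitment were a bare fast hash, an attacker could test millions
    # of candidates per second instead.
    for guess in candidates:
        if hmac.compare_digest(commitment(guess, salt), published):
            return guess
    return None

guesses = [b"bob@example.com", b"carol@example.com", b"alice@example.com"]
assert brute_force(guesses, salt, published) == b"alice@example.com"
```

        So the KDF rate-limits guessing, but it does nothing against someone who already holds the exact plaintext.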

        Whether this works or not (or whether this is “better” in a world where courts rule that public keys are personal data), I can’t say. I don’t think the author can, either.

        1. 3

          Note that (as I said) I’m not trying to determine what the law says about this particular design. I’m just identifying an issue that was not mentioned in the article. If someone is asking for their data to be deleted then you should make any caveats clear, on the straightforward basis of being honest with your users.

          1. 1

            If someone is asking for their data to be deleted then you should make any caveats clear, on the straightforward basis of being honest with your users.

            Agreed 100%