1. 11
  1. 47

    Look, I have no love for Microsoft and I take security very seriously, but this is overblown, self-aggrandizing nonsense.

    A JWT is an industry-standard form of bearer token, and bearer tokens are not treated as any more secure than a browser session. What this and the original article elide is that JWTs are generated with very short TTLs and must be refreshed at regular intervals. The refresh token usually has a longer TTL and should probably be encrypted at rest, but any org as sophisticated as MS should be using adaptive systems that force reauthentication if anything suspicious happens.
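
    To illustrate the short-TTL point, here’s a minimal sketch (plain Node, no libraries; the token below is a made-up placeholder, not a real MS token) of checking a JWT’s `exp` claim to decide when to refresh:

    ```javascript
    // Decode a JWT's payload (the base64url-encoded middle segment) and
    // report seconds until its `exp` claim. No signature verification here;
    // that's the server's job.
    function secondsUntilExpiry(jwt) {
      const payload = JSON.parse(
        Buffer.from(jwt.split('.')[1], 'base64url').toString('utf8')
      );
      return payload.exp - Math.floor(Date.now() / 1000);
    }

    // Build a throwaway token whose payload expires in 5 minutes (a typical
    // short-lived access-token TTL); header and signature are placeholders.
    const body = Buffer.from(
      JSON.stringify({ sub: 'user', exp: Math.floor(Date.now() / 1000) + 300 })
    ).toString('base64url');
    const token = `e30.${body}.sig`;

    const ttl = secondsUntilExpiry(token);
    console.log(ttl > 0 ? `access token expires in ${ttl}s` : 'expired: refresh or reauthenticate');
    ```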

    It’s also worth noting that the point about Electron not having crypto capabilities is complete nonsense. The embedded Chromium has SubtleCrypto, and the Node.js side can use libraries like libnacl. This was a hatchet job.
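
    For what it’s worth, here’s a minimal AES-GCM sketch using SubtleCrypto; it runs unchanged in Electron’s embedded Chromium or in recent Node (19+, where `globalThis.crypto` is available). A real token store would of course also need a safe place to keep the key:

    ```javascript
    // AES-GCM round trip with SubtleCrypto, which both the embedded
    // Chromium and recent Node expose as globalThis.crypto.subtle.
    const { subtle } = globalThis.crypto;

    async function encrypt(plaintext, key) {
      const iv = globalThis.crypto.getRandomValues(new Uint8Array(12)); // unique per message
      const ciphertext = await subtle.encrypt(
        { name: 'AES-GCM', iv },
        key,
        new TextEncoder().encode(plaintext)
      );
      return { iv, ciphertext };
    }

    async function decrypt({ iv, ciphertext }, key) {
      const plain = await subtle.decrypt({ name: 'AES-GCM', iv }, key, ciphertext);
      return new TextDecoder().decode(plain);
    }

    (async () => {
      const key = await subtle.generateKey(
        { name: 'AES-GCM', length: 256 },
        false,               // non-extractable: the raw key bytes can't be exported
        ['encrypt', 'decrypt']
      );
      const box = await encrypt('refresh-token-placeholder', key);
      console.log(await decrypt(box, key)); // round-trips to the original string
    })();
    ```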

    Bottom line: this requires someone to root your machine and be able to inspect your running software. You’re completely boned if that’s the case, and your Teams session is the least of your worries.

    1. 10

      Signal Desktop similarly stores its auth token in plaintext: https://www.bleepingcomputer.com/news/security/signal-desktop-leaves-message-decryption-key-in-plain-sight/

      The response from Signal was:

      The core premise of the article is completely mistaken. The database key was never intended to be a secret. At-rest encryption is not something that Signal Desktop is currently trying to provide or has ever claimed to provide. Full-disk encryption can be enabled at the OS level on most desktop platforms.

      1. 2

        Yikes. Disk encryption covers the “dude swiped my laptop” attack vector but not the malicious npm package (or whatever) attack vector. Isn’t this terrifyingly short-sighted of Signal?

        1. 8

          What would you propose as a fix for the problem? Whatever you can come up with: as long as the key is stored somewhere, it’s available for malware to get. Is it in the OS keychain? Inject code into the Signal binary and query for it. Is it on disk? Read it from there. Is it encrypted on disk? Take the key from the Signal binary and decrypt it. Is it in the memory of the Signal app? Take it from there.

          Whatever you come up with can perhaps be classified as “defense in depth”, but there’s nothing (short of requiring manual authentication whenever the app makes a network request or accesses stored data) that can be done to protect secrets in the face of malware.

          1. 3

            I don’t know about Windows and Linux, but on macOS keychain material is encrypted by the SEP, and access to the data requires user authentication and, if configured correctly, requires it on a case-by-case basis.

            By “requires” I mean it is not possible for any software at any privilege level to bypass it.

            1. 2

              I understand that there doesn’t exist perfect security in the face of arbitrary malware but we have OS key stores for good reason.

              If I told someone it would be extremely trivial to write a malicious npm package that stole all of their Signal messages, most people would be very surprised and some would perhaps be less likely to use Signal Desktop for very sensitive conversations. (There is no analogous attack on iOS, right?)

          2. 0

            Welcome to the dumpster fire that is Electron

            1. 9

              This isn’t related to Electron at all.

              1. 3

                yeah, especially given Electron provides what looks to be a fairly trivial API for secure storage (`safeStorage`, which defers to the OS credential store).

                There are lots of things that make Electron apps a bad experience, but this is not one of them.

                1. 1

                  While I agree that Electron isn’t to blame, I will say in my experience Electron apps for networked applications rarely seem to use a proper secure storage system.

                  For accessibility purposes I use a hacked-together terminal Slack client for most of my Slack usage. Originally I followed the advice of most 3rd-party Slack clients on how to get a better token to use with them, but realized: why bother, when I can just write something that constantly scrapes the various IndexedDB and Session/Local Storage databases?

                  I have a script that runs, finds my Slack workspaces’ tokens, validates them, then shoves them into a secret store (org.freedesktop.secrets) and sends my Slack client a signal to reload the secrets from the secret store. I do run the official client for audio calls frequently enough that my local creds stay refreshed.

                  I’ve lost track of how many networked Electron apps I’ve encountered where I’ve been able to abuse unencrypted local storage to obtain API credentials for scripting purposes.

                  This seems to be a side effect of how many of these apps are fairly simple wrappers around their web versions: they don’t do the due diligence of securing that data because they’re used to the browser being in charge of protecting it.

                  1. 1

                    While I agree that Electron isn’t to blame, I will say in my experience Electron apps for networked applications rarely seem to use a proper secure storage system.

                    Yeah, I’d agree here. But I feel that a lot of electron apps half ass pretty much anything that isn’t absolutely core to the app.

                    This seems to be a side effect of how many of these apps are fairly simple wrappers around their web versions: they don’t do the due diligence of securing that data because they’re used to the browser being in charge of protecting it.

                    Yeah, many seem like low-effort “we made an App!” releases that are just a multi-hundred-meg wrapper around a web page, without doing any of the work an actual browser (even Chrome) does to protect user data and privacy.

          3. 5

            Given that you don’t want your users to have to reauthenticate on each app startup, what would be a solution that provides safety in the presence of malware running on the machine?

            1. 4

              I’m having flashbacks to this issue happening with browser password managers (people had the same hesitation). In general you may want a process akin to gnome-keyring to manage, e.g., a decryption key for this sensitive content.

              Now that can turn into a big back-and-forth around threat models, a false sense of security, and more. There are articles surrounding this dating back to, I think, 2015. There are also tools showing how to circumvent this, though I think it varies by platform (it appears that on Windows it just stores the key somewhere, while on other OSes it brokers through, e.g., gnome-keyring): https://ohyicong.medium.com/how-to-hack-chrome-password-with-python-1bedc167be3d

              1. 2

                so in the end, you’re talking about some defense-in-depth best practice, but this is by no means a “glaring issue”; or if it is, then it is one that every single application running on a desktop is prone to.

                In my opinion, there is nothing an application can do to protect credentials in light of malware running on the same machine.

              2. 1

                Pretty sure macOS has something like iOS’s Keychain, which makes it harder for rando apps to get your data. The idea is that it’s an enclave you can use to store data: you’d need to exploit the app holding the key in order to get it.

                1. 1

                  See my follow-up question to a sibling post of yours: how is the keychain protected against malware injecting itself into the target binary and then just querying the keychain?

                  1. 2

                    macOS Ventura will verify an application’s signature on every launch, rather than only when the quarantine bit is set.

                    However, ideally, an application wouldn’t just use a token, but ask the Secure Enclave to sign something as a proof that the login is coming from an authorized machine, since the private signing key cannot be queried from software.
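
                    A sketch of that challenge-response flow in plain Node crypto; note this is an illustration only, with a software P-256 key standing in for the enclave key (in the real flow the private key is generated inside the SEP and is never exportable; only signing requests cross the boundary):

                    ```javascript
                    const { generateKeyPairSync, sign, verify, randomBytes } = require('crypto');

                    // Stand-in for an enclave-resident keypair: here it's a software key,
                    // but with a Secure Enclave key only the sign operation would be
                    // available to software; the private bytes could never be read out.
                    const { publicKey, privateKey } = generateKeyPairSync('ec', {
                      namedCurve: 'prime256v1', // P-256
                    });

                    // Server: issue a fresh random challenge for this login attempt.
                    const challenge = randomBytes(32);

                    // Client: ask the "enclave" to sign the challenge.
                    const signature = sign('sha256', challenge, privateKey);

                    // Server: verify against the public key enrolled when the device
                    // first logged in. A stolen bearer token alone can't produce this.
                    const ok = verify('sha256', challenge, publicKey, signature);
                    console.log(ok ? 'request came from the enrolled machine' : 'reject');
                    ```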

                    1. 1

                      I’m not talking about altering the binary on disk. I’m talking about injecting malicious code into the JIT-compiled output that the Electron app has written into memory. That can’t be signature-checked and yet is still allowed to run. That malicious bit of code can then do whatever the application can do, including having something signed for it.

                      1. 1

                        IIRC macOS has some restrictions on attaching to processes for debugging beyond the standard Unix “anything running as the same user is fair game” policy, so I wouldn’t be surprised if something prevents this; but I couldn’t find any details with a quick Google so I’m not at all sure.

                2. 1

                  Keyrings are what you’d usually use. There are implementations on all major platforms, and they’re tied to the system’s login.

                  1. 1

                    How is an OS-level keychain protected from malware injecting its code into the target binary’s memory? AFAIK we’re not quite there yet, code-signature-wise, to detect something like this, especially in apps which normally do JIT compilation (which would be true for an Electron app).

                    1. 1

                      Yeah, not much is going to protect against malware injection, but storing secrets in plaintext is a very different problem. Encryption at rest is a must for secrets, even if it’s at the whole-filesystem level, and a keyring is a good middle-ground solution.

                  2. 1

                    Interestingly, the modern .NET MAUI API for this problem lets you get and set secure keys without relying on any “master” secrets, deferring to Keychain on Apple devices. It doesn’t go into vast detail about how it works on Windows and I would love to know what it’s actually promising in terms of key storage (TPMs?) and inter-app isolation, given that this is a hard problem.

                  3. 3

                    I think it’s funnier that MS Outlook requires OAuth now, email clients are supposed to be registered with MS on Azure, and then secrets are handed out while embedding them in client programs is encouraged, so now everyone just uses Thunderbird’s client ID and secret.