1. 53

  2. 13

    Like I recommended for Mozilla, they could use this money to acquire and/or create highly-usable alternatives to many products that handle communications or data in a trustworthy way. Things like SpiderOak, VPNs, backup apps for iOS/Android, HSMs, payment services, paid email… anything where unscrupulous businesses have been a problem with insecure solutions or by just cheating customers. Each one is turned into a commercial product, either shared-source or dual-licensed GPL/AGPL. The money coming in funds both new features and 3rd-party review for each. Release the code as GPL, either at the component level as pieces are built or as a whole once the development cost has been paid off. Many businesses will still pay for GPL-licensed code just to know someone is responsible for it.

    Such models will gradually increase the number of trustworthy goods available through trustworthy suppliers over time. That’s the basic concept anyway.

    1. 7

      with this extra $50 million they’ll surely have the resources to support federating their servers. let’s see it, moxie.

      1. 3

        I hope so too, but Moxie Marlinspike has voiced quite principled concerns against federation before: https://signal.org/blog/the-ecosystem-is-moving/

        1. 5

          That was the day I realised I had to boycott Signal too

          1. 2

            Generally I don’t like “me too” posts, but in this case, me too! This is unacceptable. Using phone numbers as sole user IDs is also unacceptable in my book.

            1. 1

              What is wrong with using phone numbers as ids? Was it wrong 50 years ago?

              1. 4

                First, I don’t want to give my phone number to strangers. I am okay with giving my e-mail address (or some other kind of token) to strangers.

                Second, at least for myself, e-mail addresses are eternal, while phone numbers are very ephemeral. Especially if you travel or move a lot.

                Third, Signal doesn’t just depend on your phone number, it somehow depends on your SIM card (not sure of tech details). You can’t change your SIM card and continue to use Signal smoothly. For me this is a blocker. It means I can’t use Signal even for testing purposes, as I switch SIM cards often.

                Apple iMessage gets this right. You can have any number of ids, including phone numbers or e-mails. I am identified by either one of those. I can be contacted by people who have either in their address book. And I can switch my SIM card any time I want. Of course, iMessage is not equivalent to Signal, nor is iMessage a good example to follow apart from the UX.

                Also I must add a fourth point about Signal. Until relatively recently there was no way to use it on a real computer. Now there’s an Electron application, which to me still means there is no way to use it on a real computer. I do not know if 3rd parties can implement real native desktop applications or not, but there are no such applications today.

                1. 3

                  “Third, Signal doesn’t just depend on your phone number, it somehow depends on your SIM card (not sure of tech details). You can’t change your SIM card and continue to use Signal smoothly. For me this is a blocker. It means I can’t use Signal even for testing purposes, as I switch SIM cards often.”

                  I have a burner phone that was initially set up with a throw-away prepaid SIM. After doing the initial setup (including with Signal), I threw away the SIM and put the phone in airplane mode. The phone now sits behind a fully Tor-ified wireless network. Signal’s still working fine.

                  Maybe if I were to put in a new SIM card, Signal might go crazy.

                  And since this is a burner phone that sits behind Tor with a number that’s meant to be public, here it is: +1 443-546-8752. :)

                  1. 2

                    “I have a burner phone”

                    You can’t legally acquire a pre-paid SIM in the European Union without registering it against your ID. They did it to ‘thwart terrorism’.

                    1. 1

                      Interesting. They give out pre-paid SIMs as promotions on the street here in Sweden, or at least they used to. Maybe the ID check comes at the first top-up.

                      1. 1

                        In the past you were able to obtain them anonymously.

                        They still give them away like candy, but they won’t operate unless you register them by providing your ID to the operator. Though I’m speaking about Poland - I don’t know how other countries have regulated this.

                        1. 1

                          I see. I don’t know if it’s a specific EU-related law / regulation or whether each country has their own rules.

                          1. 2

                            “Some EU countries have regulations limiting the possibility to purchase prepaid cards to the stationary shops of telecommunications operators. Such solutions have been adopted i.a. in Germany, United Kingdom, Spain, Bulgaria and Hungary. Obligation to collect data concerning subscribers who use telecommunications services can be found i.a. in the German law.”

                            source: http://krakowexpats.pl/utilities/mandatory-registration-of-prepaid-sim-cards/

                            Funny, I thought it was an EU-wide law. Regardless, it’s still very annoying that Signal has no other means of creating an ID. I don’t really want to give my mobile number to everyone, and there is no way to use Signal anonymously in countries that do mandate SIM registration.

                      2. 1

                        What that would do is create a black market for pre-paid SIMs, where you have a single entity registering tons of SIMs and reselling them pre-activated.

                        1. 2

                          That is what is happening on the street: criminals approach drunkards etc. to register SIMs in their names, then resell or use them themselves.

                          Point is, for a regular person there is no legal way to obtain an anonymous SIM. Creating a legal entity that registers SIMs for resale is also not possible. This means that Signal can’t be used anonymously if you want to stay on the legal side.

                          1. 2

                            Completely agreed. It’s unfortunate to see such silly laws that are so easy to skirt around. All they do is make people who would otherwise be honest and trustworthy break the law.

                      3. 1

                        At least in my country, phone numbers can, and eventually will, be reallocated when not in use for several years. So aren’t you running a small risk that someone else might register ‘your’ number with Signal in a few years?

                      4. 1

                        I cornered a Signal dev at a networking event in December and emphasized how much I also want to communicate via Signal without giving out my phone number. They were aware of how much demand there is for that feature but politely declined to make a public commitment - at the time they were maybe 8 devs and their capacity was limited. Hopefully they’ll be able to expand and address the feature.

                        1. 1

                          A me-too-style reply to what @lattera said.

                          A friend lives abroad and got a local SIM on a visit once. When he went back, he discarded the SIM, unbeknownst to me.

                          When I heard he might be visiting again, I sent a Signal message asking if it still worked. To our surprise, it did.

                          So this myth needs to be busted.

                          It may have a bug, though, as I sent a Signal message to another friend and got a reply from a foreign phone number. He told me it was the number of a SIM he had used on a business trip.

                          That’s a different issue someone else can hunt down, but Signal is more anonymous than e.g. Bitcoin as it stands today.

                  2. 1

                    this is so dumb

                    • it’s a messaging app… how much can people’s expectations evolve? how have they evolved since signal’s inception?
                    • the cost of switching between services is low only for services that already have mass adoption. if moxie started fucking around with the protocol and people weren’t having it, network effects mean there would be no alternative (whatsapp and facebook messenger are not alternatives)
                2. 2

                  So that’s a $50 million donation from a co-founder of WhatsApp? Neat.

                  1. 2

                    I’m gonna be lazy and repost my comment from HN:

                    This is a bit of a tangent, but I first heard of Intel SGX (Software Guard Extensions) via Signal’s blog post about private contact discovery, so it’s almost relevant 😛 From what I’ve read [1][2][3], Intel SGX is vulnerable to Spectre exploits. Does anyone know if this has changed Signal’s approach to security at all? Granted, contact discovery was a technology preview, but I’m curious if SGX is still considered a feasible, “good enough” security measure for the Signal Foundation.

                    1. 3

                      You could say the same thing about it running on smartphones or desktops in general instead of security-focused appliances. It intentionally takes on large risks of attack to create similarly large increases in usability and integration with popular systems, with the goal of improving the baseline. It comes with the territory. In that case, they just gotta deal with each vulnerability or weakness as it comes up, since vulnerabilities will definitely come up a lot more on platforms not designed for security.

                      SGX would seem to be an example. The CompSci literature already didn’t trust it much when it came out, with the herd mostly making replacements or alternatives for it. I don’t trust it. Even if it were a good model, Intel’s hardware abstractions are very leaky due to high complexity. I don’t think that will be fixed in high-performance or cheap chips any time soon.

                      1. 3

                        The alternative to using SGX attestation for this is doing nothing, so to remain viable it only needs to be better than pinky-promising not to retain contact data on the server.
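
                        For anyone wondering what “SGX attestation for this” actually buys: here’s a toy sketch of the flow. All the names are made up for illustration (this is not the SGX SDK or Signal’s service API), and the HMAC stands in for what is really an asymmetric signature checked against Intel’s attestation public key:

                        ```python
                        # Toy model of attested contact discovery. Hypothetical names only;
                        # the HMAC below stands in for a CPU-signed quote.
                        import hashlib
                        import hmac
                        import secrets

                        INTEL_ROOT_KEY = secrets.token_bytes(32)  # stand-in for Intel's attestation key
                        ENCLAVE_CODE = b"contact-discovery-enclave-v1"
                        EXPECTED_MEASUREMENT = hashlib.sha256(ENCLAVE_CODE).hexdigest()

                        def quote(measurement, session_key):
                            # "An enclave with this measurement holds this session key."
                            msg = f"{measurement}:{session_key.hex()}".encode()
                            return hmac.new(INTEL_ROOT_KEY, msg, hashlib.sha256).digest()

                        def client_verifies(q, measurement, session_key):
                            # The client checks the quote AND that the measurement matches
                            # enclave code it (or an auditor) has reviewed.
                            msg = f"{measurement}:{session_key.hex()}".encode()
                            expected = hmac.new(INTEL_ROOT_KEY, msg, hashlib.sha256).digest()
                            return hmac.compare_digest(q, expected) and measurement == EXPECTED_MEASUREMENT

                        def sha(number):
                            return hashlib.sha256(number.encode()).hexdigest()

                        # Enclave side: ephemeral session key; registered users live only in enclave memory.
                        session_key = secrets.token_bytes(32)
                        registered = {sha(n) for n in ["+15551234567", "+15557654321"]}
                        q = quote(EXPECTED_MEASUREMENT, session_key)

                        # Client side: only after verifying the quote does it send its (hashed) contacts,
                        # encrypted to the session key; the intersection is shown directly here.
                        if client_verifies(q, EXPECTED_MEASUREMENT, session_key):
                            mine = {sha(n) for n in ["+15551234567", "+15550000000"]}
                            print(len(mine & registered), "contact(s) already on the service")
                        ```

                        Whether that is meaningfully better than the pinky promise comes down to whether you believe the quote can’t be forged and the enclave can’t be read out, which is exactly what the SGX skeptics in this thread dispute.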

                        1. 1

                          There are a lot of alternative techs that have been built for attestation, tamper-resistance, or a minimal TCB for specific code. Most predate SGX. If we include ARM-based ones, there are too many in development for me to even follow. There are reasons they might choose SGX over them, but there are definitely others.

                          1. 2

                            In terms of real, practical solutions that exist and can be deployed into production right now? Keeping in mind no one is going to replace their Xeons with anything else.

                            Sure, there are plenty of other, impractical solutions, such as requiring clients to download a multi-hundred-gigabyte encrypted bloom filter. They asked several years ago for ideas on how to solve this, and SGX was apparently the result. Hardware attestation isn’t a great solution in general, but it is better than the status quo of just doing nothing.

                            1. 1

                              how does that bloom filter work?

                              1. 3

                                https://signal.org/blog/contact-discovery/ is the post @jabberwock referred to, which goes through all the possibilities
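
                                For a rough idea of how the bloom filter option discussed there works: the server publishes a filter over every registered number, the client downloads the whole thing and tests its contacts locally, so the server never sees your address book. Here’s a minimal sketch with toy parameters (my own illustration, not anything Signal shipped); if I remember right, the multi-hundred-gigabyte figure upthread comes from the encrypted variant of this idea, but even the plain filter gets unreasonably big:

                                ```python
                                # Minimal bloom filter sketch for local contact checking.
                                # Toy parameters; not Signal's implementation.
                                import hashlib
                                import math

                                class BloomFilter:
                                    def __init__(self, num_bits, num_hashes):
                                        self.num_bits = num_bits
                                        self.num_hashes = num_hashes
                                        self.bits = bytearray(num_bits // 8 + 1)

                                    def _positions(self, item):
                                        # Derive k bit positions from salted SHA-256 digests.
                                        for i in range(self.num_hashes):
                                            d = hashlib.sha256(f"{i}:{item}".encode()).digest()
                                            yield int.from_bytes(d[:8], "big") % self.num_bits

                                    def add(self, item):
                                        for p in self._positions(item):
                                            self.bits[p // 8] |= 1 << (p % 8)

                                    def maybe_contains(self, item):
                                        # False positives are possible, false negatives are not.
                                        return all(self.bits[p // 8] & (1 << (p % 8))
                                                   for p in self._positions(item))

                                # Server: build the filter over all registered numbers; clients download it.
                                bf = BloomFilter(num_bits=10_000, num_hashes=7)
                                for number in ["+15551234567", "+15557654321"]:
                                    bf.add(number)

                                # Client: check contacts locally; the server never sees them.
                                for contact in ["+15551234567", "+15550000000"]:
                                    print(contact, "->", bf.maybe_contains(contact))

                                # The catch: bits needed = -n * ln(p) / (ln 2)^2, so a billion users
                                # at a one-in-a-million false-positive rate is already a few GiB.
                                n, p = 1_000_000_000, 1e-6
                                gib = -n * math.log(p) / (math.log(2) ** 2) / 8 / 2**30
                                print(f"~{gib:.1f} GiB for the raw filter")
                                ```

                                The post walks through the fancier private-query variants too, before landing on SGX.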

                              2. 1

                                “In terms of real, practical solutions that exist and can be deployed into production right now?”

                                I’d trust none of them if the requirement is that neither physical nor software attackers can see what’s in RAM. Intel CPUs aren’t designed for the threat model of good HSMs. SGX-based schemes have been broken or bypassed more than once, like everything else they’ve added to their complex, backward-compatible ISA. With such caveats, it offered marginal benefits over low-TCB predecessors like Flicker or DMA-protecting separation kernels on embedded boards. It’s a security obfuscation against some of its threat model rather than a real blocker to all of it.

                                I’d just assume I had to trust whoever set up or maintained the boxes. That’s what I have to trust using mobile OSes already, since most attacks on my Signal app will come from below (platform providers) or from the side (other apps). My trust model remains: having to trust multiple hardware providers, all privileged firmware, software developers on the platform side (esp. privileged code), OWS, and 3rd-party apps if side channels are a concern. And, even if I trust Moxie/OWS or SGX, most of these others are careless or malicious a lot of the time. It’s why my ultimate recommendation for smartphones is to treat them as untrustworthy, broadcasting devices as much as possible, with contacts that need to stay private kept out of any service.

                                So, at first glance, it’s nice he uses it as an extra layer of protection, but the platform-level tradeoffs negate its benefits with their attack surface. He can use it, or use something else with a low TCB or low odds of hacking… it doesn’t matter to me until I have client-side mitigation of 0-days in OS and firmware code, with confidentiality of storage at the least.

                                “Keeping in mind no one is going to replace their Xeons with anything else.”

                                There’s a significant market for ARM-based servers, including in the cloud. The ARM ISA also dominates most computing devices in tablets, mobile, and Linux/Android embedded. Billions are being poured into ARM-based solutions. I’m not saying that makes Xeon avoidable for a specific use case, just that the market numbers show it’s often not the only or even the preferable option.

                                And if you don’t use Xeons, you have various kinds of HSMs (which Apple used) with strong tamper-resistance to consider, or even EMSEC safes for ordinary boxes (you can have EEs build a faraday cage for a cheaper safe). The latter, with 3rd-party checking, can reduce physical and TEMPEST risks. If SGX is just obfuscation, then considering other options for obfuscation, from those all the way to unusual hardware, opens up more opportunities.

                                Although I don’t expect this stuff initially, leveraging some of the techniques for hardware or EMSEC makes a bit more sense after a hugely-profitable acquisition. ;) Alternatively, they could fund projects doing any of this, with lower unit prices resulting. That could’ve been a 1-2U rackmount installed at multiple sites in a clustered configuration. At similar or lower development cost, they might have addressed the non-availability of TEMPEST-certified servers by converting an existing computer safe into one with a faraday cage plus power filters. Each subsequent mod kit is basically the unit cost plus some profit. If using a cluster, the maintenance protocol can have computers switched off after secrets are nuked, so even opening the cage doesn’t leak secrets. One can go further with a safe-ish, thermite setup like Skunkworks’ if they’re inclined, but I’m keeping the recommendations more reasonable.

                                Lots of possibilities, both for obfuscations that trust OWS and for stronger security that doesn’t. More possibilities the more money and time you have. More possible attacks the more money and time they have. Fun field, ain’t it? ;)