The authors of the Signal messenger on federated protocols.
It seems like this came about after things like this. (Aside: holy shit, the sense of entitlement on that thread boggles my mind.)
hahaha. If you don’t let us use your servers, fewer people will use your servers! Take that, moxie! And spend more of your time doing things properly, the way I want.
moxie on #37 (comment), I expected you to welcome LibreSignal on OWS servers because you have nothing to lose but a lot to gain. What is at stake here is the opinion and support of a big community, something that can make the difference in the future Signal.
A big community! Literally dozens of us!
Nothing at all compared to the “1,000,000 - 5,000,000” downloads of the official Signal app on Google Play.
I’ve heard something similar from designer friends who’ve been told “we won’t pay you with money, we’ll pay you with exposure — it’ll be great!”
My favorite response is, “People die of exposure.”
I don’t think arguing about numbers is the right approach here. That there are few people using LibreSignal does not make interoperability a good goal.
There is an angle to this that strikes me as odd, given that this is the problem space of secure communications: if the client is open source for the sake of being able to verify that communication is end-to-end secure, Moxie’s stance that they want the ability to update the compiled client for all users of their servers has a smell to it. Asking users to trust that a binary APK matches the sources they provide is a big stretch when state actors could easily manipulate several parts of the source-to-binary pipeline. If Signal’s goal is verifiably end-to-end secure communication, I would think they would be encouraging folks to build from source, as a form of defence against the official client being compromised.
Making a deterministic build is one thing, but how many of its users are actually going to go through the trouble to verify that? I wouldn’t and I care about security and privacy. Signal accomplishes much more good as a simple easy-to-use tool that the masses can use and get strong encryption. If you make it hard to load onto your device, nobody is going to use it aside from the dozen people in that GitHub issue. Or worse, a security problem is found and lots of people using it don’t get a timely update because they all installed it themselves and don’t closely follow the changes made to the Git repo.
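The verification step being argued about here boils down to a byte-for-byte comparison: rebuild the app from source and check that your artifact hashes to the same value as the official one. A minimal sketch of that comparison (file paths are hypothetical; a real APK check also has to account for signing blocks):

```python
import hashlib

def digest(path: str) -> str:
    """SHA-256 of a file, streamed in chunks so large APKs don't load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_official(local_build: str, official_apk: str) -> bool:
    """A reproducible build verifies iff both digests agree byte-for-byte."""
    return digest(local_build) == digest(official_apk)
```

Which is exactly the point above: the check itself is trivial, but almost nobody will set up the toolchain to produce `local_build` in the first place.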
Heh. That goes a fair way towards explaining why deterministic builds are hard to make useful. :(
This is Tocqueville applied to open source: “That is why the desire for equality always becomes more insatiable as equality is greater”.
Making Signal open source is not enough; now they want to constrain Moxie in doing something he doesn’t want to do. As predicted by Tocqueville, this is the moment when the desire for equality gets in the way of liberty.
The sense of entitlement on LibreSignal’s side?
Yes. Moxie makes it clear they can use the code and run their own server, but not call it Signal to avoid confusing people and adding more work for the Signal developers. I think that’s very fair, and as it came from the main developer of the app, the people responsible for this “fork” should respect his wishes. Instead there are childish comments like:
Seriously, if OWS continues that way, even libreSignal will be dropped from my devices, and no other apps from this organization will ever come back.
And generally just a ton of pushback against Moxie: demands that Signal’s servers take on the load of these modified clients, that he change Signal to support federation, and that he integrate their hacks to work without Google frameworks. All of which Moxie has answered before, saying they are not interested in doing any of it.
If this was a closed-source application, none of this would even be an issue and Signal would still probably have as many users as it does. Instead, because Moxie and OWS were kind enough to open source it, you have all of these leeches coming to the surface demanding changes and releasing misleading forks of Signal with all kinds of hacks and possible security problems. As if somehow making a project open source requires that it become a “community” project and the original developers have no say in its direction anymore.
Thanks for clarifying. I fully agree.
I’ve specifically wanted to avoid linking to that thread as loud, entitled minorities in open-source software are neither new nor particularly interesting at this point.
The only thing that might be interesting is the allegations by Nadim Kobeissi, though those have already been debunked.
I feel like the author conflates federation/decentralization with multiple parties implementing a protocol. If WhisperSystems has control over all relevant implementations called “Signal” (enforced via trademark law?), the network could be as decentralized as you like without being stuck in the 90s.
Yep, my feeling as well. Various other bits hint at some disingenuousness, this paragraph in particular:
I thought about it. We got to the first production version of IP, and have been trying for the past 20 years to switch to a second production version of IP with limited success. We got to HTTP version 1.1 in 1997, and have been stuck there until now. Likewise, SMTP, IRC, DNS, XMPP, are all similarly frozen in time circa the late 1990s. To answer his question, that’s how far the internet got. It got to the late 90s.
Let’s see, IP’s “frozenness” is precisely what makes it possible for Signal to exist and function in the first place. It in no way holds Signal back from adding new bells and whistles.
Similarly, Signal still relies on DNS to function. It adds a layer of security on top of it by doing (effectively) TOFU (trust on first use) in an easy-to-use manner. That’s where 99% of Signal’s security comes from; the other 1% is the asynchronous ratchet.
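TOFU here just means: pin whatever key you see first for a contact, and only raise an alarm if it later changes. A toy sketch of that logic (the class and return values are made up for illustration; Signal surfaces a key change as a “safety number changed” warning):

```python
import hashlib

class TofuStore:
    """Pin the first public key seen per contact; flag any later change."""

    def __init__(self):
        self._pins = {}  # contact id -> key fingerprint

    def fingerprint(self, pubkey: bytes) -> str:
        return hashlib.sha256(pubkey).hexdigest()[:16]

    def check(self, contact: str, pubkey: bytes) -> str:
        fp = self.fingerprint(pubkey)
        pinned = self._pins.get(contact)
        if pinned is None:
            self._pins[contact] = fp  # first use: trust and pin
            return "pinned"
        if pinned == fp:
            return "ok"               # same key as last time
        return "KEY CHANGED"          # time to show a warning to the user
```

The easy-to-use part is that the first case is silent; the user only ever has to think about keys in the third case.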
The whole post to me reads like one long kinda-whining rant about how it’s easy to make changes to centralized systems and difficult to modify decentralized systems. Well, duh. That doesn’t mean decentralized systems are outdated or “stuck in the 90s” or aren’t useful when it comes to secure messaging applications. Indeed, decentralized systems are what will make them even easier to use and more secure than they are now.
i have the feeling he has given up.
for me it still sucks that i have to use crappy unfederated services.
and it isn’t so hard to have federated stuff that works. the problem is that signal etc. have shitloads of money while most of the examples the author spoke of depend on the free time of people, when all that is needed is a decent client with a “reference” featureset.
xmpp isn’t in this bad state because it’s federated, xmpp is in this bad state because it’s xml hell. how many working xmpp engines are there? two? that’s because it sucks to implement.
irc for example is moving forward, rather successfully if one can believe the software charts at http://ircv3.net/software/clients.html and http://ircv3.net/software/servers.html
and mail just works. maybe we just need otr for mail. because despite being “a bit rocky” sometimes, otr works too, at least for me.
I think Moxie’s take is the result of having built and run an accessible secure communications network for several years. For some background, in addition to doing crypto and security stuff professionally, I’ve used a lot of technologies (from Lisp Machines and Smalltalk to plan9) that leave me feeling like what we have now is so far from what it should (or could) be that it feels fundamentally broken. And yet, the reality is this is the world we live in. We can try to work towards a better solution, but the reality is we have to work with what we have.
It’s hard when users want a single identity, i.e. their phone number. For a phone number to be a useful identifier, you really need a central server to keep track of them.
I run my own XMPP server for secure comms and I can talk to all of five or six people that I regularly talk to that way; because my setup requires encrypted server-to-server connections, I can’t talk to anyone using Google Talk or even the jabber.ccc.de server due to TLS issues. I haven’t found a good XMPP client for my iPhone or iPad that works reliably, and ChatSecure likes to eat my Nexus 5’s battery (and did the same to my previous Android devices).

I also have an IRC server that’s set up securely and doesn’t leak user information, but IRC is pretty much a centralised service. Personally, I find it hard to manage a number of connections and remember which channel is on which server, and that was just when I used four servers (mine, freenode, oftc, and another friend’s server). IRC is also hard for “normal people” to use: I have no problem using Prompt on my iPhone, but none of the people I talk to outside of tech would bother with that. They might be able to use IRCCloud, but that’s another centralised service.
Mail works, except we still don’t have reliable end-to-end encryption and (at least for me) most of my mail ends up going to people using Google’s servers.
I took a cursory look at the people in that thread, and many of them have no code that I could find (on GitHub or elsewhere), or the code they’ve worked on has nothing to do with building these types of systems. If building an accessible, federated, properly-done secure communications network is so easy, why haven’t any of the other projects done it? It seems to me that Signal picked up users because they built that network and people used it. Get some normal users (it helps to have a lot of people who aren’t involved in tech or who have low tech literacy) and prove that it’s viable.
I rather think that IRCv3 is a bad example of moving forward. IRC used to be a protocol that was simple to understand and implement. With the addition of various extensions, to be fully integrated with everybody else you now need to understand services, capabilities, the awkward additions each capability brings, have more state to manage, et cetera. While of course you could simply ignore the new additions, you’re then operating on a limited set of features, and you’ve lost compatibility with everybody else. IRC hasn’t moved forward, it’s limped forward with hastily nailed-on additions that are awkward to think about, let alone implement. So instead we don’t worry about it and turn to more “centralized” services, such as IRCCloud, et cetera.
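For context, the capability dance being complained about goes roughly: the server advertises its capabilities in a `CAP LS` reply, the client requests the intersection with what it supports via `CAP REQ`, and then closes negotiation with `CAP END`. A rough sketch of the client side (parsing only, no sockets; the supported set is an arbitrary example):

```python
# Capabilities this hypothetical client implements.
SUPPORTED = {"multi-prefix", "server-time", "sasl"}

def cap_request(ls_line: str) -> list[str]:
    """Given a raw 'CAP * LS :...' line, build the CAP REQ / CAP END replies."""
    # e.g. ":irc.example.net CAP * LS :multi-prefix sasl extended-join"
    offered = set(ls_line.split(" LS :", 1)[1].split())
    wanted = sorted(SUPPORTED & offered)
    replies = []
    if wanted:
        replies.append("CAP REQ :" + " ".join(wanted))
    replies.append("CAP END")  # finish negotiation either way
    return replies
```

Even this simplified version shows the extra state a once-trivial protocol now carries before you can send your first PRIVMSG.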
Mail doesn’t just work either. We have abused it by turning it into a file delivery system, adding on a variety of mechanisms that now need to be added to various implementations as well. We’ve added plentiful security mechanisms such as DKIM to protect against valid problems inherent in human societies (all those spammers…). We’ve even tacked on the entirety of HTML and CSS onto mail, because we want it to look like it was written on a papyrus. To most people, mail “just works” because it’s supplied by centralised services such as GMail, Yahoo!, AOL, et cetera. To those that have set up their own mail servers, how much software was necessary to be set up to be able to access your email from both your computer and your phone? How about being accepted as a valid email server by Google?
It all gives the illusion of working, because the more technology we pile up atop, the more it hides what happens down beneath. I’m not saying that this problem is inherent in federation, but it seems to be the rising trend: An illusion of backwards compatibility is given when in reality we are driven towards centralized services that have become rather popular. I see no differences between any of your examples — they’re all sucky to implement, and that leads us to more centralized services that proceed to slowly lock us in the more time goes on.
Compare that to xmpp and the xeps ;) I wasn’t saying that IRCv3 is perfect, but according to the client and server software lists it seems to be quite successful. I’m not sure about breaking compatibility with regard to capabilities; they seem pretty optional for now. I just have the feeling that compared to xmpp, most of the software really implements these extensions.
To most people, mail “just works” because it’s supplied by centralised services such as GMail, Yahoo!, AOL, et cetera. To those that have set up their own mail servers, how much software was necessary to be set up to be able to access your email from both your computer and your phone? How about being accepted as a valid email server by Google?
Mail isn’t perfect either, of course. But we have yet to see a federated “mail-killer”.
It all gives the illusion of working, because the more technology we pile up atop, the more it hides what happens down beneath. I’m not saying that this problem is inherent in federation, but it seems to be the rising trend: An illusion of backwards compatibility is given when in reality we are driven towards centralized services that have become rather popular.
“We” aren’t driven anywhere; it’s the 90% of people who want to send an image of their lunch who, in turn, force the 10% to use centralized stuff because social foo.
I see no differences between any of your examples — they’re all sucky to implement, and that leads us to more centralized services that proceed to slowly lock us in the more time goes on.
They are more sucky to implement than talking to an HTTP API, ok. I don’t get that trend either.
PS: in essence: everything we’ve built so far sucks, no matter how much we tell ourselves it doesn’t.
I’m not sure about breaking compatibility with regard to capabilities; they seem pretty optional for now.
Yeah, they are optional. However, if you want to do things “the right way”, you should use them. In certain situations, they end up being necessary, or at least identification with NickServ becomes necessary. As soon as you make something, people will start making things that require it.
It’s a perpetual loop! They use it, so we use it. But then we’re using it, so they use it. :(
Well, there are some things that suck less… I hope…
(I’m sorry, I couldn’t resist the pun.)