Yes, Apple did build a very secure, high-value vault, which everybody said was impossible. However, they were only able to do so by removing the other thing everybody said makes it impossible to build: the master key.
The system is secure because nobody has admin access. Admin access has been disabled because there is no way for a computer to verify the intent behind admin access.
If you enable admin access, the system is no longer secure. They built a working implementation of the argument everybody has been making.
Between this and the secure boot bypass, there’s a lot of misinformation running around. Everything is a golden key or a backdoor or whatever, even when it isn’t.
The key to the next generation of really gnarly shit on the net, both good and bad, lies in these autonomous systems whose creators have either a) revoked their own access or b) arranged their creation so that they themselves have no knowledge of the internals.
The obvious legal move is to outlaw the knowing creation of such systems, but that is so unnervingly close to common practice that all of us have it in our best interests to keep a damned close eye on the legislation.
There is a lot of futurist fapping about AGI and singularities and other nonsense, but autonomous systems like these (let's call them gremlins) are both straightforward to build and useful to build. Imagine a torrent site that cannot be taken down because it handles its own hosting.
All this to say: we're about to go from the topic being very abstract and navel-gazy to something that impacts all of our practical work, and it'll be happening within, say, five years' time.
It’s important to distinguish between “can’t be taken down” and “can’t be destroyed”. It’s not like Apple built Skynet. They could shut those servers down any time they wanted (or were forced to). It’s just that the consequence would be the loss of all the user keychains, which is a high bar.
I don’t know how this maps to a site that “handles its own hosting”. I don’t even know what that really means. Is anyone developing a method to run a service such that literally no human knows how to deprive it of CPU cycles and networking?
BTW, I’m distinguishing that from a distributed system with enough humans willing to run a piece of it that it’s impractical to track them all down.
I imagine that as automation increases, it would be possible to fully automate paying for and setting up hosting, and then deploying the site to it, which means it could all be done with a script. If the site monitors its own availability and maintains redundant hosting across jurisdictions (maybe keeping places in reserve quietly for later deployment), then it can get itself redeployed in a new place when a copy goes down. Something like that. There are still weak links in this setup of course (e.g. where the funds come from) but it’s possible to come up with some sort of decentralised solution for that as well I suppose.
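A minimal sketch of that monitor-and-redeploy loop, assuming a hypothetical uniform provider API. Every name here (`Provider`, `provision_and_deploy`, `is_up`) is invented for illustration; no real hosting provider exposes exactly this interface:

```python
# Sketch of a self-redeploying service. Entirely hypothetical API:
# assumes hosting providers with a uniform provision/deploy interface.
class Provider:
    def __init__(self, name, jurisdiction):
        self.name = name
        self.jurisdiction = jurisdiction
        self.running = False

    def provision_and_deploy(self, site_bundle):
        # In the scenario above this step would also handle payment
        # (e.g. from a cryptocurrency wallet the service controls).
        self.running = True

    def is_up(self):
        return self.running


def ensure_redundancy(providers, site_bundle, min_copies=2):
    """Keep at least min_copies live, redeploying to reserve hosts
    (preferring unused jurisdictions) when a copy goes down."""
    live = [p for p in providers if p.is_up()]
    reserves = [p for p in providers if not p.is_up()]
    used = {p.jurisdiction for p in live}
    # Prefer reserves in jurisdictions we aren't already using.
    reserves.sort(key=lambda p: p.jurisdiction in used)
    while len(live) < min_copies and reserves:
        p = reserves.pop(0)
        p.provision_and_deploy(site_bundle)
        live.append(p)
    return live
```

Run periodically from the deployed copies themselves and you get roughly the "monitors its own availability" behaviour described above, weak links (funding, provider bans) and all.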
I see, so it would be a worm, but a “legitimate” worm in that it has some way to actually pay for its hosting, rather than the existing sort that breaks into systems to run itself (I guess we’d need to start calling that kind a “parasitic” worm to distinguish it).
Handwaving frantically, if hosting companies with uniform enough APIs and paid in bitcoin were common, and the service was accessed through .onion or some other non-DNS method…hmm…yeah. [edit: Or heck, what if it just knew how to buy stolen credit card numbers from hacker forums?]
So, just like a parasitic worm, to stop one of these things you’d have to look for known signatures, behavior patterns, etc. just like malware prevention. Perhaps the authorities would issue a “wanted notice” and require hosting providers to implement anti-rogue-application filtering.
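A hosting provider's "anti-rogue-application filter" could work much like ordinary malware scanning: match uploaded bundles against a published "wanted notice" of hashes, plus crude behavioural indicators. A toy sketch, with all signatures and behaviour names invented:

```python
import hashlib

# Toy "wanted notice": hashes of flagged code bundles plus crude
# behavioural indicators. All values here are made up.
WANTED_HASHES = {
    hashlib.sha256(b"known-gremlin-v1").hexdigest(),
}
SUSPICIOUS_BEHAVIOURS = {"buys_own_hosting", "self_redeploys"}


def flag_deployment(code_bundle, observed_behaviours):
    """Return True if an uploaded bundle matches a known signature
    or exhibits two or more gremlin-like behaviours."""
    digest = hashlib.sha256(code_bundle).hexdigest()
    if digest in WANTED_HASHES:
        return True
    # Behavioural heuristic: any single indicator is too noisy on
    # its own, so require at least two before tripping the filter.
    return len(set(observed_behaviours) & SUSPICIOUS_BEHAVIOURS) >= 2
```

Of course, just as with malware, the gremlin's countermove is trivial recompilation or behaviour randomisation, so this would be an arms race rather than a fix.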
I imagine that as automation increases, it would be possible to fully automate paying for and setting up hosting, and then deploying the site to it, which means it could all be done with a script.
That is exactly the sort of thing to be thinking about. :)
Check out https://en.wikipedia.org/wiki/Decentralized_autonomous_organization and “Storj”, which is different from “the DAO” from the recent Ethereum… news.
It is kind of far-out and weird.
Wow, this is really cool. My one concern is that if these systems are unauditable, there isn’t really any guarantee that the authors truly don’t have access to it. Furthermore, even code audits are worthless if you don’t know that the code that has been audited is the actual code running on the server.
I don’t doubt that Apple and other people have considered this, though. And it’s a really cool idea.
Are there any existing publicly-available software / services that do something similar?
Furthermore, even code audits are worthless if you don’t know that the code that has been audited is the actual code running on the server.
That is the problem with closed-source software, and a server that you don’t have admin access to as a user basically counts as closed-source software. Even if everyone at Apple said, “It’s just LAMP with this code, here is the source”, we can’t trust them (or any large organisation) that that is the actual configuration.
The best we could hope for is an independent audit from someone like the EFF, but if there is more than one server or location this gets impossible really quickly, and they could upload a new version from another repo as soon as the auditors leave. So I think any evaluation is essentially worthless, and the only things we have to go on are trust and reputation (i.e. don’t trust Sony, LinkedIn, or Ashley Madison).
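To make that audit gap concrete: even if the operator publishes a hash of the audited source, the check only binds whatever the server chooses to hash, not what is actually running. A sketch (function names invented):

```python
import hashlib


def audited_digest(source: bytes) -> str:
    """What the auditor signs off on: a hash of the source they reviewed."""
    return hashlib.sha256(source).hexdigest()


def server_reports_digest(actually_running: bytes, claims_to_run: bytes) -> str:
    # The server computes the hash over whatever it *claims* to run.
    # Nothing binds claims_to_run to actually_running -- that gap is
    # exactly why a source audit alone proves nothing about the server.
    return hashlib.sha256(claims_to_run).hexdigest()
```

The reported digest matches the audited one even when the running code is completely different, which is why trust and reputation end up carrying the weight.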
Is there a video or at least slides of the mentioned talk?
A link to the slides is provided in the first paragraph of the article. https://www.blackhat.com/docs/us-16/materials/us-16-Krstic.pdf