tl;dr: they don’t like it. I’m not surprised, as this would seriously throw away foundations of the web as we know it.
That said, there are some interesting ideas one could distill from it, for example content signing. Web Packaging reuses X.509 certificates to sign content, so compared to signing packages with OpenPGP, it benefits from Certificate Transparency (you can see when a new signing certificate has been issued) and the signature is bound to the domain name (in OpenPGP, anyone can create a User ID claiming any domain).
That’s super strange: Mozilla used a Google Doc instead of a blog post for such a publication.
Maybe they want Google to read it.
Mozilla uses Google Apps internally; my guess is that this started as an internal document and they said, “well, we can just reuse it to share publicly.”
Other than the title of the paper, how do we know this is actually from Mozilla?
https://mozilla.github.io/standards-positions/ agrees with it.
Quick link: https://github.com/mozilla/standards-positions/issues/29#issuecomment-495122302
Thanks. I’m not really in the habit of trusting random google docs, and wasn’t able to find anything like this.
Similarly: internal position/planning/architecture documents tend to be written and distributed this way. This one happens to have been made public.
What exactly is the elevator pitch for web packaging?
Google wants it because it would let them serve AMP pages from their own cache while displaying the original site’s URL in the address bar. If I understand correctly, it generalizes so that you could serve content through a CDN over TLS without giving the CDN your certificate’s private key.
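The core idea above can be sketched in a few lines: the publisher signs the whole exchange (URL, response headers, body), so any intermediary can redistribute the package byte-for-byte and the client verifies it against the publisher's key rather than trusting the delivering server. This is a deliberately simplified, hypothetical model: the real signed-exchange format uses ECDSA over a CBOR structure with an X.509 certificate, while an HMAC stands in here just to show what the signature covers.

```python
import hashlib
import hmac
import json

# Stand-in for the publisher's signing key. A real signed exchange uses an
# asymmetric key pair tied to an X.509 certificate for the publisher's domain.
PUBLISHER_KEY = b"publisher-secret"


def _canonical(url, headers, body):
    """Serialize the parts the signature must cover into stable bytes."""
    return json.dumps(
        {"url": url, "headers": headers, "body": body.decode()},
        sort_keys=True,
    ).encode()


def sign_exchange(url, headers, body):
    """Publisher signs URL + response headers + payload as one unit."""
    sig = hmac.new(PUBLISHER_KEY, _canonical(url, headers, body),
                   hashlib.sha256).hexdigest()
    return {"url": url, "headers": headers, "body": body, "sig": sig}


def verify_exchange(pkg):
    """Client verifies the signature, no matter which server delivered pkg."""
    expected = hmac.new(PUBLISHER_KEY,
                        _canonical(pkg["url"], pkg["headers"], pkg["body"]),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, pkg["sig"])


pkg = sign_exchange("https://example.org/article",
                    {"content-type": "text/html"},
                    b"<h1>hello</h1>")
# A cache (e.g. Google's AMP cache) can now serve pkg unmodified, and a
# verifying client may attribute it to https://example.org/article.
print(verify_exchange(pkg))
```

Because the signature binds the URL into the package, the address bar can show the publisher's origin even though the bytes came from someone else's server, which is exactly the property Mozilla objects to.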
So, it lets people completely misrepresent where content comes from? That seems problematic.
Which is why Mozilla opposes it.
Honestly, I just read the abstract of Mozilla’s position document. I bailed pretty fast because it seemed clear to me that Web Packaging is just an attempt by Google to “standards-wash” AMP’s technology.
The CDN case doesn’t work. As soon as you point your DNS at the CDN’s servers, the CDN can obtain legitimate TLS certificates itself. If you don’t point your DNS at the CDN, then it’s unclear how you’d get your users to discover and connect to the CDN through some other URL.
The archival/offline sharing case is also weak, because for security the packages have to be opt-in and have a short lifetime (no more than 7 days). So you can’t build archive.org out of it, and on a sneakernet everything will expire before it gets far. The whole tech is nothing but a big bugfix for AMP.
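The 7-day cap comes from the signed-exchange drafts: a signature carries a date and an expiry, and the window between them may be at most 7 days. A minimal validity check might look like the sketch below (the field names and function are illustrative, not the spec's API):

```python
from datetime import datetime, timedelta

# Maximum signature lifetime per the signed-exchange drafts.
MAX_LIFETIME = timedelta(days=7)


def signature_is_valid(date, expires, now):
    """Accept a signature only if its window is at most 7 days long
    and `now` falls inside [date, expires]."""
    if expires - date > MAX_LIFETIME:
        return False  # over-long windows are rejected outright
    return date <= now <= expires


signed = datetime(2019, 5, 1)
expires = signed + timedelta(days=7)
print(signature_is_valid(signed, expires, datetime(2019, 5, 4)))  # within the window
print(signature_is_valid(signed, expires, datetime(2019, 6, 1)))  # long expired
```

This is why the archival case fails: even a maximally long-lived package stops verifying after a week unless the publisher keeps re-signing it.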