1. 4

    More of a passing thought than a nuanced response, but I find it disheartening that pieces like this completely fail to take into account, or even consider, the underlying power structures (basically politics at various levels) that have both massively influenced what “cloud” we have today and shape where it can realistically go tomorrow, and what the implications of any predicted routes might be (e.g. is this further centralising control? How is the balance of power being changed between the various cloud providers, device manufacturers, network owners, etc.?)

    1. 1

      It’s not so much conscious political decisions as blind corporate seeking of efficiencies.

      1. 1

        That’s as may be, but I would hope that, even in the most boring way possible (i.e. who is the big fish in my pond), at least some corporate entities are aware of the wider political implications of where cloud computing might go and what it might do to them. Just chasing efficiency by itself is not a viable long-term strategy.

        1. 1

          Yes, but you can also substitute “externalities” for “efficiencies” most of the time.

      1. 12

        Oh wow… a website decided to use a single 1.3MB JSON file (that compresses to 187KB on the wire, and is only fetched on first use) rather than spin up an entire web service for a single-column lookup. God only knows what the internal processes around adding things to the public-facing side of a consumer website are at Barclays, but there are plenty of likely scenarios where what they’ve done is perfectly fine, avoids complexity, and is a reasonable tradeoff.

        Just to drive this nail in further: at no point does the article attempt to flesh out what a sensible web service would look like, or what it would entail in development and ongoing operational costs, or even consider who might be asking for this functionality and what tools/teams they might have available to them.

        Blog posts like these demonstrate the complete detachment from “getting something useful done” that afflicts far too many “celeb” developers.

        1. 7

          Yeah, using a single JSON blob for this is totally appropriate. It means you can host the site on S3 or whatever. The author says “why not use a regex”, but that’s a really fragile solution that assumes the numbers will follow a definite pattern. It sounds like they botched the client-side lookup cleaning, though. Oh well. There are plenty of other things to complain about in life.

          I wrote a site that tracks some data, and I realized that in ten years we only had 4,000 entries, which came out to 700KB of JSON (180KB with gzip), so I just ship the whole dataset down to the client. It’s a much better way to do it: no expensive DB queries, the JSON is always warm in the cache, subsequent data filtering on the client side is instant, etc.
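
          The whole pattern is only a few lines of client code, too. A minimal TypeScript sketch of the idea (the URL, record shape, and filter are hypothetical placeholders, not the actual site’s code):

            // Fetch the full JSON blob once and cache it in memory; the
            // browser/CDN cache keeps the transfer itself cheap after that.
            interface Entry {
              id: string;
              name: string;
              year: number;
            }

            let cache: Entry[] | null = null;

            async function loadEntries(): Promise<Entry[]> {
              if (cache !== null) return cache;
              const res = await fetch("/data/entries.json");
              cache = (await res.json()) as Entry[];
              return cache;
            }

            // All subsequent filtering happens client-side: no round trips,
            // no DB queries, instant results.
            async function entriesForYear(year: number): Promise<Entry[]> {
              return (await loadEntries()).filter((e) => e.year === year);
            }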

          1. 6

            I get phone calls from banks and other financial services people periodically that start by asking me to prove who I am. I always reply by saying: you called me, so please prove that you are from Barclays before I say anything else. I was pleasantly surprised by my most recent call from Barclays: they are the first company to call me and actually have a procedure for doing this. The people who are authorised to cold-call me now have access to a thing that can send me a message via the mobile banking app, so they could send me something saying ‘On the phone with {person name}’ to confirm that this person actually was supposed to be talking to me.

            It’s not completely secure. Anyone who can compromise the app can now impersonate Barclays, but in general someone pretending to be Barclays on the phone can do far less damage than someone who can compromise the app, and is more likely to be detected, so it’s probably fine.

            1. 3

              Agree. To me this looks like something that had to be put in place hurriedly to counter a recent spate of cold calls from people pretending to be from Barclays. Knowing a little bit about how fast banks move (for both fiduciary and cultural reasons), this setup looks typical.

            1. 6

              In the exact words of Mitchell (Vault co-founder) over on the orange hellscape: https://news.ycombinator.com/item?id=23032499

              I’m one of the creators of Vault. I read this back when it was posted and I’d be happy to share my thoughts. I’ll note it’s worth reading through to the last paragraph and into the comments; the title is a bit bait-y, and the article does a better job than the title gives it credit for.

              Broadly speaking, if you’re looking at Vault to solve a specific problem X for a specific consumption type Y on a specific platform Z, then it probably is overkill (I wouldn’t say “overhyped” :)). i.e. “encrypted key/value via env vars on AWS Lambda”. “X via Y on Z.” The power of Vault is: multiple use cases, multiple consumption modes, and multiple platforms supported with a single consistent way to do access control, audit logging, operations, etc.

              I can’t stress that “single consistent way to do access control, audit logging, operations, etc.” enough. Multiple security use cases dangling off that consistency is really important as soon as you hit N=2 or N=3 security use cases.

              If you need say… encrypted KV and encryption-as-a-service and dynamic just-in-time credentials and a PKI system (certificate), and you need this as files and as env vars, and you need this on Kubernetes and maybe also on EC2, then Vault is – in my totally biased opinion – going to blow any other option out of the water.

              That’s a somewhat complex use case, but it’s something Vault excels at. For simpler use cases, Vault is making more and more sense as we continue to make Vault easier to use. For example, we now provide a Helm chart and official K8S integration so you can run Vault on K8S very easily. And in this mode, developers don’t even need to know Vault is there, because their secrets show up as env vars and files just like normal K8S secrets would.

              Also, this article is from June 2019, and in 10 short months we’ve made a ton of progress on simplifying Vault so it gets closer to that “X via Y on Z” use case. Here are some highlights I can think of off the top of my head, but there are definitely more; this is just from memory:

              • We have integrated storage as an option now, so you don’t need separate storage mechanisms.

              • Our learn guides went from basically zero to lots of content which makes it much easier to learn how to use Vault: https://learn.hashicorp.com/vault

              • We have an official, feature-packed Kubernetes integration to do stuff like secret injection and rotation automatically. We also publish a Helm chart to run Vault on Kubernetes. https://learn.hashicorp.com/vault?track=getting-started-k8s#

              We’re looking at ways to make running Vault much, much easier. More on that later this year. :)
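
              To make the quoted “secrets show up as env vars and files” point concrete: in that mode the application never talks to Vault at all. A minimal TypeScript sketch of the consuming side (the env var name and file path here are hypothetical; the real ones depend on how the injector is configured):

                // App code consuming a secret that the K8S integration has
                // injected, without knowing Vault is involved. DB_PASSWORD and
                // /vault/secrets/db-password are placeholder names.
                import { readFileSync } from "fs";

                function getDbPassword(): string {
                  // Case 1: the secret was surfaced as an environment variable.
                  const fromEnv = process.env.DB_PASSWORD;
                  if (fromEnv !== undefined && fromEnv !== "") {
                    return fromEnv;
                  }
                  // Case 2: a sidecar rendered it to a file inside the pod.
                  return readFileSync("/vault/secrets/db-password", "utf8").trim();
                }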

              1. 14

                This list seems to be based on a super Frankenstein’d, incompletely applied threat model.

                There is a very real privacy concern in giving Google access to every detail of your life. Addressing that threat does not necessitate making choices based on whether the global intelligence community can gain access to your data, and, applied less than skillfully, doing so probably makes your overall security posture worse.

                1. 1

                  I agree that mentioning the 5/9/14/however-many Eyes is unnecessary, and also not helpful. It’s not as if storing your data on a server in a non-participating country somehow makes you more secure. All of that data still ends up traveling through the same routers on its way to you.

                  1. 1

                    If you’re going to put a whole lot of effort into switching away from Google, you might as well do it properly and move to actually secure services.

                    1. 11

                      In a long list of ways, Google is the most secure service. For some things (e.g. privacy) they’re not ideal, but moving to other services almost certainly involves security compromises (to gain something you lose something).

                      Again, it all goes back to what your threat model is.

                      1. 3

                        Google is only the most secure service if you are fully on board with their business model. Their business model is privacy-violating at the most fundamental level, in terms of behavioral surplus futures. Whatever your specific threat model, it then becomes subject to the opacity of Google’s auction process.

                  1. 2

                    Not sure what a misunderstanding of Marie Kondo has to do with anything, but sure, go off on one about charlatans… This just feels like reactionary bullshit (rather than anything revolutionary), with a total failure to understand what has changed about the world outside the computer and what people’s relationship with computers is.

                    1. 1

                      What, in your opinion, has changed about people’s relationship to computers since 1990 that justifies the complexity of, say, web browsers?

                      1. 4

                        The fact that on the web you can buy almost any product you want, transfer your money around, plan and book your next vacation, and (in some countries) register a car, your children, and even yourself. In other words, the “world wide web” has become far more integral, and its requirements have grown as it went from a markup protocol to one of the main communication media of our global society.

                        I’m not sure if you would say it “justifies” it, but I do think that it explains it.

                        1. 1

                          Absolutely. We’re using the web for a wide variety of things that it’s not meant to do, and this required sticking a bunch of extra crap on top of it, and the people who stuck the extra crap on top weren’t any more careful about managing technical debt than the people who designed the thing in the first place.

                          There’s a need for secure computerized mechanisms for commerce, exchange, and identification management. Using web technologies to produce stopgap solutions has lowered the perceived urgency of supplying reliable solutions in that domain. It’s put us back twenty-five years, because (at great cost) we’ve made a lot of stuff that’s just barely good enough that it’s hard to justify throwing it all out, even when the stuff we’ve built inevitably breaks down because it’s formed around a set of ugly hacks that can’t be abstracted away.

                          1. 3

                            But that’s the thing – the web you talk of is little more than ruins that people like you or me visit and try to imitate. It has become irrelevant that the web “wasn’t meant” for most of the things it is being used for today; it has been revamped and pushed in a direction where it retroactively seems as though it had been made for them, its history rewritten.

                            It’s not clean, but with the prospect of a browser-engine monopoly by groups like Google, and their interests, this might yet get improved on, for better or worse.

                            The point I have made elsewhere still stands: the web has been the most fertile space for the real needs of computer users, both “consumers” and marketing and advertisement firms – one can’t just forget the last two and their crucial role in building up the modern web into something more or less workable.

                            If we had wanted the web to be “pretty” and “clean”, I think it would have had to grow a lot more organically, slower and more controlled. Instead of waiting for AOL, Yahoo, and Gmail to offer free email, it should have been locally organised; instead of investors being thrown into a space where the quickest wins, technologies and protocols should have had time to ripen; instead of millions of people buying magical computer boxes that get better with every update, they should have had to learn what it is they are using. Having said that, I don’t believe this would have been in any way practical. Beauty and elegance don’t seem to scale to this level.

                            1. 1

                              The lack of cleanliness isn’t merely a headache for pedants & an annoyance for practitioners: it makes most tasks substantially more difficult (like 100x or 1000x developer effort), especially maintenance-related tasks, and because the foundations we’re building upon are shoddily designed with unnecessary abstraction leakages & inappropriate fixations, certain things we need to do simply can’t be done reliably. We’re plugging the holes in the boat with newspaper, and that’s the best we can do.

                              Beauty and elegance don’t seem to scale to this level.

                              Scaling is hard, and I don’t expect a large system to have zero ugly spots. But complication tends to compound upon itself. For the first 15-20 years of the web, extremely little effort was put into making new features or idioms play nice with the kinds of things that would come along later – in other words, everything was treated like a hack. Tiny differences in design make for huge differences in complexity after a couple of iterations, and everything about the web was made with a focus on immediate gains, so as a result, subsequent gains come slower and slower. Had the foundational technologies been made with care, even with the kind of wild live-fast-die-young-commit-before-testing dev habits that the web generation is prone to, the collapse we are currently seeing could have been postponed by 20 or 30 years (or even avoided entirely, because without the canonization of Postel’s law and “rough consensus and running code” we could have avoided ossifying ugly hacks as standards).

                    1. 2

                      It confuses me that you need to switch registries to make this work. Will GitHub proxy all packages from npm? Otherwise, how would you use some from npmjs.com and others from GH in your projects?

                      1. 4

                        Using scoped packages, each “scope” can have its own URL/registry: https://docs.npmjs.com/misc/scope#associating-a-scope-with-a-registry
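
                        Concretely, that’s one line in the project’s .npmrc, something like this (with @myorg standing in for whatever scope you own):

                          # Packages under @myorg resolve from GitHub Packages;
                          # everything else still comes from the default npm registry.
                          @myorg:registry=https://npm.pkg.github.com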

                        1. 2

                          If you really want it to be that, you can pretend. But I don’t know why you would feel better doing so.

                          1. 11

                            I mean, his entire schtick is “Neo-Reaction”, he’s defended owning slaves, and the list absolutely goes on from there. I’m not sure why that’s controversial. If you want to sift through his oeuvre for more tidbits on what he believes, by all means do, but denying that he believes in all different kinds of hierarchy, and especially racial hierarchy, is going to be a problem when you’re done.

                            1. 3

                              If you can give a pithy explanation of what urbit is really about, you’ll be the first in my experience.

                              There are some cool concepts but it seems like they are melded together to create a solution to some problem which is never clearly stated, unless the problem is “there should be a digital asset akin to land in that it is impossible to create more of it,” which isn’t a problem most people, even most people posting on various programming-oriented messageboards, would find compelling.

                              1. 1

                                “a digital asset akin to land in that it is impossible to create more of it” is actually a really interesting solution to the problem of making digital identities expensive enough to discourage spam, and also to the problem of funding the development of a social network before the social network has gotten large enough to be obviously valuable. Certainly not the only such solution, but a solution. I’d actually like to see other projects that have nothing to do with Urbit try out their own spins on the idea of cryptographically-verified digital land ownership.

                          1. 1

                            This post makes almost zero sense… apart from “buy my product”, it’s pretty much a series of unconnected dots and buzzwords masquerading as a cohesive argument for… something?!

                            tens of millions of requests every month ~ 4 requests per second…
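
                            Back of the envelope, taking “tens of millions” to mean roughly 10 million over a 30-day month:

                              10^7 requests / (30 × 86,400 s) ≈ 3.9 requests per second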

                            1. 19

                              Or just … don’t register them in the first place because they’re emblematic of everything that’s wrong with western colonialism: https://gigaom.com/2014/06/30/the-dark-side-of-io-how-the-u-k-is-making-web-domain-profits-from-a-shady-cold-war-land-deal/

                              1. 0

                                they’re emblematic of everything that’s wrong with western colonialism

                                Colonialism, you say?

                                The rights for selling .io domains are held by a British company called Internet Computer Bureau (ICB) [..] The .io domains each cost £60 ($102) before taxes, or twice that if you’re outside the EU. The British government granted these rights to ICB chief Paul Kane back in the 1990s.

                                Don’t you think something like “everything that’s wrong with governments deciding who gets to profit from what” would have been more accurate?

                                All domain names are sold through some kind of state-enforced monopoly or cartel, which is also what enables the scourge that is domain squatting.

                                What’s colonialism got to do with this, besides that the islands were colonized by a .. well, there’s that word again.

                                1. -1

                                  So what’s the problem here: that somehow the internet top-level domain .io is some god-given right of some people who were born on an island? That if you were standing around on a piece of land then that land is yours from today till the end of days, and that this also extends to a particular set of two characters that somebody else, who invented and implemented a network, decided would be assigned some level of relevance to said piece of land? Is it also an injustice if I somehow implemented my own domain resolution system and don’t give up ‘io’ to these islanders? Do you have a list of strings that I need to give to people so I don’t oppress them?

                                  The amount of desperation people have to fall for the ‘poor-innocent-uncivilized-islanders-slash-villagers-being-oppressed-by-evil-white-people’ narrative continues to boggle the mind.

                                  1. 11

                                    How fucking stupid or wilfully racist do you have to be to not see that ‘poor-innocent-uncivilized-islanders-slash-villagers-being-oppressed-by-evil-white-people’ is quite literally the exact thing that happened here?

                                1. 4

                                  It’s really 2 companies in terms of analytics…

                                  • Pusher: push messaging / websockets as-a-service
                                  • Intercom: in browser messaging as-a-service
                                  • Launch Darkly: A/B testing + feature flags as-a-service
                                  • Amplitude: analytics
                                  • Appcues: in browser messaging and help / UI hints as-a-service
                                  • Quora (??): yeah… what is even happening here?
                                  • elev.io: support / help docs as-a-service
                                  • Optimizely: analytics
                                  1. 1

                                    Optimizely is A/B testing as well.

                                  1. 1

                                    It is a shame that this paper seems not to address the privacy implications of what information could be deduced by the external service providers (and their apps) through the use of ‘procrastinator’, and/or what private information ‘procrastinator’ would need in order to function as intended.

                                    1. 2

                                      I’m not really sure what this article is trying to say or do, other than just be a marketing puff piece?

                                      1. 2

                                        The title is funny, though. I give it the win on mocking the “serverless” term.

                                      1. 1

                                        Here’s a script to help with the backwards-incompatible configuration changes: https://github.com/fgsch/varnish3to4

                                        Note to self: never ever disrespect your open source projects' users like that. Either provide a stable public API or the tools to automate the migrations.

                                        1. 1

                                          Isn’t that what major versions are for?

                                          1. 1

                                            In my ideal world major versions are for major features, not major breakage.

                                            1. 3

                                              And stay stuck with mistakes forever? Is the only solution a fork? Majors aren’t meant to be backwards-compatible.

                                              1. 1

                                                Bug fixing and API stability are orthogonal concepts, aren’t they? Unless you mean mistakes in the API itself, in which case I’ll remind you that no one really suffers from the typo in “HTTP referer”. And for more serious API design mistakes, the proper punishment would be to keep maintaining the old version forever, in addition to the new stuff you just added.

                                                1. 3

                                                  Well, semver would disagree with you about never breaking backwards compat on an API.

                                                  Given a version number MAJOR.MINOR.PATCH, increment the:

                                                  • MAJOR version when you make incompatible API changes,
                                                  • MINOR version when you add functionality in a backwards-compatible manner, and
                                                  • PATCH version when you make backwards-compatible bug fixes.
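
                                                  For a hypothetical package, that gives a sequence like:

                                                    1.4.2 → 1.4.3  bug fix, no API change (PATCH)
                                                    1.4.3 → 1.5.0  new backwards-compatible feature (MINOR)
                                                    1.5.0 → 2.0.0  incompatible API change (MAJOR)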
                                                  
                                                    1. 1

                                                      That was the exception and not the rule. Torvalds just thought there were enough features to warrant a new release, and 2.6.40 seemed to be pushing it. http://thread.gmane.org/gmane.linux.kernel/1147415

                                                      1. 1

                                                        Well, in my ideal world (which someone apparently thought was “-1 incorrect” :-) ), major versions for major features is the rule, not the exception. The Linux kernel is a good example of (public) API stability, and we should learn from it.

                                        1. 1

                                          Hi, I’m James Butler, an engineer who occasionally writes appalling JavaScript and does operations stuff when no one more grown-up is looking.