I assume this is real, and not satire.
I worked at Google for a while, and the one thing that really struck me,
aside from the terrible way they treat their non-tech service-work
employees, is the way so many engineers lived in this bubble of
distorted reality. This post is redolent with that Silicon Valley
mindset. If this class of thinkers had their way, one wouldn’t be able
to wipe one’s butt without a high-speed connection to the Internet
uploading an analysis of the fecal matter to some cloud service.

As these things go, there are many far worse places to do
service work than Google. But if you’re a Google employee, you should
get to know some of the non-tech staff and get an idea of how they live
and work. There is a bifurcated working class there: engineers, and everyone else.
Some years ago I was working at a health-care company that did a lot of things "old-fashioned," meaning by phone, fax, or even physical mail, and a recurring question the tech staff put to upper management was why we didn't have a mobile app. I remember one meeting in particular where that question came up. A C-level explained, not scornfully but patiently, that many of the people we were trying to help were in situations where they had to choose which utility bill to pay each month and which one to leave unpaid and risk a shutoff; for people in that position, a smartphone app was unlikely to be of much use.
On the other hand, an app (be it native or web) would provide one interface that could be used by blind people, deaf people, mobility-impaired people, etc., as well as everyone else who can afford some kind of computer or access one temporarily.
I think the big change needed to make this possible is deconflating two core ideas: the per-user environment (your editor, tooling, and personal customisation, carried across projects) and the per-project environment (the toolchain and dependencies needed to build and run one particular codebase).

Dev containers provide a good solution to the second of these, but it's really hard to compose them well with per-user customisation that's shared across multiple projects.
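For what it's worth, the split is visible in the spec itself. A minimal devcontainer.json pins the project side (the image and extension ID below are just placeholder examples), while per-user customisation has to arrive through a separate channel, e.g. VS Code's `dotfiles.repository` user setting, which clones a personal dotfiles repo into each container rather than living in the project config:

```json
{
  "name": "example-project",
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "customizations": {
    "vscode": {
      "extensions": ["dbaeumer.vscode-eslint"]
    }
  },
  "postCreateCommand": "npm install"
}
```

Note that everything in this file is shared by every contributor to the project; there's no per-user section, which is exactly the composition gap described above.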
Thin clients have been a fad before, are seemingly becoming one again, and will likely be one again at other points in the future. This is not a portent of doom or of a "war on general-purpose computing".
I don’t think the earlier fad cycle was zero-sum. “General-purpose computing” is much more niche than it was; pretty much everything personal (email, entertainment, word processing) breaks for large numbers of people when they don’t have internet. To the extent enterprise users are insulated from this, it’s more by accident than design: they’re stuck in the past with Exchange’s sync model. The further fast internet penetrates, the less financial sense it makes to prioritize offline working.
This seems like a nasty concept. I would guess there are some old-school people who used to work on mainframes who can enlighten us on how this concept can screw you over.
Personally I would really rather not use VS Code as my editor, but there is no denying the convenience of this approach.
Sometimes you just need a bad example, like the father in Berenstain Bears.
The author has miscategorized the tweet about the “soul-stifling metaverse” so profoundly that I am helpless to do anything other than flag this post. Sorry, user. :(
Seems pretty plausible. As the article notes, it’s currently valuable for high-security environments (enterprise, government, finance) but there’s also a lot of benefit for anyone with a lot of junior developers; getting a well-configured dev environment is very fiddly. Sounds like it’ll show up in schools and bootcamps and become a pretty standard practice from there. There’s a lot of plumbing infrastructure for managing VMs (technically and organizationally) that I think will take a decade or so to settle out, but when a trend like this is valuable to both the very top and very bottom of the market it’s probably going to do well for itself.
The title’s hyperbolic, though. Local dev will never die, in the same way old programming languages rarely do. They shrink, but mostly they get eclipsed in relative terms by how much bigger everything new grows to be.