1. 51

As the page has started returning a 403, here's an archived version:

https://web.archive.org/web/20210724084924/https://computer.rip/2021-04-03-use-computers-to-store-data.html

    1. 17

      This article has everything: databases, RAD, different visions of computing as a human field of endeavor, criticism of capitalism. I commend it to everyone.

      1. 13

        Criticism of capitalism is, in theory, my main thesis, but I find it difficult to convey in a way that doesn’t get me a lot of angry emails with valid complaints, because the issue is complex and I can’t fully articulate it in a few paragraphs. But it is perhaps my core theory that capitalism has fundamentally diminished the potential of computing, and I hope to express that more in the future.

        1. 3

          But it is perhaps my core theory that capitalism has fundamentally diminished the potential of computing, and I hope to express that more in the future

          I am on a team that is making a documentary about the history of personal computing. One of the themes that has come out is how the kind of computing that went out to consumers from the early 80s on was fundamentally influenced by wider socioeconomic shifts that took place beginning in the 70s (what some call a “neoliberal turn”). These shifts included, but were not limited to, the elevation of shareholder primacy and therefore an increased concentration on quarterly reports and short-termism.

          These properties were antithetical to those that led to what we would say were the disproportionate advances in computing (and proto-personal computing) from the 50s, 60s, and 70s. Up until the 80s, the most influential developments in computing research relied on long-term, low-interference funding – starting with ARPA and ultimately ending with orgs like PARC and Bell Labs. The structures of government and business today, and for the past few decades, are the opposite of this and therefore constitutionally incapable of leading to huge new paradigms.

          One more note about my interviews. The other related theme that has come out is that what today we call “end user programming” systems were actually the goal of a good chunk of that research community. Alan Kay in particular has said that his group wanted to make sure that personal computing “didn’t become like television” (i.e., passive consumption). There were hints of the other route personal computing could have gone throughout the 80s and 90s, some of which are discussed in the article. I’d add things like HyperCard and AppleScript into the mix. Both were allowed to more or less die on the vine and the reasons why seem obvious to me.

          1. 1

            These properties were antithetical to those that led to what we would say were the disproportionate advances in computing (and proto-personal computing) from the 50s, 60s, and 70s. Up until the 80s, the most influential developments in computing research relied on long-term, low-interference funding – starting with ARPA and ultimately ending with orgs like PARC and Bell Labs. The structures of government and business today, and for the past few decades, are the opposite of this and therefore constitutionally incapable of leading to huge new paradigms.

            This is something I’ve been thinking about for a while - most companies are incapable of R&D nowadays; venture-capital-funded startups have taken a lot of that role. But they can only R&D what they can launch rapidly and likely turn into a success story quickly (where success is a monopoly or liquidity event).

        2. 3

          As with so many things. But I think mass computing + networking and our profession have been instrumental in perfecting capitalism.

          Given the values that were already dominating society, I think this was inevitable. This follows from my view that the way out is a society that lives by different values. I think that this links up with our regularly scheduled fights over open source licenses and commercial exploitation, because at least for some people these fights are at core about how to live and practice our craft in the world as it is, while living according to values and systems that are different from our surrounding capitalist system. In other words, how do we live without being exploited employees or exploiting others, and make the world a marginally better place?

        3. 2

          Complaints about the use of the word, and maybe calling you a socialist, or something?

          I wouldn’t do that to you, but I do have to “mental-autocorrect” capitalism into “We may agree software developers need a salary and some SaaS stuff is useful, but social-media-attention-rent-seekers gain power, which sucks, so he means that kind of ‘capitalism’”.

          There should be a word for it, the way “cronyism” is the right word for what some call capitalism, or at least a modifier, like “surveillance capitalism”.

          1. 3

            But I am a socialist. The problem is this: the sincere conviction that capitalist organization of global economies has diminished human potential requires that I make particularly strong and persuasive arguments to that end. It’s not at all easy, and the inherent complexity of economics (and thus of any detailed proposal to change the economic system) is such that it’s very difficult to express these ideas in a way that won’t lead to criticism around points not addressed, or of historic socialist regimes. So is it possible to make these arguments about the history of technology without first presenting a thorough summary of the works of Varoufakis and Wolff or something? I don’t know! That’s why I write a blog about computers and not campaign speeches or something. Maybe I’m just too burned out on the comments I get on the orange website.

            1. 1

              Sure, I appreciate that, though it would maybe attract bad actors less if there were some thread of a synopsis that you could pull on instead of “capitalism”.

              I think the problem is broad terms, because they present a large attack surface, though I do realize people will also attack outside the given area.

              I’m also saddened by a lot of what’s going on in ICT, but I wouldn’t attribute it blindly to “capitalism” – though I don’t have all the vocabulary and summaries, if you will, to defend that position.

              One person’s capitalism is different from another’s anyway, so the definitions must be laid out. Maybe all of Varoufakis isn’t needed every time?

              Nor am I convinced we’ll reach anything better with socialist or other governmental interventions. An occasional good law may be passed, or money handouts that lead to goodness, but each of those will lose in the balance to detrimental handouts/malfeasance, corruption, unintended consequences, and bad laws.

              Maybe some kind of libertarian-leaning world where people have a healthy dose of socialist values but are enlightened enough to practice them voluntarily?

        4. 1

          I would love to see the hoops you jump through to express that. Honestly. It seems so alien to my worldview that anyone making that claim (beyond the silly mindless chants of children, which I’m assuming is not the case here) would be worth reading.

          1. 8

            I’ve made a related argument before, which I think is still reasonably strong, and I’ll repeat the core of it here.

            In my experience, software tends to become lower quality the more things you add to it. With extremely careful design, you can add a new thing without making it worse (‘orthogonal features’), but it’s rare that it pans out that way.

            The profit motive drives substantial design flaws via two key mechanisms: “preventing someone from benefiting without paying for it” (which usually means DRM, or keeping the interesting bits behind a network RPC), and “preventing someone from churning to another provider” (which usually means keeping your data in an undocumented or even obfuscated format, in the event it’s accessible at all).

            DRM is an example of “adding a new thing and lowering quality”. It usually introduces at least one bug (the Sony rootkit fiasco?).

            Network-RPC means that when your network connection is unreliable, your software is also unreliable. Although I currently have a reliable connection, I use software that doesn’t rely on it wherever feasible.

            Anti-churn (deliberately restricting how users use their own data) is why e.g. you can’t back up your data from Google Photos. There used to be an API, but they got rid of it after people started using it.

            I’m not attempting to make the argument that a particular other system would be better. However, every human system has tradeoffs - the idea that capitalism has no negative ones seems ridiculous on the face of it.

            1. 1

              Those are shitty aspects to a lot of things, and those aspects are usually driven by profits, although not always the way people think. I’ll bet dollars to donuts that all the export features Google removes are cut simply because they don’t want to have to support them. There is nothing Google wants less than to talk to customers.

              But without the profit motive in the first place, none of these things would exist at all. The alternatives we’ve thought up and tried so far don’t lead to a world without DRM; they lead to a world where media is split into what the state approves and nobody wants to copy, and what gets you a firing squad for possessing it, whether you paid or not.

              1. 6

                But without the profit motive in the first place, none of these things would exist at all.

                It’s nonsensical to imagine a world lacking the profit motive without having any alternative system of allocation and governance. Nothing stable could exist in such a world. Some of the alternative systems clearly can’t produce software, but we’ve hardly been building software globally for long enough to have a good idea of which ones can, what kinds they can, or how well they can do it (which is a strong argument for the status quo).

                As far as “made without the profit motive” goes, Sci-Hub and the Internet Archive are both pretty neat and useful (occupying different points on the legal-in-most-jurisdictions spectrum). I quite like Firefox, too.

          2. 3

            “Capitalism” is a big thing which makes it difficult to talk about sensibly, and it’s not clear what the alternatives are. That said, many of the important aspects of the internet were developed outside of commercial considerations:

            • DARPA was the US military

            • Unix was a budget sink because AT&T wasn’t allowed to go into computing, so they just shunted extra money there and let the nerds play while they made real money from the phone lines

            • WWW was created at CERN by a guy with grant money

            • Linux is OSS etc.

            A lot of people got rich from the internet, but the internet per se wasn’t really a capitalist success story. At best, it’s about the success of the mixed economy with the government sponsoring R&D.

            On the hardware side, capitalism does much better (although the transistor was another AT&T thing and NASA probably helped jumpstart integrated circuits). I think the first major breakthrough in software that you can really peg to capitalism is the post-AlphaGo AI boom, which was waiting for the GPU revolution, so it’s kind of a hardware thing at a certain level.

            1. 2

              I still disagree, but man it’s nice to just discuss this sort of thing without the name-calling and/or brigading (or worse) you see on most of the toobs. This sort of thing is pretty rare.

          3. 2

            Obviously not OP, but observe the difference between the growth in the scale and distribution of computing power, and what has been enabled, over the last 40 years.

            Business processes have been computerized and streamlined, entertainment has been computerized, and computerized communications, especially group communications like Lobsters or other social media, have arrived. That’s not nothing, but it’s also nothing that wasn’t imaginable at the start of that 40-year period. We haven’t expanded the computer as a bicycle of the mind - consider simply the lack of widespread use of the power of the computer in your hand to create art. I put that lack of ambition down to the need to intermediate, monetize, and control everything.

            1. 1

              And additionally, the pressure to drive down costs means we have much less blue-sky research and ambition; it also means that things are done to the level where they’re barely acceptable. We see that right now with the security situation: everything is about playing whack-a-mole quicker than the hackers, rather than investing in either comprehensive ahead-of-time security practices or in software that is secure by construction (whatever that would look like).

        5. 1

          What I used to tell them is that it’s basically a theory that says each person should be as selfish as possible: always trying to squeeze more out of others (almost always money/power), give less to others (minimize costs), and put as many problems on them as possible (externalities).

          The first directly leads to all kinds of evil, damaging behavior. There’s any number of schemes like rip-offs, overcharging, lock-in, cartels, disposable over repairable, etc. These are normal, rather than the exception.

          The second does so every time cost-cutting pressure forces the company to damage others. I cite examples with food, medicine, safety, false career promises, etc. They also give less to stakeholders, where fewer people get rewarded, and get rewarded less, for the work put in. You can also contrast this with utilitarian companies like Publix, which gives employees benefits and private stock while the owners still got rich. Or companies that didn’t immediately lay off workers during recessions. An easy one most can relate to is bosses, especially executives, paid a fortune to do almost nothing for the company vs. the workers.

          Externalities affect us daily. They’re often a side effect of the other two. Toxic byproducts of industrial processes are a well-known one. The pervasive insecurity of computers, from data loss to crippling DDoSes to infrastructure at risk, is almost always an externality, since the damage is someone else’s problem but preventing it would be the supplier’s. You see how apathy is built in when the solution is permissively licensed, open-source, well-maintained software and they still don’t replace vulnerable software with it.

          Note: Another angle, using the game of Monopoly, was how first movers or just lucky folks got an irreversible, predatory advantage over others. Arguing to break that up is a little harder, though.

          So, I used to focus on those points, illustrate alternative corporate/government models that do better, and suggest using/tweaking everything that already worked. Meanwhile, counter the abuse at the consumer level by voting with your wallet, by sensible regulations anywhere capitalist incentives keep causing damage, and by hitting them in court with damages bigger than prevention would have cost. Also, if going to court, I recommend showing how easy or inexpensive prevention actually was, and asking the court to basically order it. Ask them to define the reasonable, professional standard as not harming stakeholders in as many cases as possible.

          Note: Before anyone asks, I don’t have the lists of examples anymore, or they’re just inaccessible. A side effect of damaged memory is I gotta keep using it or I lose it.

    2. 17

      It’s pretty absurd that you need to hire a programmer to develop a simple CRUD application.

      In college, they tasked us with developing a backroom management solution for a forestry college. They were using Excel (not even Access!). One day, the instructor told us we weren’t the first - we were the second, maybe even third attempt at getting programmers to develop a solution for them. I suspect they’re still using Excel. Made me realize that maybe letting people develop their own solutions is a better and less paternalistic option if it works for them.

      Related: I also wonder if tools like Dreamweaver or FrontPage were actually bad, or if they were considered a threat to low-tier web developers who develop sites for like, county fairs…

      1. 13

        Made me realize that maybe letting people develop their own solutions is a better and less paternalistic option if it works for them.

        There’s also a related problem that lots of people in our field underestimate: domain expertise. The key to writing a good backroom management solution for a forestry college is knowing how a forestry college runs.

        Knowing how it runs will help you write a good management solution, even if all you’ve got is Excel. Knowing everything there is to know about the proper way to do operator overloading in C++ won’t help you one bit with that. Obsessing about the details of handling inventory handouts right will make your program better; obsessing about non-type template parameters being auto because that’s the right way to do it in C++17 will be as useful as a hangover.

        That explains a great deal about the popularity of tools like Excel, or Access, or – back in the day – Visual Basic, or Python. It takes far less time for someone who understands how forestry colleges run to figure out how to use Excel than it takes to teach self-absorbed programmers about how forestry colleges run, and about what’s important in a program and what isn’t.

        It also doesn’t help that industry hiring practices tend to optimise for things other than how quickly you catch up on business logic. It blows my mind how many shops out there copycat Google and don’t hire young people with nursing and finance experience because they can’t do some stupid whiteboard puzzles, when they got customers in the finance and healthcare industry. If you’re doing CRM integration for the healthcare industry, one kid who worked six months in a hospital reception and can look up compile errors on Google can do more for your bottom line than ten wizkids who can recite a quicksort implementation from memory if you wake them up at 3 AM.

        Speaking of Visual Basic:

        I also wonder if tools like Dreamweaver or FrontPage were actually bad, or if they were considered a threat to low-tier web developers who develop sites for like, county fairs…

        For all its flaws in terms of portability, hosting, and output quality, FrontPage was once the thing that made the difference between having a web presence and not having one, for a lot of small businesses that did not have the money or the technical expertise to navigate the contracting of development and hosting a web page in the middle of the Dotcom bubble. That alone made it an excellent tool, in a very different technical landscape from today (far less cross-browser portability, CSS was an even hotter pile of dung than it is today and so on and so forth).

        Dreamweaver belonged in a sort of different league. I actually knew some professional designers who used it – the WYSIWYG mode was pretty cool for beginners, but the way I remember it, it was a pretty good tool all around. It became less relevant because the way people built websites changed.

        1. 4

          It also doesn’t help that industry hiring practices tend to optimise for things other than how quickly you catch up on business logic. It blows my mind how many shops out there copycat Google and don’t hire young people with nursing and finance experience because they can’t do some stupid whiteboard puzzles, when they got customers in the finance and healthcare industry. If you’re doing CRM integration for the healthcare industry, one kid who worked six months in a hospital reception and can look up compile errors on Google can do more for your bottom line than ten wizkids who can recite a quicksort implementation from memory if you wake them up at 3 AM.

          I’ve been meaning to write about my experiences in community college (it’s quite far removed from the average CS uni experience of a typical HN reader; my feelings about it are complex), but to contextualize:

          • Business analysts were expected to actually talk to the clients and refine the unquantifiable “we want this” into FR/NFRs for the programmers to implement.

          • Despite this, programmers weren’t expected to be unsociable bugmen in a basement who crank out code; they were also expected to understand and refine requirements, and even talk to the clients themselves. That said, I didn’t see much action in this regard; we used the BAs as a proxy most of the time. They did their best.

          1. 3

            I’m pretty torn on the matter of the BA + developer structure, too (which has somewhat of a history on this side of the pond as well, albeit through a whole different series of historical accidents).

            I mean, on the one hand it kind of makes sense on paper, and it has a certain “mathematical” appeal: that one would be able to distill the essence of some complex business process into a purely mathematical, axiomatic model that you can implement simply in terms of logical and mathematical statements.

            At the same time, it’s very idealistic, and my limited experience in another field of engineering (electrical engineering) mostly tells me that this is not something worth pursuing.

            For example, there is an expectation that an EE who’s working on a water pumping installation does have a basic understanding of how pumps work, how a pumping station operates and so on. Certainly not enough to make any kind of innovation on the pumping side of things, but enough to be able to design an electrical installation to power a pump. While it would technically be possible to get an “engineering analyst” to talk to the mechanical guys and turn their needs into requirements on the electrical side, the best-case scenario in this approach is that you get a highly bureaucratic team that basically designs two separate systems and needs like twenty revisions to get them hooked up to each other without at least one of them blowing up. In practice, it’s just a lot more expedient to teach people on both sides just enough about each others’ profession to get what the other guys are saying and refine specs together.

            Obviously, you can’t just blindly apply this – you can’t put everything, from geography to mechanical engineering and from electrophysiology to particle physics, into a CS curriculum, because you never know when your students are gonna need to work on GIS software, SCADA systems, medical devices or nuclear reactor control systems.

            But it is a little ridiculous that, 80 years after the Z3, you need specially-trained computer programmers not just in order to push the field of computing forward (which is to be expected after all, it’s computer engineers that push computers forward, just like it’s electrical engineers who push electrical engines forward), but also to do even the most basic GIS integration work, for example. Or, as you said, to write a CRUD app. This isn’t even close to what people had in mind for computers 60 years ago. I’m convinced that, if someone from Edsger Dijkstra’s generation, or Dijkstra himself were to rise from the grave, he wouldn’t necessarily be very happy about how computer science has progressed in the last twenty years, but he’d be really disappointed with what the computer industry has been doing.

            1. 2

              I mean, the biggest reason why Salesforce is such a big deal is that you don’t need a programmer to get a CRUD app. They have templates covering nearly every business you could get into.

              1. 1

                Their mascot literally used to be a guy whose entire body was the word “SOFTWARE” in a crossed-out red circle: https://www.gearscrm.com/wp-content/uploads/2019/01/Saasy1.jpg

        2. 2

          FWIW I was neck deep in all of that back in the day. Nobody I knew looked down on Dreamweaver with any great enthusiasm, we viewed it as a specialised text editor that came with extra webby tools and a few quirks we didn’t like. And the problem with FrontPage was never that it let noobs make web pages, just the absolute garbage output it generated that we would then have to work with.

          1. 4

            just the absolute garbage output it generated that we would then have to work with.

            Oh, yeah, the code it generated was atrocious, but the point was you never had to touch it. That was obviously never going to work for serious web design work, but not everyone needed or, for that matter, wanted any of that. FrontPage was remarkably popular at the university I went to for precisely this reason. Nobody in the EE department knew or cared to learn HTML, they just wanted something that made it easy to hyperlink things. Something that they could use sort of like they used Microsoft Word was even better.

            Nobody I knew looked down on Dreamweaver with any great enthusiasm, we viewed it as a specialised text editor that came with extra webby tools and a few quirks we didn’t like.

            I was definitely not neck-deep in it at the time Dreamweaver was being popular-ish so there’s not much I can add to that, other than that I think this was kind of the vibe I’d pick up from anyone who already knew a “serious” text editor. The guy who first showed me emacs would’ve probably said more or less the same thing. I suppose if all you’d seen before was Notepad, it would be easy to get hooked on it – otherwise there wasn’t much point to it.

            That being said, there were a bunch of serious web shops that were using it in my area. I’d seen them around the turn of the century and it popped up in job ads every once in a while. Later, I started to sort of work for a computer magazine, and my colleagues who handled advertising knew that Macromedia had a small (and dwindling, but this was already 2002 or so…) customer base for Dreamweaver around here.

      2. 3

        re: “related” — hmm, these days services like Squarespace and Wix are not really considered bad, and it’s not uncommon for a web developer to say to a client they don’t want to work with: “your project is too simple for me, just do it yourself on Squarespace”. I wonder what changed. The tools have, for sure — these new service ones are more structured, more “blog engine” than “visual HTML editor”, but they still do have lots and lots of visual customization. But there must be something else?

        1. 3

          I have found that things like Wix and Squarespace (or WordPress) don’t scale very well. They work fine for a few pages that are manageable, but when you want to do more complex or repetitive things (generate a set of pages with minor differences in text or theme), they obstruct the user and cost a lot of time. A programmatic approach would then be a lot better, given that the domain is well mapped out.
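
          To make “a programmatic approach” concrete, here’s a minimal sketch in plain Python (standard library only); the page list and field names are made up for illustration, but the point is that the repetitive part becomes a loop over data rather than a pile of clicks:

              import pathlib
              from string import Template

              # Hypothetical page definitions; in practice these might live in a CSV or a small database.
              PAGES = [
                  {"slug": "oak", "title": "Oak", "theme": "green", "body": "Notes on oak stands."},
                  {"slug": "birch", "title": "Birch", "theme": "silver", "body": "Notes on birch stands."},
                  {"slug": "pine", "title": "Pine", "theme": "amber", "body": "Notes on pine stands."},
              ]

              # One shared template; each page differs only in the substituted fields.
              PAGE = Template("""<!doctype html>
              <html>
                <head><title>$title</title><link rel="stylesheet" href="$theme.css"></head>
                <body><h1>$title</h1><p>$body</p></body>
              </html>
              """)

              out = pathlib.Path("site")
              out.mkdir(exist_ok=True)
              for page in PAGES:
                  # The "minor differences" live in the data, not in hand-edited pages.
                  (out / (page["slug"] + ".html")).write_text(PAGE.substitute(page))

          Adding a tenth or a hundredth page is one more dict in the list, which is exactly the kind of change the point-and-click builders make painful.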

    3. 10

      my sister just asked me earlier this week what she should replace her homegrown (inherited) Access CRM with … I suppose I should send her a link to this article 😉

    4. 10

      An interesting factor in the demise of the desktop database that wasn’t really mentioned in the otherwise excellent article:

      If the premise of Access and the like is that you can port your existing (paper-based) business processes and records to it, well… the world has kinda run out of those things for the most part. We’ve run out of old organizations to computerize. New organizations are gonna start out by, of course, looking for more pre-made tools rather than the more freeform ones. They don’t have processes yet, so they look for pre-made tools to discover which processes are common, which ones suit their needs. And I suppose it’s quite rare that they would end up in the spot where these tools don’t have enough customization already but it’s way too early for fully custom development.

      1. 7

        This is a great thought and I think it is an important part of the puzzle. My father, who worked in corporate accounting, once made an observation something like this: early in computerization, the focus was on using computers to facilitate the existing process. Later on, roughly around the consolidation of the major ERPs (Oracle and SAP), it became the opposite: businesses adjusted their processes to match the computer system. You can either put a positive spin on this (standardization onto established best practices) or a negative spin (giving in to what the software demands). It’s also not completely true, as products like Oracle and SAP demonstrate with their massive flexibility.

        But I think there’s an important kernel of truth: there’s not a lot of talk of “computerize the process” these days. Instead, it’s usually framed as “let’s get a new process based on a computer.” That means there’s fundamentally less need for in-house applications. I don’t think it’s clearly a bad thing either, but it definitely has some downsides.

    5. 8

      To be somewhat cynical (not that that is new), the goal of the ’10s and ’20s is monetization and engagement. Successful software today must be prescriptive, rather than general, in order to direct users to the behaviors which are most readily converted into a commercial advantage for the developer.

      I agree with the article, but just to play devil’s advocate, I think the death of these halfway “it’s not programming but they’re still flexible building blocks” attempts at democratized personal computing comes down to two realities:

      1. Normal people don’t want to program, not even a little bit, because it tends to get complicated fast. GUIs are more intuitive and picking one up for its intended task is usually very fast. UX research has come a long way.
      2. Programming is actually way more popular than ever, and when people do pick it up, they tend to want to go all the way, using Python, or even “enterprise” languages. And if you do just pick up Python and learn how to start doing the things you want to use it for, the world has only gotten friendlier: cheap cloud hosting, Heroku, a golden era of open source software, Raspberry Pis, and so on.

      1. 8

        There’s also a lot more consumer computing too. People who bought computers in the 80’s were likelier to use them for business stuff, even if the business was a small one. My parents bought a computer for the first time in ~1997 solely for the internet and CD-ROMs like Encarta or games. They’d have no use for databases, and wrote with a pen instead of a word processor.

        Also, as a counterpoint: as much as SaaS is reviled, it does deliver in absolving you of many responsibilities like maintenance, providing a ready-made service, and being a service instead of a liability for taxation purposes.

    6. 4

      I get a 403 Forbidden.

      1. 7

        Me too now. As they themselves say:

        As a Professional DevOps Engineer I am naturally incredibly haphazard about how I run my personal projects.

        😁

      2. 6

        Sorry, I think my Private Cloud has a bad power supply which is having a knock-on effect of upsetting the NFS mounts on the webserver. I’m acquiring a replacement right now, and in the meantime I am going to Infrastructure-as-Code it by adding a cronjob that fixes the NFS mount.

      3. 1

        Me too.

        You can use https://archive.md/lFfIn

    7. 4

      I’m not sure it was a conspiracy to keep the money rolling in; I think it all just turned out to be a lot harder than we thought. There’s a gap from “one thing on my desktop that only I use and maintain” to “something multiple users can access remotely”, and that gap is bigger than it appears to people on either side.

      Another angle to take is that even if the author is correct, and we have less desktop software because users refuse to pay more than once to have somebody else update their platform, it’s not a conspiracy against the users, it’s the users themselves that made the choice.

      If you want updates, if you want bugfixes, if you want the software patched when your OS vendor deprecates an API, somebody has to do that work. If they’re not doing it out of love, which is fine, you have to pay them somehow. OSS has its place, but it can never be all software.

      Just the same way we can have both home cooking for loved ones and commercial fine dining, you need to remember that McDonald’s is also still a thing.

      1. 10

        I’m not sure it was a conspiracy

        You don’t need malicious intent to explain the effects described in the article; emergent behavior causes it given the necessary pathological incentive structures.

    8. 3

      The general form of Conway’s Law is that an organization’s form determines what software it can produce.

      What form of organization will produce the user-empowering tools we desire?

    9. 2

      While I find this well written and interesting, I don’t agree with this article at all. There are plenty of no-code solutions today to accomplish the same goals that databases served in the past. Additionally, developing software is more accessible than ever - learning to build a web app is no more difficult than building a form in Access back in the day (though you don’t even need that - just use Google Forms or something similar). Finally, the move to subscription-based models is just reflective of the fact that people need their app run on someone else’s computer, and that has an ongoing cost associated with it. Practically, that’s less expensive and more convenient than older solutions. I think this thesis is just nostalgia.
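
      To put some weight behind the “no more difficult than building a form in Access” claim, here’s a minimal sketch of a one-table, one-form web app; Flask and SQLite are just illustrative choices, and the table and field names are made up:

          # A sketch, not production code: one table, one form, one listing.
          import sqlite3
          from flask import Flask, request, redirect

          app = Flask(__name__)
          DB = "inventory.db"

          def db():
              # Create the single table on demand; roughly an Access table's worth of schema.
              conn = sqlite3.connect(DB)
              conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT, qty INTEGER)")
              return conn

          @app.route("/", methods=["GET", "POST"])
          def index():
              conn = db()
              if request.method == "POST":
                  # The "form" half: insert a row from the submitted fields.
                  conn.execute("INSERT INTO items (name, qty) VALUES (?, ?)",
                               (request.form["name"], int(request.form["qty"])))
                  conn.commit()
                  return redirect("/")
              # The "report" half: list whatever is in the table.
              rows = conn.execute("SELECT name, qty FROM items").fetchall()
              listing = "".join(f"<li>{name}: {qty}</li>" for name, qty in rows)
              return (f"<ul>{listing}</ul>"
                      '<form method="post">'
                      '<input name="name" placeholder="name">'
                      '<input name="qty" type="number" value="1">'
                      '<button>Add</button></form>')

      Save it as app.py, run flask run, and you have a data-entry form plus a listing – about the same amount of machinery as an Access form bound to a table.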

      1. 5

        Finally, the move to subscription-based models is just reflective of the fact that people need their app run on someone else’s computer

        Do they? Or do the apps have some not-really-business-essential “cloud” feature to justify that upkeep? Seems like we managed okay before.

        Overall I think it’s about right. There have been some genuine improvements but overall, compared to a quarter century ago, we’re using hardware that’s 100x as powerful to run apps that are far slower, less reliable, and missing basic features that they used to have.

      2. 3

        I think no-code is starting to have a comeback with Airtable and the like, but it certainly felt like there was a dry spell at the very least.