1. 50
    1. 7

      This is a great article! Looking at it from a security perspective, modern hardware is an absolute dumpster fire: whom do you have to trust for the damn thing to even boot correctly, let alone do anything useful?

      1. 8

        Looking at it from a security perspective, modern food is an absolute dumpster fire: whom do you have to trust for the damn stuff to not kill you, let alone provide any nutritional value?

        1. 3

          Touché. I believe there’s some nuance here (the backlash when bad things happen with food safety is much more severe, and the supply chain is generally pretty well understood), but in essence this could probably be said about any modern $THING. But I reject the fact that it’s a necessary complexity, especially in computers, since we control the thing down to the atoms used to make up the transistors.

          1. 2

            But I reject the fact that it’s a necessary complexity, especially in computers, since we control the thing down to the atoms used to make up the transistors.

            I personally would agree with you.

            However, this becomes necessary complexity when we look at the progress people want in their lives. Inventors of tomorrow begin with today’s tech stack, not yesterday’s, which means we keep assuming that everything present today must be there. Just like the XKCD comic, we don’t ever ask why one tiny rectangular block is needed to hold everything up; we just start at the top and keep going. It works fine when things are fine, but you’re only adding more pieces to troubleshoot when things break.

            The only way I see this ending is when customers realize building on the existing tech stack isn’t a quicker means to progress but an inherent liability, and start paying for simplicity. This isn’t free, though: the customer will be making tradeoffs that I just don’t see our current culture writ large wanting to make (I have to install dependencies and not just docker up?! What is this, 2005?).

            Think of the current trend towards touch screens in cars that was discussed here a while back. It lets car manufacturers produce a common part for more cars (lowering costs) and it looks slicker (getting more people to want it), but it reduces the driver’s ability to interact with the controls without looking (decreasing attention to the road), is far more likely to have issues than a physical knob (which doesn’t need a few thousand lines of code to control my volume), and potentially opens up attack surface. So we can see this pattern of obscuring all the things you’re building on and assuming they work, from the chips up through entire systems.

            1. 4

              Many years ago, I was at a talk by Alan Kay, where he described progress in computing as a process of adding new abstraction layers on top and then collapsing the lower ones into thinner things to support the new tops. I always felt that he was overly (and uncharacteristically) optimistic, since I’ve seen a huge number of cases of people doing the first step of this and very few doing the second.

              1. 2

                Part of the problem is that if you collapse a lower layer, people come complaining because they were using it for something. Look at something like PGP vs. age. Age is neat, but the people who use PGP aren’t going to stop using it just because age exists, since it isn’t exactly the same as PGP, so it doesn’t do quite the same things. Better is different, and different is worse, so better is worse. :-)

              2. 1

                There’s also a good talk by Bryan Cantrill on what this does to a system’s debuggability, and it isn’t great…

                Found it

            2. 2

              Inventors of tomorrow begin with today’s tech stack, not yesterday’s, which means we keep assuming that everything present today must be there.

              It’s not an assumption, it’s a chicken/egg problem: you write for the platform where the users are, not the platform that’s good.

              Suppose you’re writing a commandline application. Commandline programs mostly run in terminal emulators, which are literally emulating a specific piece of 1970s hardware. It’s why the Helix Editor (a vim-like (or rather, Kakoune-like) editor written in Rust starting in ~2021, very modern) can’t detect ctrl-/ and currently requires you to bind ctrl-7 as a workaround: the terminal emits the same keycode for ctrl-/ and ctrl-7, so the editor can’t tell them apart, and a ctrl-7 binding catches ctrl-/ too.
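
              Here’s a minimal sketch of what that looks like from the program’s side (assuming the crossterm crate purely for toggling raw mode; the crate choice is incidental). It just dumps the raw bytes the terminal emulator delivers per keypress; on most emulators both ctrl-/ and ctrl-7 arrive as the single byte 0x1f, so there is nothing left for the application to distinguish:

              ```rust
              // Sketch: print the raw bytes the terminal sends for each keypress.
              // On most terminal emulators, ctrl-/ and ctrl-7 both show up as 0x1f.
              use std::io::Read;

              fn main() {
                  crossterm::terminal::enable_raw_mode().unwrap();
                  let mut stdin = std::io::stdin();
                  let mut buf = [0u8; 16];
                  loop {
                      let n = stdin.read(&mut buf).unwrap();
                      // Raw mode doesn't translate \n to \r\n, so do it by hand.
                      print!("bytes: {:02x?}\r\n", &buf[..n]);
                      if buf[..n].contains(&b'q') {
                          break; // press q to quit
                      }
                  }
                  crossterm::terminal::disable_raw_mode().unwrap();
              }
              ```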

              Anyway, point is that the terminal emulator is garbage. Suppose you want to write a commandline program that’s not reliant on the terminal?

              Well, who’s actually going to use the program? Arcan users? Arcan is awesome, but I don’t know if there are any actual real-world users, and if there are, they’re a niche within a niche.

              But why does everyone use the terminal emulator in the first place? Well, this is sounding awfully like a page on the osdev wiki, but basically it’s because it’s “pragmatic” and serious distros “shouldn’t try to boil the ocean” and such. Or perhaps more cynically, it’s because nobody prioritizes it highly: the point of a platform isn’t to be good, it’s to be available for users to do things they care about.

    2. 4

      There’s a very nice USENIX keynote by Timothy Roscoe on this topic, about how operating systems like Linux really only operate a very small share of the cores in a modern machine: https://www.usenix.org/conference/osdi21/presentation/fri-keynote