1. 13

    I have other things going on in the pixel mines, but there are a couple of parts of this that I don’t think illustrate the points the author wants to make.

    > But this criticism largely misses the point. It might be nice to have very small and simple utilities, but once you’ve released them to the public, they’ve become system boundaries and now you can’t change them in backwards-incompatible ways.

    This is not an argument for making larger tools: is it better to have large, weird, complicated system boundaries you can’t change, or small ones you can’t change?

    > While Plan 9 can claim some kind of ideological purity because it used a /net file system to expose the network to applications, we’re perfectly capable of accomplishing some of the same things with netcat on any POSIX system today. It’s not as critical to making the shell useful.

    This is a gross oversimplification and glossing over of what Plan 9 enabled. It wasn’t mere “ideological purity”, but a comprehensive philosophy that enabled an environment with neat tricks.

    The author might as well have said something similar about the “ideological purity of using virtual memory”, since some of the same things can be accomplished with cooperative multitasking!
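
    To make the contrast concrete, here is a rough sketch (the host and port are placeholders, and the Plan 9 half is illustrative, from memory):

    ```sh
    # POSIX: a helper program (netcat) bridges the shell and the network
    printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' | nc example.com 80

    # Plan 9 (sketch): the network itself is a file tree any tool can open.
    # Reading /net/tcp/clone allocates a connection directory N, writing its
    # ctl file dials out, and its data file is the byte stream:
    #   echo 'connect 93.184.216.34!80' > /net/tcp/N/ctl
    #   cat /net/tcp/N/data
    ```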

    1. 4

      > This is a gross oversimplification and glossing over of what Plan 9 enabled. It wasn’t mere “ideological purity”, but a comprehensive philosophy that enabled an environment with neat tricks.

      Not only tricks, but a whole concept of how resources can be used: use the file storage on one system, the input/output (screen, mouse, etc.) of another, and run the programs somewhere with a strong CPU, all by composing filesystems. Meanwhile, in 2018, we are stuck with ugly hacks and different protocols for everything, trying to fix problems by adding another layer on top of things (e.g. PulseAudio on top of ALSA).
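
      A sketch of what that composition looks like (the hostnames here are made up; import, bind, and cpu are the actual Plan 9 commands):

      ```sh
      # borrow the file storage of one machine by mounting its tree locally
      import fileserver / /n/fs
      bind /n/fs/usr/alice /usr/alice

      # then run programs on another machine's strong CPU, carrying the
      # composed namespace (and this terminal's screen and input) along
      cpu -h crunchbox
      ```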

      And, from the article:

      > And as a filesystem, you start having issues if you need to make atomic, transactional changes to multiple files at once. Good luck.

      That’s an issue with the design of the concrete filesystem, not with the filesystem abstraction. You could write settings to a bunch of files that sit together in a directory and commit them all with a single write to a control file.
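
      A sketch of how that could look from the shell (the settings filesystem and its ctl file are hypothetical):

      ```sh
      # stage the new values as ordinary files in one directory
      echo 30 > /mnt/settings/new/timeout
      echo 5 > /mnt/settings/new/retries

      # a single write to a control file commits both changes atomically
      echo commit > /mnt/settings/ctl
      ```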

      > Going beyond text streams

      PowerShell is a good solution, but the problem we have with pipelines on current Unix-style systems isn’t that the data is text; it’s that the text is ill-formatted. Many tools return some cute markup, which makes the output more difficult to parse than necessary.
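
      The classic example of the ill-formatted-text problem (a sketch; any directory containing a filename with a space will do):

      ```sh
      # column-splitting ls -l output is fragile: 'my file.txt'
      # gets truncated to 'my' by the whitespace field split
      ls -l | awk 'NR > 1 { print $9 }'

      # a tool that emits one plain record per line composes better
      find . -maxdepth 1 -type f
      ```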

      1. 3

        Actually, Unix proposed the file as a universal interface before Plan 9 was a dream. The issue was that temporary convenience and the hope that “worse is better” put Unix in a local minimum where that interface was not universal at all (sockets, ioctl, fcntl, signals…). Pike tried to escape that minimum with Plan 9, where almost every kernel and user service is provided as a filesystem and you can stack filesystems like you compose pipes in Unix.
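
        bash’s /dev/tcp redirection is a nice illustration of how bolted-on the escape hatches are (a bash-only sketch; the path is interpreted by the shell, not the kernel, so no other program can open it):

        ```sh
        # looks like a file, but bash special-cases it in redirections only
        exec 3<>/dev/tcp/example.com/80
        printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' >&3
        cat <&3
        ```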

        1. 10

          Put a quarter in the Plan9 file-vs-filesystem “well actually” jar ;)

      2. 7

        > Nobody should ever really be criticizing tar zxvf for doing what everyone reaches for tar to do most of the time: decompress a tar.gz file. Taking the most common case that people have to deal with and requiring them to reach for a more complicated combination of tools, instead of just directly addressing the problem, isn’t good design either.

        Actually, I think that forcing people to think might be the moral thing to do.

        Learning how to compose small programs to solve your own problems is a useful skill, one that goes beyond computers.
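
        For the tar case, the composed version is short enough to be worth learning (archive.tar.gz is a placeholder):

        ```sh
        # one small tool decompresses, another unpacks the archive
        gunzip -c archive.tar.gz | tar xvf -
        ```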

        I was raised as a web developer, reading “Don’t Make Me Think” a couple of times, and I’m all for accessibility (in the original meaning), usability, and software ergonomics. But I recently realised that people need to understand computers much more deeply to reduce their dependency on big corporations that use them as laboratory mice.

        So software should be ergonomic, should feel simple and intuitive (after reading the manual), and there should always be one obvious way to compose tools to perform a task.

        But good software should also improve its users, ideally turning them into hackers.

        Good software should foster creativity, curiosity and deep thinking.

        1. -1

          It is immoral to force people to do things.

          1. 4

            You mean… like in elementary school?

            If forcing people to think is immoral, what about tools that reduce their ability to think?

        2. 3

          The author’s views on pragmatism (do one thing well, but maybe do these other related things too) are a good reflection of the “real” state of Unix. The Unix “philosophy” is not so much the guiding thought process that defined the creation of Unix as a collection of rationalisations to explain the things that were created, and every description of the philosophy has to silently ignore, or apologise for, the outliers. Yes, it’s impure that tar has compression support; it’s also odd that it works differently from ar when both are archive tools. Why does find not look like a Unix tool? Why does X not look like a Unix tool (it was a port)? Why does ls have so many options for tabulating and sorting output? You hand-wave past them or ignore them. Here the answer is the one that explains the rest of Unix’s success: worse is better.

          1. 0

            Yes. The whole “Unix philosophy” is post-hoc rationalization.

          2. 1

            There is a typo in the title (decontructing -> deconstructing).

            1. 2

              it’s in the article too, so I’d keep it that way here