
  1. 15

    Doug McIlroy

    I’m gonna name-drop here. The high point of my entire open source career was Doug McIlroy emailing me to tell me he liked what I’d done with the sam text editor. I fanboyed out and, I’m sure, made a fool of myself in the response email.

    1. 6

      What did you do with sam?

      Edit: Oh, right. Pretty awesome! http://www.deadpixi.com/an-updated-version-of-sam

      1. 3

        Thank you. It’s languished recently. Work has been taking an enormous amount of time (we went from “stealth startup mode” to “actual company” mode) and I’ve, you know, had another kid since I started working on it.

        I move it forward briefly here and there but the terminal side of things is getting harder and harder to maintain as the world moves on (it doesn’t honor any modern desktop standards, has a weird way of dealing with the clipboard, relies on Xt but not in a normal way, etc).

        In the background I’m slowly rewriting the terminal side of things in a more modern way. The host part of sam is pretty much static, though I’ve made some modifications there (switching to a much simpler gap buffer implementation, since modern OSes have trustworthy virtual memory implementations). The host part is also getting an automatically compacting journal that works better in some edge cases (again possible due to better OS support these days).
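
        For anyone who hasn’t run into the data structure: a gap buffer keeps its free space at the cursor, so consecutive inserts are cheap and only moving the cursor copies text. A minimal sketch of the general idea (illustrative Python, nothing like sam’s actual C implementation):

        class GapBuffer:
            """Text with a movable gap of free space at the cursor."""
            def __init__(self, initial="", gap=16):
                self.buf = list(initial) + [None] * gap
                self.gap_start = len(initial)   # first free slot
                self.gap_end = len(self.buf)    # one past the last free slot

            def move_gap(self, pos):
                while self.gap_start > pos:     # slide the gap left
                    self.gap_start -= 1
                    self.gap_end -= 1
                    self.buf[self.gap_end] = self.buf[self.gap_start]
                while self.gap_start < pos:     # slide the gap right
                    self.buf[self.gap_start] = self.buf[self.gap_end]
                    self.gap_start += 1
                    self.gap_end += 1

            def insert(self, pos, text):
                self.move_gap(pos)
                for ch in text:
                    if self.gap_start == self.gap_end:   # gap exhausted: grow it
                        self.buf[self.gap_end:self.gap_end] = [None] * 16
                        self.gap_end += 16
                    self.buf[self.gap_start] = ch
                    self.gap_start += 1

            def text(self):
                return "".join(self.buf[:self.gap_start] + self.buf[self.gap_end:])

        b = GapBuffer("hello")
        b.insert(5, ", world")
        print(b.text())   # hello, world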

      2. 3

        First, @lorddimwit, that’s pretty close to getting a hex check from Knuth. If the guy who invented Unix pipes likes the changes you made to a text editor that was built on the premise that other editors weren’t using enough of the other Unix tools, then you deserve to do the occasional name-drop.

        Secondly, I’m not sure how I missed your version of sam. I’m looking forward to checking it out!

      3. 10

        Expect the output of every program to become the input to another, as yet unknown, program.

        This is one reason I really liked using PowerShell on Windows. Having outputs be objects made it so much easier to slice up and pipe around data.

        1. 2

          Agreed, and there’s no reason we can’t have object pipelines on UNIX as well. Explaining why this would be a desirable thing to UNIX people is hard, though, as they’re just like “Yeah, but EVERYTHING IS A STREAM OF BYTES!” and don’t actually open themselves up to a new idea.

          Now that MS has official PowerShell ports for Linux I’ve been meaning to play around and see if I could write my own programs that speak object pipeline. I’d think you could, and it’d almost be neat to write wrappers around the standard UNIX utils to parse their output into objects for easier consumption.
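
          Something like this rough sketch is what I have in mind (the field layout assumes typical GNU/BSD ls -l output, and JSON-lines is just one possible framing):

          import json
          import subprocess

          def ls_objects(path="."):
              """Run ls -l and yield one dict per entry."""
              out = subprocess.run(["ls", "-l", path],
                                   capture_output=True, text=True, check=True)
              for line in out.stdout.splitlines():
                  fields = line.split(maxsplit=8)
                  if len(fields) < 9:              # skip the "total ..." line
                      continue
                  yield {"mode": fields[0], "links": int(fields[1]),
                         "owner": fields[2], "group": fields[3],
                         "size": int(fields[4]), "name": fields[8]}

          # Emit one JSON object per line so the next stage parses
          # records instead of guessing at whitespace.
          for obj in ls_objects():
              print(json.dumps(obj))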

          1. 4

            Objects are just bytes too. ‘Struct pipelines’ might convey the idea better.
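
            For instance, a stage in a “struct pipeline” might exchange fixed-format binary records over stdin/stdout. This record layout is invented purely to illustrate:

            import struct
            import sys

            # One record: little-endian int64 size plus a 32-byte name
            # (struct pads or truncates the name with NULs).
            RECORD = struct.Struct("<q32s")

            def write_record(size, name):
                sys.stdout.buffer.write(RECORD.pack(size, name.encode()[:32]))

            def read_records(stream):
                while True:
                    chunk = stream.read(RECORD.size)
                    if len(chunk) < RECORD.size:     # EOF (or truncated record)
                        return
                    size, raw = RECORD.unpack(chunk)
                    yield size, raw.rstrip(b"\0").decode()

            Downstream stages then read whole records instead of guessing at field boundaries in a text stream.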

            1. 4

              AFAIK, VM/CMS pipes seem to be the only system to have handled this in a more generalized way:

              • they require explicit steps/programs in the pipeline to handle I/O
              • they operate only on logical records.
              1. 3

                I was gonna be kind of dismissive of this mechanism until I read that REXX could manipulate the pipelines and be used as glue to string together, in interesting ways, random programs that aren’t pipeline-aware.

                I love REXX. Loved it since I used it on the Amiga :)

                Also, holy crap, there’s an IBM System/370 emulator out in the world now :) We are living in the future, people :) We should all set up a clone of BITNET :) http://www.hercules-390.org/

              2. 3

                My first job out of grad school was at a Windows shop, and all the scripting on the application infrastructure side for testing was PowerShell. The reason they get away with piping actual objects is that they use the .NET runtime. I’m not sure how this would work in a basic POSIX environment. You’d need, in essence, to recreate the parts of the .NET runtime that PowerShell uses to pipe real objects instead of streams of bytes.

                Note: even in pure PowerShell there are cases where a cmdlet doesn’t understand the types on the left side of the pipe, and in those cases the object is serialized to a byte stream just like in Bash and other shells.

                1. 2

                  Actually, why recreate it? I thought .NET Core had been released as open source and now runs on Linux as well?

                  1. 2

                    It does, but you will have to interoperate with the outside environment, which still deals in byte streams. So a more useful way to phrase the potential hardship I see with your approach is “extending” .NET to cover enough of the external environment that the types line up, so that everything in a pipeline is dealing with objects instead of byte streams. I think this is doable, but it probably requires writing a lot of serializers/deserializers to marshal everything across the .NET boundary.

                    1. 2

                      Right. I’m less interested in extending the .NET environment and more interested in seeing how native POSIX programs could be written to communicate using object pipelines.

                2. 3

                  The Elvish shell allows pipes to carry structured data.

                  1. 1

                    Very cool! It would be interesting to compare PowerShell’s object pipeline and its data-structure support with what the Elvish shell offers.

                3. 1

                  It’s interesting to see that Postel’s Law is a corollary.

                4. 7

                  What I love about Unix is the philosophy of “do one thing well” and “expect the output of every program to become the input to another”.

                  This is mostly a myth. Anyone know how many flags and options find and friends have? I agree piping is a nice way to compose tools, but the philosophy of “do one thing well” was always a lie. I understand why the myth is propagated, because it’s a good enough zeroth-order approximation, but we do newcomers a disservice, because articles like this hide the dissonance inherent in Unix systems.

                  Unix is just one system that survived, and there is no reason to elevate it above other, arguably better systems that didn’t survive.

                  1. 9

                    This is mostly a myth. Anyone know how many flags and options find and friends have?

                    If you take a pattern to its extreme it becomes an anti-pattern. The node ecosystem, for example, has taken “do one thing well” and turned it into a dependency-hierarchy nightmare. The implied pattern is more “as small a feature set as possible, but no smaller”.

                    Unix is just one system that survived, and there is no reason to elevate it above other, arguably better systems that didn’t survive.

                    While many factors play a part in something surviving, survival is certainly a useful heuristic for usefulness. It may not have been the Best, but it was certainly Good Enough.

                    1. 2

                      That’s a good point about node.js. The issue I have with software in general is that it follows evolutionary principles even though it is an artificial construct. It reminds me of the giraffe’s recurrent laryngeal nerve, which detours several meters down the neck and back up because evolution could only stretch the ancestral layout, never reroute it. Unix is the laryngeal nerve of software.

                      1. 2

                        I’d argue that evolutionary(ish) thinking can apply to artificial constructs. See the Lindy effect.

                        1. 3

                          Yeah, I think the Lindy effect is applicable to Unix, as well as the shell specifically. I used this in my recent presentation about the Oil shell:

                          http://www.oilshell.org/blog/2019/01/18.html#toc_4

                          1. 2

                            I agree it can apply, but that also means software tends to overfit incidental details, the same way biological organisms optimize for their near-term environment instead of optimizing for potential far-off contingencies like global warming.

                            In the case of Unix, it overfits the incidental details of the hardware and the hardware’s performance profile. More forward thinking would probably have given us micro/uni-kernels, but they weren’t performant at the time, so we are stuck with Unix.

                            I’d say hardware and software are still too brittle, and I don’t think that in 100 years folks will still be working with Unix, but they’ll certainly be working with math and logic. So Lindy applies to math, but I don’t think it applies to software, because software is too much like biology and not enough like math.

                            1. 2

                              Interesting. I think Lindy applies somewhat to software, but more to general concepts than individual implementations. The concept of “a car” will likely be around in some form in 100 years; a specific instance of a car is less likely to. CRUD systems are likely to exist in 50 years; this CRUD system I’m working on right now hopefully won’t.

                              I agree about the overfit problem, although sometimes optimising for the current environment is required to survive at all. In hindsight it’s easy to see that there were better options, but that’s only because we know the outcome. I wonder if constant local maximisation is the only way to robustly confront an unknowable future, for organisms and technology selection.

                              1. 2

                                I don’t know, but Jeremy Kun has a really good article about the multiplicative weights update algorithm, and in the article there is a link to a talk by Christos Papadimitriou about evolutionary optimization: https://jeremykun.com/tag/multiplicative-weights-update-algorithm/. Both the article and the talk are really good, so I recommend them to anyone who wants to explore that stuff further.

                                1. 2

                                  Awesome, I’ve added that to my reading list. Thanks

                      2. 4

                        Anyone know how many flags and options find and friends have?

                        find is one of those crazy commands that has to make stat()-family calls on every file it encounters in order to handle all of its flags. My guess is that find exists as it does because splitting it into commands like fbymtime and whatever else would thrash the file system even more than find already does, with little path to optimizing it with user-space caches and such.

                        But maybe that is no longer problematic with SSDs being the norm…
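
                        To sketch the tradeoff (hypothetical code, not find’s implementation): one traversal yields one lstat() per file, and every additional test against that result is nearly free, whereas separate single-purpose tools would each re-walk and re-stat the tree.

                        import os
                        import time

                        def walk_matching(root, predicates):
                            for dirpath, dirnames, filenames in os.walk(root):
                                for name in filenames:
                                    path = os.path.join(dirpath, name)
                                    try:
                                        st = os.lstat(path)   # the expensive call find amortizes
                                    except OSError:
                                        continue
                                    if all(p(path, st) for p in predicates):
                                        yield path

                        week_ago = time.time() - 7 * 86400
                        for path in walk_matching(".", [
                                lambda p, st: st.st_size > 1_000_000,   # roughly -size
                                lambda p, st: st.st_mtime > week_ago,   # roughly -mtime
                        ]):
                            print(path)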

                        1. 5

                          I’m sure there are pragmatic reasons why find and many other utilities break from “the Unix philosophy”, but I personally don’t think the Unix philosophy is that great. The real fundamental principle is making composable tools, and Unix doesn’t have a monopoly on that.

                          1. 4

                            The way I think of find is as an expression language with a really bad syntax rather than a command line tool with flags.

                            Most flags aren’t really flags, but tokens in a recursive language. There ARE some flags, but they are easy to confuse with the expression language, and GNU find even gives you weird warnings if you mix them:

                            $ find -type f -maxdepth 2 |head
                            find: warning: you have specified the -maxdepth option after a non-option argument -type, but options are not positional (-maxdepth affects tests specified before it as well as those specified after it).  Please specify options before other arguments.
                            

                            -maxdepth is a flag, but -type f is really a clause in the expression language.


                            Some other comments on find:

                            https://lobste.rs/s/jfarwh/find_is_beautiful_tool#c_rkmlpz

                            FWIW, since find is really an expression language, I’m looking to fold it into the Unix shell, and the first step (in the faithful Oil way) is to clone it:

                            https://github.com/oilshell/oil/issues/85

                            If anyone’s interested in working on this, let me know :) Find is very much like a tiny shell, because:

                            • it has its own regex syntax
                            • it has its own -printf syntax
                            • it starts processes with -exec
                            • it has recursive expressions with ( ) ! -a -o, as mentioned (a toy model of these is sketched below).
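
                            To make the “expression language” framing concrete, here’s a toy model (invented combinator names, nothing from find’s source):

                            import os
                            import stat

                            # Tests take (path, stat_result); -a, -o and ! become
                            # combinators over tests.
                            def type_f(p, st):                    # like -type f
                                return stat.S_ISREG(st.st_mode)

                            def name_ends(suffix):                # like -name '*suffix'
                                return lambda p, st: p.endswith(suffix)

                            def _and(a, b): return lambda p, st: a(p, st) and b(p, st)   # -a
                            def _or(a, b):  return lambda p, st: a(p, st) or b(p, st)    # -o
                            def _not(a):    return lambda p, st: not a(p, st)            # !

                            # find . -type f -a \( -name '*.c' -o -name '*.h' \)
                            expr = _and(type_f, _or(name_ends(".c"), name_ends(".h")))

                            for dirpath, dirs, files in os.walk("."):
                                for f in files:
                                    path = os.path.join(dirpath, f)
                                    if expr(path, os.lstat(path)):
                                        print(path)

                            The real thing parses that grammar straight out of argv, which is why the flags and the expression are so easy to confuse.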
                          2. 2

                            Agreed, find is kind of awful to use.

                            1. 2

                              Oops, this reply was meant for your comment. I replied one level down instead.

                              1. 1

                                No worries. It still makes sense in that context as well.