1. 34

  2. 19

    This article seems to confuse limitations of the shell with limitations of the operating system. stdout/stderr are operating-system facilities that the shell merely surfaces. The reason output from all child processes ends up combined is that the operating system implements file-descriptor inheritance. If a shell launches a program and that in turn launches another program which generates output, the shell isn’t involved at all.
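
    (A quick way to see this, assuming pwsh is on your PATH: the interactive shell launches only the outer process below, yet the inner one’s output still reaches your terminal, because stdout is inherited through the OS rather than routed by the shell.)

    # The interactive shell starts only the outer pwsh; the inner pwsh inherits
    # the same stdout handle, so its output lands on the terminal anyway.
    pwsh -Command "pwsh -Command 'Write-Output hello-from-the-grandchild'"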

    It’s true that this can be reimagined into a more semantically rich environment, but doing so requires not only a new shell but new programs and a new communication mechanism. This is the approach Powershell took. There’s nothing wrong with the approach, but rewriting all of the useful tools that a shell wants to interact with is a large undertaking.

    So (shameless plug for where I work) if you want to have a semantically rich command line environment for manipulating your cloud computing deployment, there’s always Azure Powershell.

    1. 4

      I kept thinking “PowerShell has this” quite a lot while reading this. The object-oriented nature of it makes it simple to select the data you want. The way the terminal displays objects can also be modified, so an object can be much richer than what it initially displays. That makes it possible to actually work with the objects without having the cmdlets barf thousands of lines onto your terminal when you only need to see a few of the objects’ properties.
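
      Just as a sketch of the kind of pipeline I mean, you keep the properties you care about instead of parsing text:

      # Objects flow through the pipeline; Select-Object trims the output down
      # to the few properties you actually want to see.
      Get-Process |
          Sort-Object CPU -Descending |
          Select-Object -First 5 Name, Id, CPU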

      It’s still lacking a few of the features the author talks about, namely the automatic redirection and the mouse interaction (though you could pipe the command through Out-GridView with -PassThru to the next cmdlet), but in my opinion it’s a very good step in the right direction.

      The only thing I despise doing in PowerShell is writing complex modules as it’s way too easy to shoot yourself in the foot. Since all values are in a global namespace they get changed around a lot and you never really know what is going to happen when you import something. There’s also a very irritating limitation on importing classes in the older versions of the shell, but that might have been fixed in a later version (but the problem then becomes targeting the version of the shell and so on…).

      It’s a gigantic leap in the right direction, but it’s still lacking quite a few features before it becomes as comfortable to work in as the author imagines a shell could be.

      1. 1

        The main idea is not to have a text dump but instead objects on the screen that you can interact with. I don’t think PowerShell has it. I didn’t see it anywhere in the documentation.

        quite a few features before it becomes as comfortable to work in as the author imagines a shell could be.

        I know there is a lot of work, but what do you mean here?

        1. 3

          The main idea is not to have a text dump but instead objects on the screen that you can interact with. I don’t think PowerShell has it. I didn’t see it anywhere in the documentation.

          You are right that there is no built-in feature like that in PowerShell, but you can insert Out-GridView in your pipeline to get something similar, so for example Get-ChildItem | Out-GridView -PassThru | Remove-Item. That will pop up the result of Get-ChildItem and let you select the items you want to delete. This isn’t the same as what the author wants, and I really want what the author is imagining, but it does give you a somewhat similar workflow.
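
          Roughly like this, if you want to try it (the -WhatIf on Remove-Item makes it a dry run):

          # -PassThru sends only the rows you select in the grid window on to
          # Remove-Item; -WhatIf shows what would be deleted without doing it.
          Get-ChildItem | Out-GridView -PassThru | Remove-Item -WhatIf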

          I know there is a lot of work, but what do you mean here?

          Well, lets see then.

          • PowerShell already has the ability to understand the input and output from commands. (Errors are added to the $Error variable so you can inspect them later if you are interested; Out-Host and Write-Warning allow you to write informational text to the terminal that is ignored when processing the pipeline. There is a small example after this list.)
          • The formatting of objects output to the screen can be modified with a ps1xml file. That gives you some control over how much text is going to be barfed out in your terminal, but not perfect control. An automatic application of less when the shell sees that there will be a lot of output would be nice.
          • Automatic redirection isn’t a thing. You could use a job to do it, but that is the same as what you’d have to do in bash to keep your terminal from locking up while the command is still being processed.
          • The semantic understanding is still lacking. While you could use Out-GridView as I suggested earlier, you still have to manually copy-paste things around instead of just clicking on things to make things happen. Imagine how awesome it would be if all object methods automatically became a dropdown list, or if you could use the mouse or keyboard to select a previous object. All of that kind of understanding is lacking from the shell.
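
          A small example of that first point about separate streams (just the standard $Error and Write-Warning behaviour, nothing exotic):

          # The failure is suppressed on screen but still recorded in $Error,
          # and the warning goes to its own stream instead of the pipeline.
          Get-ChildItem C:\does-not-exist -ErrorAction SilentlyContinue
          $Error[0].Exception.Message
          Write-Warning "informational only - not part of the pipeline output"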

          So yeah… There’s still a lot missing before it is as good as the author is imagining, but in my opinion it’s a giant leap from bash.

          1. 1

            I see. Thanks!

          2. 3

            Have you used Lisp Machines, Userland, or Mathematica? They have a lot of what you would want, like command lines (and documents made of command lines) with rich objects embedded. (Some environments that get close, but not quite would include Domain/OS pads, MPW, Emacs…)

            Honestly, I think a lot of this could be done pretty easily with web technology…

        2. 1

          I haven’t tried Azure (or Azure Powershell) - does it have the user interface features of Windows Powershell ISE? I think the power of the semantic-objects approach is to build additional UI and interaction on top of the object model. Without that interactivity, there’s not much difference from shell tools that just spit out easy-to-parse text streams.

          1. 1

            Azure Powershell is a collection of cmdlets to manage Azure that expose Azure concepts as objects. Those cmdlets run in Powershell. You can run Powershell as part of ISE (or anything else if you want); the two are fully complementary.

            1. 1

              Ah - I assumed this was like Google’s Cloud Shell that gives each account in the cloud service a semi-ephemeral shell in a browser window.

              1. 1

                I think what you are describing would be Azure Cloud Shell (imaginative name…). Azure PowerShell is simply a module you can install on your computer via the built-in module manager in PowerShell. Simply type Install-Module -Name Az and it’s installed. Might need admin permissions unless you install it to a user context.
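
                For example, -Scope CurrentUser skips the admin requirement:

                # Installs the Az module for the current user only; no admin rights needed.
                Install-Module -Name Az -Scope CurrentUser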

                1. 1

                  Azure Cloud Shell is the hosted semi-ephemeral shell in a browser window. Files you write in your home directory in it are persisted, but processes you start close when you close the terminal, AFAIK.

                  If you want to run PowerShell on not-Windows, there is PowerShell Core.

            2. 1

              If a shell launches a program and that in turn launches another program which generates output, the shell isn’t involved at all.

              Correct.

              I had another use case in mind: prog1 & and then prog2 &. Result: a dump on your screen combining the output of both, while the shell could give these two programs different file descriptors and handle the output from there. Non-trivial, but it sounds feasible.

              new programs and a new communication mechanism.

              Yes. Meanwhile I plan:

              • Wrap the most commonly used commands. The wrappers would provide semantic information.
              • Make this wrapping easy so that others could solve their own “problems”.
              • Maybe make some alternatives to existing utilities.
              1. 1

                I had another use case in mind: prog1 & and then prog2 &. Result: a dump on your screen combining the output of both, while the shell could give these two programs different file descriptors and handle the output from there. Non-trivial, but it sounds feasible.

                Fair enough. In my spare time I’m writing my own shell, and I had the same thought about the trailing “&” operator not being ideal for today’s use. Typically when invoking a command in the background I don’t want to see its output. So I made a “&!” operator which runs it in the background and captures the output back to in-memory buffers owned by the shell, where stdout and stderr have different buffers. This means it’s possible later on to go and inspect the contents of a buffer if something went awry, but it won’t randomly preempt my next action.
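
                (For comparison, PowerShell’s background jobs buffer output per job in roughly the same way, though that’s a different mechanism from the “&!” operator I described. A sketch:)

                # Each job keeps its own output buffer instead of interleaving on the
                # terminal; errors go to the job's separate error stream as well.
                $j1 = Start-Job { 1..3 | ForEach-Object { "prog1: $_"; Start-Sleep 1 } }
                $j2 = Start-Job { 1..3 | ForEach-Object { "prog2: $_"; Start-Sleep 1 } }
                Receive-Job $j1 -Wait    # prog1's output, by itself
                Receive-Job $j2 -Wait    # prog2's output, by itself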

                1. 1

                  I’ll take a look at your shell.

                  I was thinking about a small text box for commands where one can see the last few lines (updated as output is produced) and the box can be focused and expanded.

            3. 13

              Keep text editing pure! Any semantic understanding by the text editor is undesirable, other programs should handle that. You don’t want to complicate the text editor.

              ^^ Ever heard of Acme/Sam?

              Why use a car when a bicycle is so much simpler?

              I think this point is quite relevant. I much prefer bikes over cars, because most of the time I don’t have to go 50km+, and just hopping on a bike is easier to “set up” (just sit on the bike), gives me more freedom (I can get off and push it through a pedestrian zone), and doesn’t require me to spend several hundreds of $$$ to be allowed to drive it – let alone own/maintain it. I don’t want software to be like cars, at least when it doesn’t have to be.

              The shell is broken, and saying that is an understatement. But at the same time, it exists in a real context (that being all the tools and programs that happen to run on *nix systems), within which it’s not really feasible to just fix the shell. All attempts to compromise have failed.

              1. 7

                I think this sort of discussion reveals that minimalism, no matter what kind, can’t just be an end in itself; it has to be purposeful to be useful. And not just purposeful, but purposeful when contextualized; the justifications for computing minimalism in the 70’s aren’t usually so purposeful several decades later when the conditions that spawned those decisions no longer exist. Although that isn’t to say that the “dumb” shell doesn’t make total sense; it’s simply within the framework of specific (and sometimes outdated) requirements and decisions that it does. To quote Donald A. Norman’s foreword in The UNIX Hater’s Handbook:

                Unix was designed for the computing environment of then, not the machines of today. Unix survives only because everyone else has done so badly. There were many valuable things to be learned from Unix: how come nobody learned them and then did better?

                When it comes to the minimalism provided by bikes, I can’t help but also think that in our world today we could really use the “less” of bikes and benefit more from that. But at the same time I look at the urban design of cities along the sunbelt and think dreadful thoughts of cyclist lives lost trying to navigate sprawling, hostile suburbs. Some things may not be considered minimalist because there are no real problems in choosing to do those things, since minimalism is universally, I believe, something that is difficult. But of course we shouldn’t give up pursuit of things for how difficult they may be.

                1. 2

                  Need to look into acme/sam. Thanks! This might take a while.

                  Yes, everything should change. Meanwhile I was thinking about interoperability with the existing mess (with degraded functionality), wrappers, and gradual creation of alternate utilities.

                  All attempts to compromise have failed.

                  I don’t understand here. Can you please elaborate?

                  1. 2

                    Need to look into acme/sam. Thanks! This might take a while.

                    It’s not that complicated. The concept revolves around just what you mentioned, being an “integrat-ing” (as opposed to an “integrat-ed”) editor. Russ Cox has a video here, and it’s also mentioned in this introduction to Plan 9.

                    I don’t understand here. Can you please elaborate?

                    A claim I have made before is that it’s just not feasible to really overcome the limitations of contemporary operating systems (i.e. Unix and Unix-likes) and their paradigms from within the system itself. If you try to “make unix nicer to use”, you’ll eventually either lose what you wanted to keep, become a fantasy (e.g. Plan 9), or your system will degrade to a point where it’s basically just as bad, just as much of a problem. Or so it seems.

                  2. 1

                    Took a look at Acme. Very interesting approach.

                    What I had in mind for “integrating” in Next Generation Shell was something resembling plumbing rules. I planned to parse the output of running commands to get “real” objects on the screen.

                    I guess I need some time to digest this and see which other ideas I can steal^W be inspired by.

                  3. 6

                    One major argument I’ve heard is that in many environments you can’t bring your own shell. I don’t think this is the best argument, but it is also the argument I’ve heard for being proficient in vi and awk since they are tools you can guarantee exist on most *nix systems.

                    1. 0

                      Coming from a startup background, I’m solving the problem for this environment first. No SunOS, and the shell runs locally most of the time, or at least on servers that you, as an Ops person (the main target audience), control.

                      1. 1

                        I think that’s reasonable. Mind you, anyone who has given me that argument I’ve generally told to just scp a static binary if it’s that urgent.

                        One thing I would like is some GIFs of what you have in mind in action.

                    2. 5

                      Hey, I like your ideas about innovating in the shell/terminal (which I’ve ranted about a bit myself: https://matklad.github.io/2019/11/16/a-better-shell.html)!

                      1. 3

                        OK. That will take some time to digest. Quite a few ideas there.

                        1. 1

                          There is quite a big overlap in thinking with https://matklad.github.io/2019/11/16/a-better-shell.html#new-shell

                          • “A GUI application” - I’m planning on a Web UI and a text UI, which will both have very similar functionality (as opposed to a GUI application).
                          • “UI framework for text-based UIs” - still thinking about that
                          • “tiling frame management”. Not sure yet, but there must be some kind of management.
                          • “Some concept of process-let, which can occupy a frame” - thinking in the same direction
                          • “A prompt, which is always available, and smartly (without blocking, splitting screen if necessary) spawns new processlets.” – yes!!
                          • “An API to let processlets interact with text UI.” - yes
                      2. 4

                        I like the UI although I despise that the demo runs in a browser.

                        A fundamental problem: Do you expect a command like “vim” or “tmux” to work as expected? If not, the lack of backwards compatibility makes the shell undesirable. If yes, you inherit all the baggage of terminals.

                        1. 1

                          I was planning on distinguishing programs that need a terminal from the ones that just dump their output. For compatibility, NGS will need to deal with this gracefully and make the programs that need a terminal work.

                        2. 2

                          How does it deal with stdout and stderr mixed up then? For example find /etc -name "net*" gives me some results on stdout with a few “Permission denied” (as non-root) on stderr in between.

                          1. 1

                            I see I was quoted. I still believe what I wrote there. The shell is and should be a simple program, used only to launch other programs and for the simplest of scripts. Perfect example, here is a post from today:

                            https://unix.stackexchange.com/questions/560697/how-to-reverse-shell-arguments

                            Look at some of the (well upvoted) answers:

                            flag=''; for a in "$@"; do set -- "$a" ${flag-"$@"}; unset flag; done
                            

                            and:

                            eval "set -- $(awk 'BEGIN{for (i = ARGV[1]; i; i--) printf " \"${"i"}\""}' "$#")"
                            

                            Looks complicated? It is. And it should be. This stuff should not be easy in shell, because if it were, then people would want to do it. For years I tried to do as much in shell as was possible, twisting my scripts into knots in order to maintain this mythical thing called “POSIX compliance”. A much better thing is to do it using a programming language. Here is the same thing in PHP:

                            $a2 = array_reverse($a1);
                            

                            See the difference? And you don’t have to worry about being portable with 5 different shell implementations, you just do whatever is specified by the (insert programming language). Shell users, beware Zawinski’s law:

                            Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.