1. 31
    1. 7

      On the confusion of what a scripting language is, there is a ton of overlap and it’s not always clear which to choose.

      A lot of my own projects started out as bash/zsh because it seemed small and the project fit a shell script well. But as they grow and become more critical, I regret not having things like tests and more robust error handling. Several times I tried starting such projects in Python, but converted to Bash when I found I wasn’t making as much progress as I wanted.

      I recall that Git also started out as a collection of Bash & Perl scripts, which is why it took so long to port to Windows.

      1. 2

        If that bit about Git is true, it isn’t in its own Git history. The first commit is a few C files.

        1. 9

          I never questioned it because my terminal would report the current process as bash or Perl most of the time during a checkout.

          Edit: I found it. Linus describes commit as being a few calls to Linux utilities, and push as being just rsync.

      2. 1

        There are a couple of different testing tools for shell, FYI.

        1. 2

          I have written my own simple test runner (example). The actual tests go into a test file (such as this). It works very simply: if a test function exits with a non-zero status, that counts as a test failure; otherwise it is a success. It has served me pretty well so far.
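          A minimal sketch of a runner in that style might look like the following (the function and test names are made up for illustration, not taken from the real suite):

          ```shell
          #!/bin/sh
          # Minimal sketch of a test runner in the style described:
          # each test is a function; a non-zero exit status means failure.

          pass=0
          fail=0

          run_test() {
            if "$1"
            then
              pass=$((pass + 1)); echo "PASS: $1"
            else
              fail=$((fail + 1)); echo "FAIL: $1"
            fi
          }

          # Example tests (placeholders).
          test_true() { true; }
          test_echo() { [ "$(echo hello)" = hello ]; }

          run_test test_true
          run_test test_echo
          echo "$pass passed, $fail failed"
          ```

          Because failure is just a non-zero exit status, any plain shell command can serve as an assertion.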

          Also, since I don’t target Bash alone but sh instead, so that the scripts can run on a wider range of shells, I run the tests with bash, ksh, and zsh on Debian and Mac, as well as with dash, posh, and yash on Debian (example), to weed out any Bashisms.
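          A hedged sketch of that multi-shell approach (the shell list mirrors the one above; the throwaway suite file and its path are stand-ins for a real test file):

          ```shell
          #!/bin/sh
          # Run one test file under every available shell to catch Bashisms.
          # /tmp/suite.sh is a stand-in for a real suite.
          printf '%s\n' '[ "$(printf %s hello)" = hello ]' > /tmp/suite.sh

          failures=0
          for sh_bin in sh bash ksh zsh dash posh yash
          do
            # Skip shells that are not installed on this machine.
            command -v "$sh_bin" >/dev/null 2>&1 || continue
            if "$sh_bin" /tmp/suite.sh
            then
              echo "ok: $sh_bin"
            else
              echo "FAIL: $sh_bin"
              failures=$((failures + 1))
            fi
          done
          echo "$failures shell(s) failed"
          ```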

          1. 2

            Nice work. It’s good to see someone else considering shell portability too.

            While I generally target sh, I’ve added a configure argument to set the target shell, which gets used for invocations throughout the Makefile. A little wrapper script then loops through, tests in a series of shells (currently only dash, bash and mksh though), runs the test suite, and also compares the sha1 sums of all built files. The last part is mostly because the project is a shell library/toolset and part of that is a tool to “build” more distributable scripts from a source project, which it uses to build itself.
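            One way to sketch that checksum comparison (the “build” step here is a deterministic stand-in, and on macOS you would substitute shasum for sha1sum):

            ```shell
            #!/bin/sh
            # Build the same artifact under each shell and compare sha1 sums
            # to confirm the output is shell-independent.
            build() {  # $1 = shell, $2 = output file
              "$1" -c 'printf "%s\n" "line one" "line two"' > "$2"
            }

            ref_sum=""
            mismatch=0
            for sh_bin in sh dash bash
            do
              command -v "$sh_bin" >/dev/null 2>&1 || continue
              out="/tmp/build.$sh_bin"
              build "$sh_bin" "$out"
              sum=$(sha1sum "$out" | cut -d' ' -f1)
              if [ -z "$ref_sum" ]
              then
                ref_sum=$sum
              elif [ "$sum" != "$ref_sum" ]
              then
                echo "MISMATCH under $sh_bin"
                mismatch=1
              fi
            done
            echo "mismatch=$mismatch"
            ```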

        2. 1

          I’m aware of bats, and by relation, the Test Anything Protocol. I think I’ll be digging into both of those a little more in the new year.

          1. 2

            There’s also https://github.com/kward/shunit2/, which I find quite helpful.

    2. 3

      Thanks for posting. This articulates something my own mind has been circling since reading those same threads (though I had connected fewer of the dots, so this was illuminating).

      I’d be curious to hear from anyone who concurs with the central premises of this post but has replaced their use of a more traditional shell (e.g. ksh, bash, dash, zsh, or their predecessors) with something like fish, xonsh, oil, etc.

      If you are not particularly concerned with needing the shell as a tool readily at your disposal when you connect to unknown and arbitrary remote systems, but only as a tool to converse with your primary workstation (as a “power user” first and a sysadmin only by its practical usefulness), how does the ground shift around what is valuable to invest your time and knowledge into?

      1. 4

        I’d be curious to hear from anyone who concurs with the central premises of this post but has replaced their use of a more traditional shell (e.g. ksh, bash, dash, zsh, or their predecessors) with something like fish . . . how does the ground shift around what is valuable to invest your time and knowledge into?

        👋 I’ve been using Fish for about 8 years now, I guess. I’ve always had an intuitive understanding of “the shell” that’s aligned with the description in this post. That is: a terse way to orchestrate interactions with the OS — typically, one interaction at a time. But I can’t say that I make a deliberate effort to learn anything about it, because I’m almost always task-driven.

        My usage is usually iterative: run this test. Okay, now run it with the following env vars set, to change its behavior. Now again, capturing profiles, running pprof, and rg’ing for total CPU time used. Now again, but add a memory profile. Now again, but output all of the relevant information in a single line with printf. Now again, but vary this parameter over these options. Now again, but vary this other parameter over these other options. Now again, repeating everything 3 times, tabulate the output with column -t, and sort on this column. Oops, tee to a file, so I can explore the data without re-running the tests.
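        The final shape of such an iterated one-liner might look roughly like this, written in POSIX-ish syntax rather than Fish for illustration (the parameter names and numbers are invented placeholders for the real benchmark):

        ```shell
        #!/bin/sh
        # Vary a parameter, repeat each run 3 times, tabulate the output,
        # sort by the result column, and tee to a file for later exploration.
        for n in 1 2 4 8
        do
          for i in 1 2 3
          do
            printf '%s %s %s\n' "param=$n" "run=$i" "$((n * 100))"
          done
        done | column -t | sort -n -k3 | tee results.txt
        ```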

        Each of these steps is hitting up-arrow and editing the prompt. Fish is a blessing because it makes this so nice: the actual editing is pleasant, and the smart history means even if I don’t save this stuff in a file, I can easily recall it and run it again, even months later, with no effort.

        I don’t know if this actually answers your question… maybe it does?

      2. 2

        During my last year as an undergrad, 2004-2005, I used a Perl-based shell. It was very much like a REPL: both a REPL for Perl and a REPL for Linux. I loved it. I don’t recall why I stopped using it, though the reason is probably as simple as “I lost my primary workstation to a burglar who took almost everything of value”. I was also starting to migrate away from Perl around then. While it lasted, though, it was great, because my deep knowledge of Perl translated directly to shell use.

        What I’d really love is scsh with interactive-friendly features.

        1. 4

          What I’d really love is scsh with interactive-friendly features.

          Hi, I’m the author of Rash. It’s a shell embedded in Racket. It’s designed for interactive use and scripting. It’s super extensible and flexible. And I need to rewrite all of my poor documentation for it.

          Currently the line editor I’m using for interactive Rash sessions leaves a lot to be desired, but eventually I plan to write a better line editor (basically a little emacs) that should allow nice programmable completion and such.

          Also, salient to the OP, job control is not yet supported, though that has more to do with setting the controlling process group of the host terminal. You can still run pipelines in the background and wait for them, you just can’t move them between the foreground and background and have the terminal handle it nicely.

          Replying more to the parent post: for the few scripts that I really need to run on a system that I haven’t installed Rash on, I write very short scripts in Bash. But realistically I just treat Rash as one of the things I need to get installed to use my normal computing environment with extra scripts. And a lot of scripts are intended to run in a specific context anyway – on some particular machine set up for a given purpose where I have already ensured things are installed correctly for that purpose.

          Writing scripts in Rash instead of Bash is nice because my scripts can grow up without a rewrite, and because as soon as I hit any part of the program that can benefit from a “real” programming language I can effortlessly escape to Racket. Using Rash instead of plain Racket (you could substitute, say, Python if you want) is nice because I can directly type or copy/paste the commands I would use interactively, with the pipeline syntax and everything.

          In practice, Rash scripts end up being a majority of normal Racket code with a few lines of shell code – most scripts ultimately revolve around a few lines doing the core thing you were doing manually that you want to automate, with the bulk of the script around it being the processing of command line arguments, munging file names, control flow, error handling… lots of things that Bash and friends do poorly.

          1. 1

            Thanks for making this, and for pointing it out here!

        2. 3

          I came to open source and the scripting world first through Perl, and that journey taught me about – and more importantly, to think “in” – data structures such as arrays and hashes, and combinations of those. For that I’ll be ever grateful, plus the community was absolutely wonderful (I attended The Perl Conference back in the day, and was a member of London Perl Mongers). Now that I’m discovering more about the shell and its related environment, such as Awk and Sed, I’m looking at Perl again through different eyes (as in some ways it’s an amalgam of those and other tools).

      3. 1

        Thanks, yes, this has been brewing in my head for a while, and I finally found the opportunity to write it down. I would also be curious to hear from folks about what you say above, for sure. Always learning. Cheers.