1. 34
    1. 8

      I like this post, and (as usual) the newb-friendly yet still technically useful ethos of Julia’s posts…

      I didn’t respond elsewhere because I’ve been using command lines too long, but one thing that occurred to me just now is that virtual machines and/or Incus (formerly LXD) style system containers are a great way to get comfortable with command line stuff. Just fart around, break stuff, su to root, nothing matters!

      I also think everyone should install sl and gti and read a few xkcd jokes about remembering the args for tar and realize that the command line is weird and funny as well as being powerful, and IMO anyone who claims to be 100% comfortable using it is just asking for their comeuppance. (ask me about writing passphrases to a file with echo instead of echo -n sometime).
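
      (For the curious, the echo pitfall, assuming bash: plain echo appends a newline, so the file ends up one byte longer than the passphrase, and whatever reads it back may happily include that newline.)

      ```shell
      # bash's echo adds a trailing newline; echo -n does not.
      echo "hunter2" > with_newline.txt
      echo -n "hunter2" > without_newline.txt

      wc -c < with_newline.txt      # 8 bytes: "hunter2" plus "\n"
      wc -c < without_newline.txt   # 7 bytes: just "hunter2"

      # printf '%s' is the portable way to get the no-newline behavior:
      printf '%s' "hunter2" > portable.txt
      ```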

    2. 5

      Desperation: the UNIX command line, with its escape codes, ioctls, signals, and kernel brokering between the pty pair, is an abysmal foundation for building user-facing tools. There are no real alternatives today though.

      1. 2

        There are no real alternatives today though.

        No GUIs?

        1. 4

          I think many companies would like their IDE to be the center of the dev universe, so all tools plug into it … but I’m glad that hasn’t happened!

          Instead the IDEs all embed a shell, so you can use the tools elsewhere, without an IDE, including in your CI, etc.

          1. 3

            The Unix shell isn’t the only way to do composition. When all you have is a hammer, everything looks like a nail. We had the seeds of reusable GUI components back in the 90s.

            1. 2

              Reusable GUI components aren’t quite the same as composable GUIs. What would a composable GUI even be?

              Without having thought about it very hard, CLI actions can be arranged in either time or space, but GUI actions can only be arranged in time, so it’s not really possible to prepare a sequence of them. You can make GUI applications really good at sending their output to each other, as mobile apps are, but it’s not obvious to me how you could take that much further.

            2. 1

              I’m not claiming it is!

              And I would like GUIs to be involved, we have a #shell-gui channel for that - https://www.oilshell.org/blog/2023/06/release-0.16.0.html#headless-shell-screenshots

              Shell is definitely old and “not enough”; there are about a billion things that could be improved about it.

              When I say “shell”, what I really mean is that your data is long-lived and you have different tools that operate on it. Not that you buy into an entire app ecosystem, e.g. living in the Microsoft Word universe or the Adobe Photoshop universe, where the data is mostly proprietary.

        2. 3

          Sadly, upon closer examination, it turns out that GUI programming does not exist:

          https://blog.royalsloth.eu/posts/sad-state-of-cross-platform-gui-frameworks/

          Terminal, with all its drawbacks, is at least a stable API with multiple vendors!

          More seriously though, GUI is great! For dev tools, though, you also need out-of-process composability (so that you can write small tools and combine them) and a text-heavy interface (because fuzzy text search + autocompletion is the most high-bandwidth way to copy data between the brain and the CPU). We don’t have this specific program/set of APIs yet, but it could exist.

          The problem of GUI not existing is real though! If I want to write a CLI app, I can, in any language, just add an ANSI escapes library, write a short program, and it’ll work. Similarly, it’s almost as trivial to write a program which listens on a socket and responds with HTML. You can’t do the same with GUI: on my Nix machine, it’s a pain even to run anything GUI-related, because it wants to link against libX11, libGL, or Wayland.

          1. 1

            For anyone who has read this linked article in the past, the information re: GTK is completely out of date. Glade is no longer recommended and does not support the newest era of widgets. Vala has been mostly ditched in favor of Rust. If you’re writing a cross-platform application today it is quite a good choice IMO – assuming you agree with or can tolerate their UI prescriptions.

          2. 1

            Yeah definitely, IPC is a lot more composable and polyglot than shared libraries

            I think there is a pretty simple way to get there – a shell GUI + light protocols for CLIs to print HTML (simple tables go a long way, simple images, proportional width fonts!)

            (The renderer would maybe be something like LibWeb from Ladybird, so as to avoid the Chrome syndrome)

            A bunch of terminals support sixel, but it’s perhaps not ambitious enough - https://en.wikipedia.org/wiki/Sixel

            A bunch of interesting projects here, some defunct - https://github.com/oilshell/oil/wiki/Interactive-Shell

            e.g. https://github.com/unconed/TermKit
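
            As a toy sketch of that “CLIs print HTML” idea (the function name and the TSV input convention here are made up, not an existing protocol; assumes bash):

            ```shell
            # Toy sketch: turn tab-separated stdin into an HTML table --
            # the kind of lightweight structured output a shell GUI could render.
            tsv_to_html() {
              echo "<table>"
              while IFS=$'\t' read -r -a cells; do
                printf "<tr>"
                for cell in "${cells[@]}"; do
                  printf "<td>%s</td>" "$cell"
                done
                printf "</tr>\n"
              done
              echo "</table>"
            }

            # Any CLI could pipe tabular output through it:
            printf 'name\tsize\nfoo.txt\t120\n' | tsv_to_html
            ```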

      2. 1

        cmd.exe and PowerShell are both non-Unix and pretty popular.

        But it seems like Microsoft has mostly pushed WSL in recent years, which is Unix. (It’s Ubuntu on top of their Linux kernel emulation, as far as I understand.)

        I think you’re lamenting that narrow waists can inhibit innovation - https://www.oilshell.org/blog/2022/03/backlog-arch.html#characteristics-of-narrow-waists

        Even if you have billions of dollars like Microsoft, and are invested in your own ecosystem, you can’t escape that inertia. That is, every time somebody writes a CLI in Rust or Go, it makes the shell more valuable

        To be a broken record, Oils headless mode is a good way of de-emphasizing the terminal, and it has a good shot because it respects the inertia of the narrow waist (you’re not going to rewrite all your Rust/Go/C tools) - https://www.oilshell.org/blog/2023/06/release-0.16.0.html#headless-shell-screenshots

        But we don’t have people to work on it right now - I’m focused on YSH (which is also saner about string escaping)

        1. 2

          PowerShell has some nice innovations but despite sitting in front of a Windows box every day I never quite got used to using it. (Most of my real work is via ssh to a Linux box.) I use WSL all the time.

          1. 1

            Same here, and PS is still slow when it comes to parsing large text files. I used to run Perl under Cygwin when I needed to parse files for work, and now it’s even easier with WSL.

            My coworker does a lot of wizardry with PS though, like manipulating registry entries etc.

        2. 1

          But it seems like Microsoft has mostly pushed WSL in recent years, which is Unix. (It’s Ubuntu on top of their Linux kernel emulation, as far as I understand.)

          It can be any distro. WSL2 uses the same infrastructure as Linux Containers on Windows (LCOW): it’s a Microsoft-provided kernel and a third-party userland. You can install distros from the Microsoft Store.

      3. 1

        I don’t really know if the output-level escape code stuff is a big deal (I mean it’s abstracted away right?), though I think the “everything is lines of text” stuff feels way more isolating to people coming from structured programming. Stuff like NuShell or PowerShell kind of get at that.

        It’s hard to overstate how much command line shells benefit from letting people quickly type in what they want once they start getting good at the tool. And so far I feel like pipes have proven themselves to be … quite effective at moving around data (it’s really a surprise that every language doesn’t have them at this point). But I bet you could make a really interesting and powerful terminal emulator that just had way more metadata about the programs it’s running. Stop making me think about escape sequences!
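
        For the record, the kind of pipeline being praised, where each stage is a separate small tool:

        ```shell
        # Most common login shells on this machine: four small tools, three pipes.
        cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn
        ```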

        1. 1

          Yeah, the escape sequences are not that problematic; they are “just markup”, albeit a bad one. The problem is, escape sequences are only part of the interface here. For example, to decide whether to use ANSI escapes, programs call the isatty function, which is a syscall! So you can’t make a terminal without kernel involvement. And then there’s SIGWINCH!
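
          The shell-level version of that check, for reference: [ -t 1 ] is the shell’s isatty(), and color-aware scripts branch on it (assumes bash for the $'...' escapes):

          ```shell
          # [ -t 1 ] asks the kernel whether fd 1 (stdout) is a terminal.
          if [ -t 1 ]; then
            red=$'\033[31m'; reset=$'\033[0m'   # stdout is a tty: use ANSI color
          else
            red=''; reset=''                    # stdout is a pipe/file: plain text
          fi
          printf '%serror:%s something went wrong\n' "$red" "$reset"
          ```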

          A separate major problem is that the shell and subordinate processes all use the same file descriptor, which creates shared cross-process mutable state and prevents output virtualization.

          And so far I feel like pipes have proven themselves to be … quite effective at moving around data

          Pipes are OK, but I am pretty sure we can do better. For example, I want to see incremental results as I type the pipe in. When I ctrl+F in my editor, I don’t enter a search string and then hit enter; I get results immediately after I type the first search character. I want the same experience with pipes. (More generally, I think ideally “editor” and “shell” are just the same interface.)

          1. 1

            I actually was working on an inspectable pipeline tool in the past (main idea would be to place it at a certain pipe level and then be able to incrementally edit something like a sed/awk pipeline).

            I do think you can get a lot of the way there with existing tooling, but for me the big difficulty is the lack of a “window manager” with terminals. Too many times I’ve wanted to write a script that spawns N terminals that set things up and you don’t really have that option. But there are creative people in the world

            1. 1

              Yeah, lack of window management gets to the heart of it! There’s just one terminal fd shared by all the processes, so there’s just one “window” (I guess this stems from there being a single terminal originally?).

              I think people usually shell out to tmux to solve this in scripts, but then it is tmux-specific externally, and an even more hideous tower of emulation internally.

              And yeah, as I work on a distributed system, writing a script which spawns N processes such that each gets a separate output stream would be golden!
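
              The usual workaround sketch, for what it’s worth: give each process its own file instead of its own window, then watch the files with tail -f or a multiplexer (the worker commands here are stand-ins):

              ```shell
              # Spawn N background processes, each with its own output stream
              # (a file), since they can't each get their own terminal.
              mkdir -p logs
              for i in 1 2 3; do
                ( echo "worker $i starting"; echo "worker $i done" ) > "logs/worker-$i.log" 2>&1 &
              done
              wait   # block until all workers finish
              cat logs/worker-2.log
              ```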

    3. 4

      Being ~restricted to the CLI while working in Trustix and later Linux From Scratch really incentivized learning how to do things.

    4. 3

      For people who like O’Reilly books, I can vouch for the Linux Pocket Guide as being a decent reference on what-to-use, and Efficient Linux at the Command Line as being a good “how do I construct more complicated things on the fly” guide.

      Note that I did technical review on both of them for the author, Daniel Barrett. So you can also blame me in part.

    5. 2

      For me, it was the discovery that I don’t have to remember all this stuff – Shell is a programming language and has functions!

      So I start by just copying and pasting commands into a file – notably a list of N commands in a text file is a valid shell script

      Then I copy them into functions and make names for them. Now I don’t have to remember the syntax, or type it – it has a name

      And then gradually through testing/observation/reading, I learn what the commands actually do! It’s very fast to just poke at the text file, run it, and see what happens

      Often I am confused for months about something, and then magically by osmosis, running the shell commands in different contexts reveals how it works. I do little explicit learning these days, for most things

      The shell makes the process of learning very smooth and efficient, in my experience. (Shell itself famously has a bunch of pitfalls, but I found that understanding some kernel basics helped me differentiate between the shell and the CLI tools it invokes, which isn’t trivial.)
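
      The pattern sketched, with made-up task names: a file of named functions, plus "$@" at the bottom so any of them can be run by name:

      ```shell
      #!/usr/bin/env bash
      # tasks.sh: paste commands in, then promote them to named functions.

      build() {
        echo "compiling..."   # stand-in for the real commands you pasted
      }

      count-lines() {
        find . -name '*.sh' | xargs wc -l
      }

      "$@"   # dispatch: ./tasks.sh build, ./tasks.sh count-lines, etc.
      ```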


      e.g. here is a script I wrote trying to get an alternative allocator to work:

      https://github.com/oilshell/oil/blob/master/benchmarks/mimalloc.sh

      Other experiments with perf tools - https://github.com/oilshell/oil/blob/master/benchmarks/systemtap.sh

      https://github.com/oilshell/oil/blob/master/test/ltrace.sh

      Many of these scripts get abandoned, but that’s OK because they didn’t take very long to write. And now I have a record of what I did, if I ever have to go back to them

      But many of them become invaluable automation too – our whole CI is built on shell and has a pretty good interface now - http://travis-ci.oilshell.org/github-jobs/4682/

    6. 2

      I like the fish advocacy. Moving to fish several years ago for my interactive shell (tho the scripting ain’t half bad tbh) has made a big difference, as the autocomplete is just nicer, with command descriptions & what-have-you. With fish ‘catching on’, completions are now readily available. It’s not a fair comparison, but I had moved from unconfigured Bash (no features) to Oh My Zsh after a coworker recommended it, & while it had a rich feature set, it was slow to start up & do anything (due to the plugin system), whereas fish was nicer out of the box & very quick. To this day I have 1 plugin installed & a personalized prompt, but the rest is stock.

    7. 2

      The core issue that prevented me from getting comfortable with the command line was always lack of discoverability. Coming from Windows, I was used to being able to explore almost the entire system by just clicking around. The command line doesn’t give you that. Sure, you can easily find out how a given command works, but if you don’t know what’s there, if you don’t know the name of some command that you need, then the command line is just a useless black box for you to stare at.

      In the end, what helped me get -somewhat- comfortable with the command line was a strong dislike for anything Windows post-98SE, getting RSI from using a low-quality non-ergonomic mouse, and 24/7 access to web search by finally getting a somewhat decent internet connection back in the day.

    8. 1

      a fancy file manager

      A few people mentioned fancy terminal file managers like ranger or nnn, which I hadn’t heard of.

      Me neither. But I will put in a word for Midnight Commander (mc). As I have seen with people beginning with GNU/Linux and the terminal, mc helped them a lot. There is a psychological barrier, a fear, and this blue-and-white user interface helps them cope with it. They learn the rest on the fly.

      Newbies often get lost even in the directory structure… and tools like mc help. (I had the same experience with nc, Norton Commander, many years ago.)

      Another important thing is to write notes and build your own knowledge base. Bookmarked how-tos, articles and discussions are nice, but not as good as your own notes written in your own words, with examples crafted right for you. This does not depend on your experience level - such notes are useful at any point in your career. And apply version control (Mercurial, Git etc.) to such a knowledge base.

      Part of that knowledge base will be executable scripts or small programs (a few source files + a Makefile) that you can link from ~/bin or add to $PATH.

    9. 1

      I’m old enough that I didn’t have much of a choice- Commodore 64 was my first computer. =)

      1. 2

        (no need to reply if you don’t remember, or if you’ve been using the command line comfortably for 15 years — this question isn’t for you :) )

        (≖_≖ )

        1. 3

          Ha, fair.

          In that case I can say a few things.

          1. Most shells (that includes zsh, fish, etc.) aren’t enough of an improvement over Bash to merit a switch, since you’ll still have to learn Bash anyway.

          2. https://starship.rs/ is a great way to customize your command line, and it’s fairly fast as it’s written in Rust.

          3. There are many options for autocomplete: there’s ble.sh; there’s also just setting set show-all-if-ambiguous on and TAB:menu-complete in your .inputrc; plus there exist context-specific autocompletions for things, like for Git.

          4. Some terminals, like my current fave (WezTerm), can interpret sixel graphics, which means you can ask them to display image data right in the terminal. The ImageMagick “convert” utility has a sixel output option.

          5. Oil shell looks interesting. It’s backwards-compatible with Bash, but fixes many of Bash’s warts while still being Bashlike. https://www.oilshell.org/

          6. If you like functional languages, you’d probably like es-shell https://wryun.github.io/es-shell/ although it currently doesn’t have much critical mass. I have a fork of it (which I need to update with upstream changes and haven’t yet); it’s pretty cool, and also pretty old… but still relevant IMHO.

          7. Awk is underrated.

          8. Instead of remembering the precise incantation for something, wrap it in a function and make it part of your shell.
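
          As a sketch of point 8 (the function name and the flags here are just an example; goes in ~/.bashrc or similar):

          ```shell
          # Give the incantation a name, so the flags live here, not in your head.
          targz() {
            tar -czf "$1.tar.gz" "$1"
          }

          # Usage: targz mydir   -> creates mydir.tar.gz
          ```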