1. 109
    1. 50

      Regardless of whether you currently think your existing tools need replacing, I urge you to try ripgrep if you haven’t already. Its speed is just amazing.

      1. 7

        I’ll second this sentiment. Your favored editor or IDE probably has a plugin to use ripgrep and you should consider trying that too.

      2. 6

        As an experiment I wrote a tiny Go webservice that uses ripgrep to provide regex-aware global code search for the company I work at. The experiment worked so well over a code base of ~30GB that it will probably replace hound, which we use for this purpose at the moment. I did not even use any form of caching for this web service, so there is still performance to squeeze out.
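
        In the spirit of that experiment, here is a minimal sketch of the search core such a service wraps (in Rust rather than Go, and purely illustrative: the `tool` parameter exists only to make the idea visible; a real service would hardcode `rg` and put this behind an HTTP handler):

```rust
use std::process::Command;

// Illustrative sketch: run a grep-style tool and return its structured
// output. ripgrep's --json flag emits one JSON object per line, which is
// convenient to forward to a web frontend. `tool` is parameterized here
// only for illustration; the service described invokes rg directly.
fn code_search(tool: &str, pattern: &str, root: &str) -> std::io::Result<String> {
    let out = Command::new(tool)
        .args(["--json", "-e", pattern, root])
        .output()?;
    // Note: rg exits with code 1 when nothing matched; an empty stdout is
    // simply an empty result set, not an error.
    Ok(String::from_utf8_lossy(&out.stdout).into_owned())
}
```

        A handler would then pass the query parameter through as `pattern` and stream the output back; caching, as noted above, is left on the table.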

        1. 5

          https://github.com/phiresky/ripgrep-all comes with caching; it’s a wrapper around rg to search in PDFs, e-books, Office documents, zip, tar.gz, etc.

      3. 3

        ripgrep and fd have changed the way I use computers. I’m no longer so careful about putting every file in its right place and having deep, but mostly empty, directory structures. Instead, I just use these tools to find the content I need, and because they’re so fast, I usually have the result in front of me in less than a second.

        1. 5

          You should look into broot as well (aside, it’s also a Rust application). I do the same as you and tend to rotate between using ripgrep/fd and broot. Since they provide different experiences for the same goal sometimes one comes more naturally than the other.

          1. 2

            broot is sweet, thanks for mentioning it. Works like a charm and seems super handy.

      4. 1

        3 or 4 years ago it was announced that the VS Code “find in files” feature would be powered by ripgrep. Anyone know if that’s still the case?

    2. 28

      3 of the 17 tools are from @sharkdp.

      He wrote:

      hexyl could also be added to this list as a replacement for xxd.

      1. 4

        bat is great; it is used in neuron to provide search preview on the console along with fzf and ripgrep.

      2. 1

        Why did they pick new names?

        Wouldn’t it be possible (and simpler for migration) to replace the existing tools instead, like, e.g., BSD did when they replaced the Linux tools with their self-implemented ones? (Just in this case for safety, not ideology.)

        1. 17

          It can be tricky to replace the standard ones since third party scripts might depend on a particular implementation detail you missed in your clone.

        2. 11

          Isn’t BSD older than Linux, and based on actual Unix? Maybe GNU is a better example of replacing the original Unix tools.

        3. 5

          That would probably be something like this: https://github.com/uutils/coreutils

        4. 3

          The Linux utilities were named after the BSD ones. The BSD ones share a history with the AT&T Unix ones. Sometimes the Linux ones are disambiguated with a “g” prefix (for “GNU”).

        5. 2

          To allow breaking changes to the API (which, for a CLI, is just flags and behaviour).

    3. 23

      I just want to say that I am excited about Rust, it’s gotten me interested in programming again, and the tools on this list have gotten me excited about the command line for the first time in years. I recently switched back to Linux from macOS after more than a decade of loyalty to Apple, in part because I’m currently more excited about projects like these than I am about Apple’s GUIs.

      @sharkdp, @burntsushi, and everyone else working in this space, thank you for your efforts, and don’t let the haters get you down.

      1. 18

        Much appreciated. :-)

      2. 14

        Thank you for the feedback!

    4. 16

      I asked this last time, but does anyone know why “rewritten in rust” and “overuse of colour and emojis” correlate? I have no need to switch from coreutils, but as someone who disables colours in my terminal sessions, I wouldn’t even want to (with the exception of ripgrep, where I get the technical advantage over rgrep).

      1. 32

        I kind of think that “overuse of color and emojis” is a bit of an oversimplification, but I take your meaning. Or at least, I might say, “more thought and care given toward the overall user experience.” However, you might disagree with that, since you might think that colors and emojis actually make the user experience worse. (Although, to be honest, I’m not sure how much these tools use emojis.) With that said, I think it’s at least reasonable to say that many of the “new” tools (and not just Rust) are at least paying more attention to improving the overall user experience, even if they don’t actually improve it for every user. For example, I know for ripgrep at least, not everyone likes its “smart” filtering default, and that is absolutely a totally reasonable position to have. There’s a reason why I always include smart filtering in every short description of ripgrep; if you aren’t expecting it, it is not only surprising but frightening, because it violates your assumptions of what’s being searched. It’s a disorienting feeling. I know it all too well.

        As for why this is happening, I’m not sure. If we wanted to get hand wavy about it, my personal take is that it’s some combination of lower barriers to entry to writing these kinds of tools and simultaneously providing more head space to even think about this stuff. So that means that you not only have more people entering the space of writing CLI tools, but you also have more breathing room to pay attention to the finer details of UX. This isn’t altogether surprising or unprecedented. Computing history is littered with building new and better abstractions on top of abstractions. As you move higher up the abstraction ladder, depending on the quality of said abstractions, you get more freedom to think about other things. This is, after all, one of the points of abstraction in the first place. And Rust is definitely an example of this IMO. And it’s not just about freeing yourself from worry about undefined behavior (something that I almost never have to do with Rust), but also about easy code reuse. Code reuse is a double edged sword, but many of these applications shared a lot of code in common that handle a lot of the tricky (or perhaps, “tedious” is a better word) details of writing a CLI application that conforms to common conventions that folks expect.

        I also don’t think it is the only phenomenon occurring either. I think building these kinds of tools also requires tapping into a substantial audience that no longer cares (or cares enough) about POSIX. POSIX is a straitjacket for tools like this, and it really inhibits one’s ability to innovate holistically on the user experience. The only way you can really innovate in this space is if you’re not only willing to use tools that aren’t POSIX compatible, but build them as well. My pet theory is that the pool of these people has increased substantially over the past couple decades as our industry has centralized on fewer platforms. That is, my perception is that the straitjacket of POSIX isn’t providing as much bang for its buck as it once did. That isn’t to say that we don’t care about portability. We do. There’s been a lot of effort in the Rust ecosystem to make everything work smoothly on Linux, macOS and Windows. (And POSIX is a big part of that for Unix, but even among Unixes, not everything is perfectly compatible. And even then, POSIX often doesn’t provide enough to be useful. Even something as simple as directory traversal requires platform specific code. And then there’s Windows.) But beyond that, things drop off a fair bit. So there’s a lot of effort spent toward portability, but to a much more limited set of platforms than in the older days. I think the reason for that is a nearly universal move to commodity hardware and a subsequent drop in market share among any platform that isn’t Windows, macOS or Linux.

        Sorry I got a bit rambly. And again, these are just some casual opinions and I didn’t try to caveat everything perfectly. So there’s a lot of room to disagree in the details. :-)

        1. 7

          Just to provide feedback as a user of ripgrep, xsv, bat, and broot: I have experienced no annoyance with respect to colourization or emojification of my terminal emulator. If I had to hypothesize, I’d say easy Unicode support in Rust allows people to embed emojis, so they do.

        2. 4

          The key is overuse. Some colour can sometimes be very helpful! But most of these tools paint the screen like a hyperactive toddler instead of taking the time to think of what would improve the user’s experience.

          1. 26

            taking the time to think of what would improve the user’s experience

            I addressed this. Maybe they have taken the time to think about this and you just disagree with their choices? I don’t understand why people keep trying to criticize things that are borderline unknowable. How do you know how much the authors of these tools have thought about what would actually improve the user experience? How do you know they aren’t responding to real user feedback that asks for more color in places?

            We don’t all have to agree about the appropriate amount of color, but for crying out loud, stop insinuating that we aren’t taking the appropriate amount of time to even think about these things.

            1. 2

              “How much colour is too much colour” is kind of an empirical question; while design is certainly some matter of taste and trade-offs, generally speaking human brains all work roughly the same, so there is one “perfect” design (or a small range of them). It seems quite a different problem from ripgrep’s smart filtering, which you mentioned in your previous comment, and which has more to do with personal preference and expectations.

              See for example these Attention management and Color and Popout pages; the context there is very different (flight control systems), but it’s essentially the same problem as colour usage in CLI programs. I don’t know if there’s more research on this (I’ve been meaning to search for it for a while, but haven’t gotten around to it yet).

              Have some authors spent a long time thinking about this kind of stuff? Certainly. But it’s my observation, based on various GitHub discussions and the like, that a lot of the time it really does get added willy-nilly because it’s fashionable, so to speak. Not everything that is fashionable is also good; see the thin-grey-text-on-websites fashion, for example (which has thankfully died down a wee bit now), which empirically makes things harder to read for many.

              When I worked on vim-go, people would submit patches to the syntax highlighting all the time, adding something for some specific thing. Did that improve readability for some? Maybe, I don’t know. For a while most of these patches were accepted because “why not?” and because refusing patches is kind of draining, but all of the maintainers agreed that the added colouring didn’t really improve vim-go’s syntax highlighting and was superfluous at best. There certainly wasn’t a lot of thought put into this on our part, to be honest, and when we started putting thought into it, it was too late and we didn’t want to remove anything and break people’s stuff.

              1. 6

                “How much colour is too much colour” is kind of an empirical question; while design is certainly some matter of taste and trade-offs, generally speaking human brains all work roughly the same, so there is one “perfect” design (or a small range of them). It seems quite a different problem from ripgrep’s smart filtering, which you mentioned in your previous comment, and which has more to do with personal preference and expectations.

                While I agree that it’s a quantifiable question, there are two classic problems here.

                All quantifications in user design are of the form “70% of users find this useful” for statement A and “60% don’t find it useful” for statement B. The often-committed mistake is then assuming that you should implement “A & ^B”, ignoring that you now need to analyse the overlap.

                The second is that good quantification is a lot of work and needs tons of background knowledge, with the standard books on colour and interface perception doubling as effective close-combat weapons.

                A classic answer to the above problem is that good UI uses at least two channels, potentially configurable. So if the group that doesn’t find B useful isn’t having problems with it, having both is a good option. Your cited Color and Popout page is a very good example of that. And it gracefully degrades for people who, e.g., do not see color well. Emoji-based CLI programs do this especially well: emoji don’t take up a lot of space, are easily differentiable, and are accessible to screen readers while still keeping their symbolic character; the line after them carries the details for people who need them.

                I agree with your fashion argument, but see it in a much more positive light: user interface trends have the benefit of making good basics the default, if they are successful. This is the community learning from practice. I would say that the phase of gray text made the design community realise that readability is not optional. This may seem trivial, but it isn’t surprising that this trend came up at a time when visual splendor was much more easily available on websites and was the focus of attention.

                For a practical summary of research and reading, I can highly recommend “Information Visualization: Perception for Design” by Colin Ware. Take care, though: it was updated to the 4th edition this year and many vendors still try to sell you the 3rd. For a book of around $70, I’d hate it if you fell into that trap ;). It’s the book I learned from in university courses and found it very accessible, practical, but also scientifically rigorous. It also spends a lot of time on when visual encoding should be applied and when not, and has clarity and accessibility as its biggest goals.

                Also, even scientific research isn’t protected from the fads you describe: for a long time, with enough computational power available, everyone tried to make visualisations three-dimensional. That’s generally seen as a mistake today, because either you just add fake depth to your bar diagram while it still remains essentially 2D (wasting the channel), or you run into problems of perspective and occlusion, which make it hard to judge distances and relationships and force you to rotate the image all the time, because 3D data is still projected onto 2D. Reading 3D data is a special skill.

          2. 4

            What are some examples? I’m curious what makes you think the authors did not consider user experience when implementing nonstandard features specifically in pursuit of user experience. No doubt their efforts may not land well with some users. I just think it’s a bit dismissive to assume that the authors didn’t put thought into their open source projects, and pretty rude to characterize the fruits of their labor as a “hyperactive toddler”.

            1. 8

              As a personal data point: I use fd, ripgrep, and hexyl, and they’re fine. However, I tried exa (a replacement for ls) and exa -l colors absolutely everything, which I find overwhelming compared to ls -l (which for me colors just the file/directories/symlinks). To me it seems like exa developers pushed it a bit too far :-)

              1. 5

                Cool. It definitely seems that exa in particular colorizes a lot of things by default. My initial thought is “wouldn’t it be nice if I could customize this” and it turns out you totally can via the EXA_COLORS variable (see man exa).

                I think the ideal colorized tool would roughly do the following: make coloring configurable, ship with reasonable defaults, and include presets for users with color blindness or other disabilities, and for those who prefer no color at all.
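
                As a sketch of what “configurable” means mechanically: EXA_COLORS (like GNU ls’s LS_COLORS) is a colon-separated list of key=value pairs, and parsing it is only a few lines (illustrative code, not exa’s actual implementation):

```rust
use std::collections::HashMap;

// Illustrative: parse an LS_COLORS/EXA_COLORS-style specification such as
// "di=34:ln=36" into a key -> style lookup table. A tool with this shape of
// configuration can ship defaults and let presets simply override the map.
fn parse_colors(spec: &str) -> HashMap<String, String> {
    spec.split(':')
        .filter_map(|pair| {
            // Silently ignore malformed entries without an '='.
            let (key, value) = pair.split_once('=')?;
            Some((key.to_string(), value.to_string()))
        })
        .collect()
}
```

                A “no color” preset then degenerates to an empty map, and accessibility presets are just alternative maps merged over the defaults.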

                1. 2

                  exa -lgh --color=never

                  Seems flag-heavy, but that’s just me, and there’s probably more than one way to do it.

                  1. 7

                    Flag heaviness doesn’t matter much in this case though, since it can be trivially aliased in shell configuration.

              2. 3

                compared to ls -l (which for me colors just the file/directories/symlinks).

                This is likely local configuration, whether you’re aware of it or not. GNU ls will happily color more or fewer things, in different ways, based on the LS_COLORS environment variable and/or configuration files like ~/.dir_colors. See also the dircolors utility.

                1. 1

                  Interesting, TIL. I don’t have a ~/.dir_colors but LS_COLORS is indeed full of stuff (probably added by fish?). In any case, exa was a bit much, the permissions columns are very very colorful. Maybe it’s to incentivize me to use stricter permissions 😂

              3. 0

                Agree. Most people’s shell prompts are essentially rainbow unicorn vomit.

        3. 1

          If we wanted to get hand wavy about it, my personal take is that it’s some combination of lower barriers to entry to writing these kinds of tools and simultaneously providing more head space to even think about this stuff

          It seems plausible; adding colours or UI extensions sounds like a good “first patch” for people learning Rust and wanting to contribute to “real world” projects.

          1. 6

            That’s not exactly what I had in mind. The syntax highlighting that bat does, for example, is one of its central features AFAIK. I don’t know exactly how much integration work it took, but surely a decent chunk of the heavy lifting is being done by syntect. That’s what I mean by headspace and lower barriers to entry.

      2. 17

        why “rewritten in rust” and “overuse of colour and emojis” correlate?

        JS community does the same. I think it’s not specific to Rust, but specific to “modern” rewrites in general (modern for better or worse).

        I see a similar thing in C/C++ rewrites of old C software – htop and ncmpcpp both use colours while top and ncmpc did not. Newer languages, newer eyecandy.

        1. 5

          JS community does the same. I think it’s not specific to Rust, but specific to “modern” rewrites in general (modern for better or worse).

          The word “modern” is a particular pet peeve of mine. It’s thrown around a lot and doesn’t seem to add anything to most descriptions. It is evocative without being specific. It is criticism without insight. Tell me what makes it “modern” and why that is good. The term by itself means almost nothing, which means it can be used anywhere.

          1. 2

            AIUI “modern” as it relates to TUIs means “written since the ascendancy of TERM=xterm-256color and Unicode support, and probably requires a compiler from the last 10 years to build.” Design-wise, it’s the opposite of “retro”.

            I don’t see how it’s a criticism (what’s it criticizing?), or why every word needs to be somehow insightful. It’s just a statement that it’s a departure from tradition. It’s like putting a NEW sticker on a product. It doesn’t mean anything more than “takes more current design trends into account than last year’s model”.

            1. 1

              I think a “new” sticker on a product tells you more than sticking “modern” on a software project page. At least you know it isn’t used/refurbished. What constitutes modern is a moving target. It may be helpful if you had knowledge of the domain in which it’s being used, but otherwise it’s just fluff.

              Worse, I think it doesn’t present a nuanced view of the design choices that go into the product. In my mind it subtly indicates that old is bad and new is good. That thinking discourages you from learning from the past or considering the trade-offs being made.

              Moreover I think it bugs me because I work in a NodeJS shop. When I ask people what’s great about a package they tell me it’s modern. It’s just modern this or modern that. It barely means anything. So maybe take this with a grain of salt.

              1. 2

                Huh. I think this must be a cultural difference. Working with C and C++ packages, ‘modern’ has a bit more meaning because of the significant changes that have happened in the languages themselves in a reasonably recent fraction of their existence. (For example, “modern” C++ generally avoids raw pointers, “modern” C generally doesn’t bother with weird corner cases on machines that aren’t 32 or 64 bit architectures I can currently buy)

                It’s even true, to a lesser extent, in Python: “modern” usually refers to using async/generators/iterators as much as possible. While I agree that “modern” definitely lacks nuance, it fits in an apt package description and means roughly “architected after 2010,” and I think that’s a reasonable use of six characters.

                1. 2

                  Here’s another way of looking at it:

                  You make a library. It’s nice and new and modern. You put up a website that tells people that your package is modern. The website is good, the package is good. It’s a solved problem so you don’t work on it any more. Ten years pass and your website is still claiming that it is modern. Is it? Are there other ways that you could have described your project that would still be valid in ten years? In twenty years?

                  The definition of modern exists in flux and is tied to a community, to established practices, and, critically, a place in time. It is not very descriptive in and of itself. It’s a nod, a wink, and a nudge nudge to other people in the community that share the relevant context.

                  1. 1

                    I definitely see your point, but I’d also argue that if I put something on the internet and left it alone for 10 years, it would be obvious that its “modern” (if it’s still up at all) is that of another age. If you’d done this 10 years ago, you’d likely be hosted on SourceForge, which these days is pretty indicative of inactivity. It also doesn’t change that your package is appreciably different from the ones serving a similar purpose that are older.

                    There are buildings almost 100 years old that count as “modern” (also, there are “modern” buildings made after “postmodern” ones. Wat?). It’s a deliberately vague term for roughly “minimal ornament, eschewing tradition, and being upfront about embracing the materials and techniques of the time”; what “the time” is is usually obvious because of this (and IMO it is in software as well). The operative part isn’t that it’s literally new, more that it’s a departure from what was current. And when a modern thing gets old, it doesn’t stop being modern; it just gets sidelined by things labelled modern that embrace the tools and techniques of a later time. Architects and artists don’t have an issue with this; why should we?

                    libuv is, I think, a good example. I’d call it “modern”, but it’s not new. That said, it doesn’t claim to be.

                    Honestly, given how tricky it is for me to pin this down I feel like I should agree with you that it’s cruft, but I just… Don’t… I think it’s cause there’s such a strong precedent in art and architecture. Last time I was there, Half of the museum of modern art was items from before the Beatles.

                    I do think it sounds a bit presumptuous

                    1. 1

                      Honestly, given how tricky it is for me to pin this down I feel like I should agree with you that it’s cruft, but I just… Don’t… I think it’s cause there’s such a strong precedent in art and architecture. Last time I was there, Half of the museum of modern art was items from before the Beatles.

                      Haha, well, I think we’ll have to agree to disagree then.

                      Ultimately, I’m being a bit of a hardliner. There is value in shorthand, and to be effective we need to understand things in their context. I think being explicit allows you to reach a wider audience, but it is more work and sometimes we don’t have the extra energy to spread around. I’d rather have the package exist with imprecise language than have no package at all.

        2. 2

          That’s a fair point, I guess I have just been noticing more Rust rewrites, or haven’t been taking JS CLI-software seriously?

          1. 6

            I don’t blame you – I haven’t been taking JS software seriously either ;) Whenever I see an interesting project with a package.json in it I go “ugh, maybe next time”. Rust rewrites at least don’t contribute to the trend of making the software run slower more rapidly than the computers are getting faster.

      3. 10

        but as someone who disables colours in my terminal sessions

        As someone who appreciates colors in the terminal, I’m pretty into it. I think it’s just a personal preference.

        1. 2

          Wrong, but ok ;)

          But seriously: I don’t think so many tools and projects would be putting the effort into looking the way they do, if nobody wanted it. I just think that colour is better used sparingly, so that issues that really need your attention are easier to spot.

      4. 10

        Because it’s easy in Rust. It has first-class Unicode support, and convenient access to cross-platform terminal-coloring crates.
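
        To illustrate how low the bar is: even without crates like termcolor or ansi_term, emitting an ANSI color escape is a one-liner (the crates add the genuinely hard parts, such as Windows console support and deciding when not to colorize):

```rust
// Minimal ANSI coloring sketch: wrap text in the escape sequence for green
// (SGR code 32) and reset afterwards. A real tool should also check whether
// stdout is a terminal before emitting escapes.
fn green(s: &str) -> String {
    format!("\x1b[32m{}\x1b[0m", s)
}
```

        With the cost of adding color this close to zero, it’s unsurprising that new tools reach for it.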

      5. 5

        I suspect that the pool of tool users has expanded to incorporate people with different learning styles, and also that as times change, the aesthetic preferences of new users track aesthetic changes in culture as a whole (like slang usage and music tastes).

        Personally, I find color extremely useful in output, as it helps me focus quickly on the important portions of the output first, and then lets me read the rest at leisure. I’ve been using *nix since I was a kid, and watching tools evolve to have color output has been a joy. I do find certain tools to be overly colorful, and certain new tools to not fit my personal workflow or philosophy of tooling (bat isn’t my cup of tea, for example). That said, not all “modern” rewrites feature color, choose being the example that comes up for me immediately.

        (On emojis I’m not really sure, and I haven’t really seen much emoji use outside of READMEs and such. I do appreciate using the checkmark Unicode character instead of the ASCII [x] for example, but otherwise I’m not sure.)

      6. 3

        I think it is more of a new-tool trend than a new-language trend. I see a similar issue in other new tools not written in Rust.

      7. 2

        Perhaps it’s simply that Rust has empowered a lot of young people, and young people like colors and emojis?

      8. 1

        I wrote this blog post as an answer to this article. I am also wondering why this “overuse of color” is so popular among “rewritten in rust”-style tools.

      9. 1

        I think this is generally true of CLI tools written since Unicode support in terminals and languages became commonplace. I don’t have any examples, but I’ve gotten a similar impression from the Go community. I think emojis and colors in terminals are kind of in vogue right now, as is rewriting things in Rust, so… yeah, that’s my hypothesis on the correlation.

        Aside, as someone with rather bad visual acuity and no nostalgia for the 80s, I like it.

    5. 7

      I wrote cw as a hopefully-fast wc clone, with multiple code paths for various flag combinations and support for threading. Was a fun little exercise.
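
      To give a flavor of what one of those code paths looks like (a hand-written sketch, not cw’s actual source): when only line counts are requested, there is no need to decode UTF-8 or split words, so a plain byte scan suffices:

```rust
use std::io::Read;

// Illustrative fast path for a wc-style tool: when only `-l` is asked for,
// counting occurrences of b'\n' in raw byte chunks is enough; no UTF-8
// decoding or word splitting required. Other flag combinations would take
// different, more expensive paths.
fn count_lines(mut r: impl Read) -> std::io::Result<u64> {
    let mut buf = [0u8; 64 * 1024];
    let mut lines = 0u64;
    loop {
        let n = r.read(&mut buf)?;
        if n == 0 {
            break; // EOF
        }
        lines += buf[..n].iter().filter(|&&b| b == b'\n').count() as u64;
    }
    Ok(lines)
}
```

      Splitting flag combinations into dedicated paths like this is what lets such a clone skip work the general case would have to do.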

      Also note exa is currently languishing, and you probably want to use a fork for the time being.

      1. 3

        That’s silly. A commit was made in January, and it’s easy enough to do a personal build. It’s not ideal, but it’s certainly not the only project like that. Plus, many times a user might want a HEAD build anyway. Ideally a once-a-year release would be nice, but as long as commits are still happening, that’s not “languishing”.

        1. 5

          adjective: languishing — failing to make progress

          Seems appropriate to me: it’s been 59 weeks since the last release and 32 weeks since the last commit, and a fair few fixes for quite annoying issues have piled up since.

          Someone suggesting building your own from a fork of a repository is unlikely to require lecturing on the ease of building your own.

          1. [Comment from banned user removed]

    6. 7

      Interesting how a nicely compiled list of CLI tools & useful animated GIFs of their usage has resulted in flame wars over programming language, color choices, emoji uses, and pipelines.

      1. 6

        I’m pretty sure lobste.rs is an acronym that means ‘mostly flame wars about rust and go’.

    7. 5

      One area that seems missing in all this is something for user-focused network diagnostics (think ifconfig / ping / traceroute). Is there anything that can show me at a glance, across multiple OSI layers, why something is broken? Say I just know that I can’t see google.com. It would be nice to have one command that shows whether this is because of {cable_unplugged, dns_broken, no_packets_route_past_your_router, …}
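
      A crude sketch of the idea, to make it concrete (hypothetical code, nowhere near a finished tool: it only distinguishes DNS failure from TCP unreachability, and a real diagnostic would also inspect interfaces, link state, and routes):

```rust
use std::net::{TcpStream, ToSocketAddrs};
use std::time::Duration;

// Hypothetical layered check: resolve the name first, then try to connect.
// The returned strings mirror the failure classes named in the comment above.
fn diagnose(host: &str) -> &'static str {
    let addrs: Vec<_> = match (host, 80).to_socket_addrs() {
        Err(_) => return "dns_broken",
        Ok(iter) => iter.collect(),
    };
    for addr in addrs {
        if TcpStream::connect_timeout(&addr, Duration::from_secs(2)).is_ok() {
            return "ok";
        }
    }
    "no_route_to_host"
}
```

      The point is that each layer’s check is cheap on its own; what’s missing is a tool that runs them in order and reports the first one that fails.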

      1. 1

        Like the useless “network diagnostics wizard” in Windows? :)

    8. 4

      Big fan of all these tools. Honestly, my primary reason for fandom is that the language is much more approachable - easier syntax, easier dependency management, easier compilation. This means I can edit them much more easily. I have insignificant forks of these for local nonsense. This is the good life.

      I’d honestly never even try to edit find, mostly because the thought of having to write C bothers me. I just know I’ll fuck up memory management somehow and the tool will segfault.

      Plus either these guys are superstars or the language lends itself to making the architecture really good. I added an xsv command with some flags to do some stats and it was like 30 mins of work. Then I compile it, stick it in my ~/bin and I am more powerful than the gods. At the time, I learnt enough Rust to make that happen. Barely even knew the language well.

    9. 2

      I’ll mention two more Rust alternatives to zoxide: pazi and autojump-rs.

      Disclaimer, I wrote pazi. It’s functionally pretty similar to zoxide from what I can tell, though in pazi I’ve managed to code up some benchmark tests to better figure out how to make it fast.

      It’s neat how many autojumpers have been written in rust, and next time I get some spare time I’ll have to add zoxide into the benchmark matrix and see how pazi does against it!

    10. 2

      My main problems with these similar-but-not-complete rewrites are often the same.

      a) I don’t see a problem with the old one, which is available everywhere - so I only install it on one machine and then forget to use it. (Think I tried exa when it was really new and found the binary again after months, or years)

      b) they’re not packaged in the distros and they move too fast to be “stable” and available, thus a)

      The only exception is ripgrep, which I’ve been installing on my machines basically since it came out (replacing ag), because it’s something I typically use several times per day and it’s 100% better for me. I don’t use cat often (and when I do, it works), and I’m not a huge fan of exa; ls is fine for me.

      No, I’m not against progress per se, but I dislike relying on special tools for a 0-5% productivity increase, with the possibility that they’re not available.

    11. 1

      It would be much better to add -g (or similar flag) to the already existing nice-working command line tools.

      1. 15

        It would be much better to add -g (or similar flag) to the already existing nice-working command line tools.

        Most of the existing command line tools have an interface that hasn’t changed in the last 20+ years.
        If you want to try something else, it can be a lot easier to create a new tool.

        Modifying a Tool

        Pros:

        • If your changes are accepted then someday you may not have to install your package.
        • Someday other people might be able to use your package without installing it.
        • Collaborating with others can be a positive experience.

        Cons:

        • Upstream sometimes makes decisions you disagree with. RMS can and does (did?) veto decisions on some GNU projects.
        • The existing tools are written in C and sometimes the source is a maze of twisty ifdefs. You may want to use a different language or have different requirements for portability. The reference implementation of NTP, for example, still supports a number of discontinued OSes and architectures.
        • Some changes are effectively impossible to make. You might be able to add an option but you probably can’t change the meaning of an existing option or the default behavior.
        • There’s more to open source development than changing the code. The maintainers have their own goals for the project and some projects require a CLA, peer review, public discussions, etc. If your vision and theirs disagree then the change won’t be accepted, e.g. the repeated attempts to introduce strlcpy, strlcat, strtonum to glibc.
        • There may be multiple upstreams. The best possible case for changing the ls command, for example, in all open source OSes is probably 3+ years.

        Edited to add additional reasons and explanations.

        1. 1

          Thanks for the list, in my opinion the pros outweigh the cons here.

          1. 4

            When do they not? At what point are we allowed to reimagine new tools without someone complaining that we should have just contributed it back to coreutils? Like, can you imagine someone submitting a patch to cat to enable syntax highlighting and that ever being accepted?

            1. 1

              Couple of points:

              1. I haven’t complained, I suggested a better solution in my opinion.
              2. I think there is a difference between a fork and a rewrite. Fork is better since it keeps all the knowledge and it is implemented in the original implementation language.
              3. I think that if a contribution is objectively bringing benefits to most of the users, it will be merged. In this particular example, I don’t see how coloring the terminal by cat can be useful for most of the users.

              For example, diff’s --color option was added to GNU diffutils 3.4 (2016-08-08)

              1. 14

                In this particular example, I don’t see how coloring the terminal by cat can be useful for most of the users.

                Which is exactly why people go out and build their own tools. Because maybe there are different target audiences or maybe there are different definitions for the word “useful.”

                To be honest, I cannot understand your position at all. If these were small tweaks, okay, maybe try to send a patch upstream or maintain a fork. But we’re talking about major rethinking on how these tools work. It’s just not practical to modify the original tools to do it. Look at GNU grep, for example. People have tried and failed to make it use parallelism, because the existing code is such a mangled mess that it just isn’t feasible to do it.

                I mean, you suggest “just contribute the changes back upstream,” but upon further inspection, what you actually meant was, “just contribute the changes that I think are “objectively” bringing benefits to most of the users back upstream.” That’s quite a different suggestion, don’tyathink? In particular, it leaves itself a gaping hole: what am I supposed to do when I want to write some software that’s kind of like an existing piece of software, but where you (or whoever) thinks my changes are not “objectively” bringing benefits to most users? What am I supposed to do? Eh?

                I haven’t complained, I suggested a better solution in my opinion.

                When I first announced ripgrep ~4 years ago, someone on HN moralized me about why I didn’t just contribute the changes back to GNU grep instead. It was an intensely frustrating exchange, so perhaps I’m still sensitive to it. But the suggestion is honestly just so ridiculous that I don’t understand how anyone could seriously make it.

                It’s the cycle of software. Software gets written, it gets popular, it (usually) stagnates into maintenance mode because you wind up wanting to avoid breaking changes. There are some obvious exceptions to this, but it’s pretty common for a program’s lifecycle to look like this. And then eventually, someone says, “no, it can be done better, but to do better, we need to revisit the problem from first principles.” It’s just fundamentally incompatible with programs that have reached a level of maturity in their lifecycle where major changes are either impossible or monumentally difficult. So your one sentence off-the-cuff suggestion winds up ringing incredibly hollow.

                1. 1

                  Very interesting conversation, let’s continue.

                  It’s just not practical to modify the original tools to do it. Look at GNU grep, for example. People have tried and failed to make it use parallelism, because the existing code is such a mangled mess that it just isn’t feasible to do it.

                  It is your opinion that it is not practical. For you or others it can seem like a “mangled mess”. This is (computer) science, given a model, add some properties/abilities to it preserving all the previous results. I trust contributors and maintainers to decide if contributions are worthy of inclusion.

                  It’s the cycle of software. Software gets written, it gets popular, it (usually) stagnates

                  My opinion is that it saturates, not stagnates. Look at LaTeX for an explicit example of this. In the UNIX philosophy, one tool has one (small) goal (list files in a directory, for example); once the goal is achieved the tool is “done” and only needs to be maintained - and I think this is OK. I don’t see the need to rewrite that software after its “cycle” completed (?). Also the “rewrite” is sort of a lie. No one can guarantee 100% backward compatibility, since it takes a similar amount of time to test on different platforms/architectures, and by that time, the time for the next cycle will come. Thank you, but no thank you.

                  I think the bigger driver of the rewrite is that devs don’t want to really dig into (old) code and understand what previous devs meant. But that’s their problem. I’ve seen pull requests in scientific software that have been open for years (PhD-thesis length) and they got merged when the time and quality was right.

                  When I say “I think this feature is not needed or needed” I mean I don’t care that it is there or not, I trust maintainers to decide for me what is good or not. And if I don’t like how the tool evolves I will either (1) ask maintainers, (2) create a pull request that does what I want, (3) fork it, (4) find other tool. If I’m not competent enough to create a pull request that has code quality on par with the code base I won’t say: “because the existing code is such a mangled mess that it just isn’t feasible to do it.”.

                  1. 8

                    My opinion is that it saturates, not stagnates. Look at LaTeX for an explicit example of this. In the UNIX philosophy, one tool has one (small) goal (list files in a directory, for example); once the goal is achieved the tool is “done” and only needs to be maintained - and I think this is OK. I don’t see the need to rewrite that software after its “cycle” completed (?)

                    “saturates” is fine. I never implied that software had to get rewritten, just that, in most cases, it must IF certain classes of improvements that end users find useful require rethinking the problem from first principles and the current maintainers (or ecosystem) aren’t willing to do that. If there’s one thing about these “rewrites” that is impossible to deny, it’s that there exists a fairly sizable set of people who find them useful and appreciate the extra features.

                    Also the “rewrite” is sort of a lie. No one can guarantee 100% backward compatibility

                    Oh, good thing none of these tools guarantee 100% backward compatibility then. (Have you even looked at or tried the tools we’re discussing? Or did you just skim the OP?) “rewrite” is being used in a very vague manner. A criticism of the headline based on a strict interpretation of the words used is totally fair, but it’s completely uninteresting.

                    As for the rest of your comment, I find many of the things you’re saying to be so far removed from reality that I don’t think there is any meaningful shared understanding between us.

                    1. 0

                      Sounds good.

                      so far removed from reality

                      From your reality.

    12. 1

      Do any systems bundle tools like these in place of, and with the same names as, the POSIX tools they might replace? In the same way that more is often the same file as less?

    13. [Comment removed by author]

      1. 45

        Being the maintainer of three of these tools and a long time command-line enthusiast, it’s just sad to read a comment like this.

        I can understand that people do not want to try out new tools. Fine. But why criticize them without even taking a proper look?

        As others have already pointed out, most of these tools are not breaking pipeline usage. The README for bat has a whole section on “integration with other tools”, showing multiple examples of pipeline usage. fd’s README also points out several examples on how to pipe fd’s output to other programs (e.g. xargs, parallel, as-tree).

        I have a feeling that a lot of people react allergic to colorized command-line output, and I absolutely do not understand why. Just because a tool doesn’t look like 1970 doesn’t mean that it’s not a proper citizen of the command-line world. But that’s not even the point. My tools do not provide colorized output to look “fancy”. They use colorized output because I think it’s actually pretty damn useful. Also, there is always the possibility to easily switch them off if you really don’t like it (set NO_COLOR or use --color=never in an alias).
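
        For what it’s worth, that opt-out logic can be a one-line predicate. A hypothetical sketch in Rust (the `use_color` helper is mine, not bat’s or fd’s actual code; `IsTerminal` needs Rust 1.70+, and NO_COLOR is the informal convention from no-color.org):

        ```rust
        use std::env;
        use std::io::{stdout, IsTerminal};

        // Honor the informal NO_COLOR convention and never color non-tty output.
        // `force_plain` stands in for a --color=never style flag.
        fn use_color(force_plain: bool, stream_is_tty: bool) -> bool {
            !force_plain && env::var_os("NO_COLOR").is_none() && stream_is_tty
        }

        fn main() {
            let colorize = use_color(false, stdout().is_terminal());
            println!("colorize: {colorize}");
        }
        ```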

        Finally, I am really sick of the implied “rewritten in Rust” stigma. No, I did not write these tools in Rust because I wanted to demonstrate something. Or to show off. I simply like to try different programming languages from time to time and I think that Rust is an awesome fit for command-line tools. fd was actually started as a C++ project. I just moved it to Rust at some point. Mainly because of boost packaging issues and because I discovered the awesome Rust libraries by @burntsushi. Criticizing a tool for the language it’s written in is fine as long as there is an actual argument why that programming language was a poor choice for the job.

        1. 3

          I have a feeling that a lot of people react allergic to colorized command-line output, and I absolutely do not understand why.

          First, let me say that I really appreciate the effort in trying to keep moving things forward. That means change, and change is not always good.

          I’m pretty allergic to colors in command line tools - but that is mostly because of awful default logging/output formats from ruby/rails - and not for tools like these that properly detect piping etc.

          I’m also slow to adopt new tools - I still tend to prefer gnu find to fd, haven’t quite been able to put fzf to work - but I’m not quite hopeless - I do use ripgrep quite a bit.

          I think quality matters (more) when using colorization - and it takes a village to get there. The recent generations of colorized interactive ruby output is a boon (on Linux) - it looks good, and if I copy from gnome terminal and paste into nvim - the escape sequences vanish. But on windows servers with the standard windows console - even the line breaks do not work properly, and colorization is hit and miss (colors or escape sequences? It depends…).

          Many of us need to work on various heterogeneous deployment servers not under our control - and that makes us thirsty for the simplest tools - to the point of sometimes preferring a predictable dull rock to a scalpel that might shatter at any moment. But it sometimes makes the transition from skinning rabbits to performing brain surgery… more challenging.


        3. [Comment removed by author]

          1. 14

            sad as in trump-sad or you are sad? if the latter, i’m sorry.

            Sorry, this seems completely uncalled for. One of the reasons I like this site is its relative lack of political drama and focus on technology. I don’t know what Donald Trump has to do with this technical disagreement, but I don’t think he’s relevant to it, and I don’t think the mention of him will resolve anything.

            1. 1

              *sigh

              i’ll just let myself out. thanks for all the fish.

          2. 14

            Thank you for the response.

            sad as in trump-sad or you are sad? if the latter, i’m sorry.

            It makes me sad because you invest countless hours into a project and then someone disregards it within seconds with a low-effort comment like this. Other people looking at this thread will see the wrong statements (“breaking pipelines”) and might take them for granted.

            i dislike it because it’s an ugly hack on top of emulated 70s hardware, and breaks.

            I absolutely agree with you on this point. I just don’t see any alternative. Is someone working on a completely redesigned, modern terminal API? It could probably still support ANSI escape codes for backwards compatibility.

            somehow git diff is broken on one of my machines where the automatically invoked “less” isn’t rendering escapes but printing them verbatim.

            I know you are probably not looking for a solution, but the problem might be that the -R/--RAW-CONTROL-CHARS command-line option is not passed to less on that system. It tells less to interpret those escape sequences. I guess most distributions set LESS="-R" by default.

            the title of the article says “rewritten in rust” as if it’s a seal of quality. i wouldn’t otherwise have made fun of that.

            That’s your interpretation, but okay. I see your point. It’s definitely not a seal of quality per se. There are, however, a few things that can be inferred from a choice of programming language (expected startup latency, memory safety guarantees, portability, ease of distribution, …). As stated above, I think that Rust is actually an excellent choice for command line tools. Because it delivers on all of the mentioned aspects. That doesn’t mean that it’s the only choice though.

      2. 26

        Most of the plumbing-replacement tools actually detect pipes and switch to machine-friendly output.

        1. 8

          I can confirm this for fd, bat and exa. I just had a quick look at those and at a first glance all of them seem to support pipelines just fine.

        2. 3

          let me flame in peace!!1

          thanks, i wasn’t aware of that magick.

          1. 3

            It’s pretty common actually, and not unique to Rust for that matter: https://man7.org/linux/man-pages/man3/isatty.3.html

            It’s as simple as if (isatty(fileno(stdout))) … else …
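
            A Rust equivalent of that C branch, using the `IsTerminal` trait stabilized in Rust 1.70 (the `render` helper and the green color are illustrative only, not from any of the tools discussed):

            ```rust
            use std::io::{stdout, IsTerminal};

            // Wrap `text` in ANSI green escape codes only when color is wanted.
            fn render(text: &str, color: bool) -> String {
                if color {
                    format!("\x1b[32m{text}\x1b[0m")
                } else {
                    text.to_string()
                }
            }

            fn main() {
                // The Rust analogue of C's isatty(fileno(stdout)): colorize for an
                // interactive terminal, emit plain text when piped.
                let colorize = stdout().is_terminal();
                println!("{}", render("hello", colorize));
            }
            ```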

            1. 11

              But! Quite annoying to do correctly on Windows in a way that works in both the cmd and MSYS shells: https://github.com/softprops/atty/blob/7b5df17888997d57c2c1c8f91da1db5691f49953/src/lib.rs#L101-L154

              That one was quite fun to figure out. I cribbed it from some other project, but can’t remember which one.

              1. 3

                I maintain a terminal library in D and yeah it is interesting to do it well on Windows (especially if you want to do a gui and a cli in the same exe… I thought that was impossible until I did it with a pile of hacks) but I didn’t even try the msys then, yikes, what a hassle.

                But one of the decisions I made with my library is it is invalid to use the Terminal structure on a pipe. So I force you to branch outside instead of trying to transparently handle both. I kinda regret that since so many programs can benefit from the “just-works” aspect, but it is true you can often do much better UIs if the application just branches at the top level.

                1. 2

                  I struggled with this as well, and made a post about it which may be of interest: https://www.devever.net/~hl/win32con

                  1. 1

                    Thanks! I like the SW_HIDE idea, though I went with a different approach. What I ended up doing is making a .bat file that calls the exe and just instructed the cli users to use that instead of the exe itself which worked cuz a bunch of cli users were used to a bat file anyway (it used to do chcp and when I looked at it, I told them about WriteConsoleW and that chcp hack is terrible, but the batch file was still there so that meant I could reuse it here).

                    Then the exe checks AttachConsole parent and reopens the stdin/out to it… as long as they were already null (otherwise they were previously redirected and don’t want to overwrite that).

                    Otherwise though the exe itself is the Windows subsystem so the double click and shortcut users just work without any flicker. Then the bat file causes the command shell to wait until the program finishes so the prompt and the output don’t stomp on each other.

                    The only thing kinda iffy to me is if the output is piped. For the console, WriteConsoleW with utf-16 just works. But for the pipe, I output utf-8 and it seems to be fine and the users happy with it, but I remain unsure what it should really do. I suspect Windows detects based on the first two bytes from some tests but I’m not fully happy with that part yet. Especially since you can pipe binary data too so idk.

              2. 1

                I’ve never used windows so I’ll take your word for it >.<

        3. 2

          Though, even if the fancy-looking output were a better user experience (usually, I don’t think it is), switching output is also not particularly friendly; now I need to think about two different formats when I use the tool.

      3. 3

        Breaking pipelines is not an issue if you have the coreutils implementations still around. e.g: use bat and exa for regular use but when you need pipes use cat and ls. I don’t see a problem with this and I think the concepts nushell (also mentioned in the article) brings to the table are definitely useful, albeit not implemented there for the first time.

        Edit: typo, crossed out extra word

        1. 3

          of course there are the standard tools.

          i just think it’s a bit weird to hype the console (look ma, no gui!) and at the same time ignore the concepts which make it so powerful. some of these tools are okish, but colorization and emulating guis are great at preventing the piping of output to further processing.

          tools and pipelines together are bigger than their parts.

          and i think that rust is overhyped, but that’s my problem ;)

          1. 29

            and at the same time ignore the concepts which make it so powerful

            If this were true, then we wouldn’t have used tty detection to change the output format to be compatible with pipelines.

            Which, by the way, is exactly what ls has been doing for years. Its default output format to a tty isn’t pipeline friendly. Maybe we’ve been learning from older tools more than you think.

            1. 3

              It can also be quite confusing to have the output change when you aren’t looking at it.

              1. 17

                It’s just funny when I see people raging about this for the Rust tools (it seems to happen quite frequently), but nobody thinks to do it for any of the older tools. And it’s even better when people use this as a launching point to accuse us of ignorance. Give me a break.

                1. 2

                  I think you will find people (suckless type people) do complain about gnu coreutils doing stuff like this. I admit it is useful and nice lots of the time too.

                  1. 5

                    BSD ls does it too. Or at least, the BSD ls command on my mac mini box does.

                    Does there exist a version of ls that doesn’t do this?

                    1. 7

                      The coffee is still kicking in but I don’t think Plan9 ls does this.

                      1. 4

                        Plan 9 ls doesn’t do anything different, and in fact, has no way to do it short of becoming a graphical program: there’s no concept of “terminal width”. You need to open the draw device and get the window size, look at the (possibly variable-width) font, and compute line widths.

                        There are also no control characters that handle cursor positioning, other than ‘\b’. The plan 9 console is a simple textual transcript. If you want graphics, use graphics – it doesn’t pop open a new window, so there’s no reason to support semigraphical features when full graphics are available.

                      2. 3

                        Correct – Plan 9 doesn’t. The philosophy behind it is discussed in cat -v Considered Harmful by Pike and Kernighan (page 5).

                        Though, to be technical, the Plan9 console isn’t really a tty, so Plan9 ls was designed for a slightly different environment. However, the design of ls (and the Plan 9 console) are rooted in this philosophy.

                    2. 4

                      I don’t think sbase ls does:

                      https://git.suckless.org/sbase/file/ls.c.html

                    3. 2

                      OpenBSD and NetBSD’s ls do not have this flag (-G).

                    4. 1

                      plan9 ls? ;)

                      it’s not that i dislike the functionality but how it’s implemented. i’d be interested in having such tools built in a composable way, out of multiple commands. default invocations could then be built via shellscript (and shipped together with the commands).

                      1. 9

                        And do you use Plan9’s version of ls to avoid the “nice” display output of the BSD and GNU versions of ls? If not, why do you use tools that ignore the concepts of composition?

                        it’s not that i dislike the functionality but how it’s implemented

                        That’s surprising, and even re-reading your comments above, I don’t really see this coming across. You seem to be ranting against the functionality itself and how it breaks pipelining. But we handle those cases by disabling the “nice” functionality so that pipelining works.

                        i’d be interested in having such tools built in a composable way

                        Easier said than done. Everyone worships at the altar of perfect composition (including myself), but actually achieving it is another matter entirely. It seems pretty unfair to hold that against us. It’s not unreasonable to think that in certain environments, composition, development effort and the user experience are at odds with one another. (Where the development effort to make some of these nicer features truly compositional is probably on the order of “build a new kind of shell environment and convince everyone to use it,” which really isn’t feasible.) People actually building these tools have to pick a design point in this space, and we get rewarded with people calling us ignorant. Go figure.

                        1. 1

                          And do you use Plan9’s version of ls to avoid the “nice” display output of the BSD and GNU versions of ls? If not, why do you use tools that ignore the concepts of composition?

                          i’ve tried to use the tools of plan9port by putting them in front of $PATH; unfortunately the assumptions about the environment are sufficiently different from linux that i’ve run into other problems.

                          That’s surprising, and even re-reading your comments above, I don’t really see this coming across. You seem to be ranting against the functionality itself and how it breaks pipelining. But we handle those cases by disabling the “nice” functionality so that pipelining works.

                          i dislike the complexity such tools introduce by putting together unrelated things like colorization and git handling (sorry, bat is the first thing in the list..). i dislike magic as i’ve put so much time into debugging it when it breaks.

                          Easier said than done. Everyone worships at the altar of perfect composition (including myself), but actually achieving it is another matter entirely.

                          indeed.

                          It seems pretty unfair to hold that against us. It’s not unreasonable to think that in certain environments, composition, development effort and the user experience are at odds with one another. (Where the development effort to make some of these nicer features truly compositional is probably on the order of “build a new kind of shell environment and convince everyone to use it,” which really isn’t feasible.) People actually building these tools have to pick a design point in this space, and we get rewarded with people calling us ignorant. Go figure.

                          i don’t know, you got rewarded with upvotes :) i’ve never called people ignorant, i’ve said that i find it weird to use an environment which lives by simplicity and then building things which combine several complicated tasks into one.

                          1. 8

                            i’ve never called people ignorant

                            You said:

                            i just think it’s a bit weird to hype the console (look ma, no gui!) and at the same time ignore the concepts which make it so powerful

                            Emphasis mine. Sure looks like you’re calling us ignorant to me.

                            It’d just be nice to see folks be consistent. If you’re going to get all cranky about new tools introducing features that aren’t compositional, then at least get cranky about all the old tools that do it too. Because then, just maybe, it’s not just a bunch of overhyped new-fangled nonsense, and maybe there’s actually something to it. The giants we stand on also made choices that sacrifice composition.

                            1. 1

                              Sure looks like you’re calling us ignorant to me.

                              i didn’t know that ignoring things, as in “conscious decision” is the same as being ignorant, but then, i’m no native speaker.

                              believe me, i’m cranky about most old (like the gnu stuff?) tools too. that’s why i’d like to see new tools which aren’t like the old ones.

                              the funny thing is that no one has thrown the old “then write it yourself” in my direction, which only would be fair.

                              1. 8

                                the funny thing is that no one has thrown the old “then write it yourself” in my direction, which only would be fair.

                                I try to avoid saying things like that in discussions such as these. I think it’s overused as a way to dismiss others’ concerns. And of course, most people are already aware that such a thing is possible and don’t need to be reminded of it. I might as well just say nothing at all at that point.

                                I just find your particular form of criticism to be popular yet shallow. People love to jump all over these new fangled tools as if their authors are blubbering idiots that totally ignored the entire history of computing or are blissfully unaware of the Unix philosophy. It’s a lazy critique because violating composition is nothing new. We’re just doubling down, people don’t like it and for some reason like to insult the authors of these tools or at least denigrate their work. It’s annoying. (It’s a pretty common tactic used in lazy online banter, and is used in all manner of scenarios. I think we’d all be a lot better off if we held ourselves to a higher standard of discourse.)

      4. 3

        Could you elaborate a bit more on breaking pipelines?

        1. 4

          Let’s assume you were used to using cat somefile | less (ignoring for a moment that you could just less somefile) and swapped in an alternate tool for cat, let’s say mycat. Now mycat has a bunch of cool features like colorized output or whizbang progress bars (again, ignoring that many find colored output or whizbang features distasteful….). When just using mycat somefile everything looks great! But then you pipe mycat somefile | less and get all kinds of garbage, if less even works at all!

          This is one way to break pipelining. The reason it happens is that mycat is probably using a bunch of TTY control codes, or ANSI color codes, to implement all those features. However, when that output is piped through to less, the pipe isn’t a TTY; it’s just a raw text stream. So those control codes don’t get interpreted and instead just appear on the screen in raw form.

          mycat can fix this by first checking if stdout is a TTY or not, and if so printing its control codes, but if not just printing raw text.

          There are other ways you can break pipelining, such as not handling SIGPIPE correctly (trying to write to a pipe whose reading end has been closed).

          Rust has facilities to handle these cases, and most common Rust applications handle these cases just fine. It is, however, a common footgun, since if you’re implementing a new tool and not familiar with how all this traditionally works, you may not know to check for these kinds of things.
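
          To illustrate the SIGPIPE case: Rust ignores SIGPIPE by default, so a reader closing the pipe (e.g. mycat big.txt | head) surfaces as an I/O error instead of killing the process. A minimal sketch (the `write_lines` helper is hypothetical, not from any of the tools discussed):

          ```rust
          use std::io::{self, ErrorKind, Write};

          // Write `n` numbered lines to any writer, propagating I/O errors
          // instead of panicking mid-pipeline.
          fn write_lines<W: Write>(out: &mut W, n: usize) -> io::Result<()> {
              for i in 0..n {
                  writeln!(out, "line {i}")?;
              }
              Ok(())
          }

          fn main() {
              let stdout = io::stdout();
              let mut out = stdout.lock();
              if let Err(e) = write_lines(&mut out, 3) {
                  // A closed read end shows up here as BrokenPipe; exit
                  // quietly, the way traditional Unix tools do on SIGPIPE.
                  if e.kind() == ErrorKind::BrokenPipe {
                      std::process::exit(0);
                  }
                  eprintln!("write error: {e}");
                  std::process::exit(1);
              }
          }
          ```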