With respect to “Every computer is different”, chezmoi has explicit support for this: you can use templates to handle small differences (e.g. tweak your gitconfig for home or work machines, or tweak your .zshrc for Linux or macOS), and include or exclude dotfiles completely for large differences (e.g. only include a PowerShell config on Windows). See this page in the user guide for more details.
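For the curious, the templates are Go text/template with chezmoi-provided variables; a minimal sketch of the gitconfig case (the hostname and email addresses are placeholders, the `.chezmoi.*` variables are documented):

```
# ~/.local/share/chezmoi/dot_gitconfig.tmpl
[user]
    name = Your Name
{{- if eq .chezmoi.hostname "work-laptop" }}
    email = work@example.com
{{- else }}
    email = personal@example.com
{{- end }}
```

chezmoi renders this to ~/.gitconfig on each machine, so you get the right email everywhere without maintaining two copies.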
Disclaimer: I’m the author of chezmoi.
Thank you for making chezmoi, I use it on several Mac and Linux machines and it works great. Much easier than trying to manage my dotfiles in Ansible, which was my previous plan.
Perhaps the killer app for me is that if there’s some weird hack I do to get something working on a given machine, I can document it in code e.g. in my setup scripts, so that if I have to (god forbid) wipe the machine and start over, I can just run the hacks again instead of trying to remember what the heck I did last time. This happens with depressing frequency because our entire field is bad at what we do.
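chezmoi's run_once_ scripts are a nice fit for exactly this; a hypothetical sketch (the filename and the hack itself are invented for illustration):

```shell
#!/bin/sh
# run_once_apply-weird-hacks.sh: chezmoi executes run_once_* scripts
# once per machine, so the hack is documented and replayable.
set -eu
marker="$HOME/.profile.local"
# Idempotently record the "weird hack" (a made-up ulimit tweak here)
grep -qs 'ulimit -n 4096' "$marker" || echo 'ulimit -n 4096' >> "$marker"
```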
If anyone wants to see the ungodly mess I’ve made of my dotfiles, here’s my chezmoi repo.
Wow, thanks for sharing your project, I had never heard of it. Between the template support and the thorough documentation, I think I may have to give this a try.
I’m currently running with the “$HOME is a git repo” method, which has been working well but obviously has at least one major footgun to avoid, and doesn’t really work well with having multiple systems that require different config/options.
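For reference, the footgun-prone variant I mean is the bare-repo trick; a sketch (not necessarily your exact setup):

```shell
# Bare repo in ~/.dotfiles, with $HOME as the work tree
git init --bare "$HOME/.dotfiles"
dots() { git --git-dir="$HOME/.dotfiles" --work-tree="$HOME" "$@"; }
# Without this, `dots status` lists every untracked file under $HOME
dots config status.showUntrackedFiles no
dots status
```

The major footgun: pointing a destructive command (checkout, clean) at the wrong repo while sitting in $HOME can clobber files wholesale.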
I recently started a new job and for the first time in years decided to split my personal and work stuff onto two separate computers, so I’ll want some configuration that follows me between machines. This also gave me a chance to rethink my shell environment generally (e.g. moving from bash to zsh).
Choosing chezmoi was one of my first decisions and it’s been great. Thank you for your work on this project!
I never really got into dotfiles. My problem is:
Pulling dotfiles onto a machine can be simple. Installing chezmoi is a one-line command that only requires a shell and curl/wget, nothing else. You don’t need to install a scripting language runtime, or even git (chezmoi has a built-in git client using go-git if no git binary is present on the machine).
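Concretely, the documented bootstrap looks like this (needs network access; the GitHub username is a placeholder):

```
sh -c "$(curl -fsLS get.chezmoi.io)" -- init --apply yourgithubusername
```

That downloads chezmoi, clones your dotfiles repo, and applies it in one go.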
Even if you rarely configure a primary system, I’m sure that you use a version control system for your code. Using any dotfile manager (there are many to choose from) brings the benefit of version control to your dotfiles, including easy rollback/undo (great for experimenting) and backups.
That would be nice if the platform I work on supported Go :) In all seriousness, while git is likely installed, the fact of the matter is it’d be ridiculous to install my configuration environment onto a client’s machine, let alone onto an account on their own system.
I don’t really change my configuration or deviate much from defaults either. It’s massive overhead for like what, 5 lines of .vimrc? Not to mention that for e.g. GUI stuff, the config files in the XDG dir are possibly fragile across versions and not worth including in a scheme.
if the platform I work on supported Go
What platform is it?
I’m in a pretty similar boat. I could use VS Code with a vi keybindings plugin, which is pretty great, or I could get better at vim/neovim, which is installed on literally every machine I have to work with, which often have to get worked on remotely or ad-hoc or through weird VPNs. And I’m now even the person who writes the default .vimrc file, as long as other people are happy with it.
The fish shell is worth it though. I wish I could install fish as default on all our work machines, but our infrastructure involves too many terrible bash scripts and people keep adding more. (Don’t say “use fish for interactive use and bash for scripting”; there are (bad) reasons we can’t do that.)
Well, at least you can use VSC on the local system and usually use it as the frontend for a remote system, almost like what the TRAMP people on Emacs do. (I’ve never had good luck with TRAMP, but the VSC remote editor works pretty well the few times I’ve tried it… except none of my remote servers are Linux 🙄)
Yeah I’ve used VSC a few times. It’s pretty great, but again, I’m often doing this on a generic service-terminal-we-provide-with-our-hardware Linux laptop or something of that nature. I haven’t yet had to hack a client’s workstation or a hotel courtesy PC to fix something, but I could see it happening.
Curious what issues you’ve had with TRAMP.
Two issues from my list:
I can’t help with the first.
For me, sudo is :sudo-edit
Ahhh interesting, thanks. I’ll experiment immediately to see whether that also covers org remote execution.
After going through some of these, I wanna share pgrep(1)/pkill(1) with folks: https://www.man7.org/linux/man-pages/man1/pgrep.1.html
I learned about pgrep(1) and pkill(1) super late, and they made my life a lot easier. They’re available on macOS, on Linux (as part of procps-ng, almost always installed by default), and on the BSDs, and they’re also provided by busybox.
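If you haven’t used them, a quick self-contained sketch (procps-style options; the background sleep is just a throwaway target):

```shell
# Start a throwaway process, find it, then kill it.
sleep 300 &
pgrep -a -P $$ sleep   # -a shows the full command line; -P $$ limits
                       # matches to children of this shell
pkill -P $$ sleep      # kill only our child, not every sleep on the box
wait || true           # reap the (terminated) background job
# pgrep -af <pattern> / pkill -f <pattern> match against the full
# command line instead of just the process name.
```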
home-manager on Nix has got me up and running relatively quickly on many machines already.
A huge boon for me has been using direnv.
You can set it up for most tools, and then suddenly you can have your $PATH just be right within certain directories. You can tweak little details about your environment, and they get cleaned up afterwards outside those directories.
Way less weird breakage than other tools in my experience, and it has led to me being able to really just embrace 12-factor.
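The core of it is just an .envrc per project; a minimal sketch (PATH_add is from direnv’s stdlib; the paths and variables here are placeholders):

```
# .envrc: direnv loads this when you cd into the directory
# and unloads it when you leave
PATH_add bin                                   # prepend ./bin to $PATH
export DATABASE_URL=postgres://localhost/dev   # 12-factor style env config
```

Run `direnv allow` once in the directory to trust the file.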
A bit of an unfortunate thing though is that I have been moving more and more away from xonsh. I really like writing Python/subprocess snippets, but I have just hated having to deal with tools that won’t play along, and ultimately I can write shebang scripts and basically get what I want.
I’ll never really be satisfied with bash scripting, and I can’t even say “at least it works” cuz it doesn’t, but I can do one thing half well pretty quickly.
Nice job! I agree with a lot of what you’re calling out. This is a great example of the result when someone continuously improves their daily work. Thanks for the honest and transparent take on how you got here after 10 years.
Bonus that you’re using stow too, I love that tool.
A nice list of tools, but the custom “running” tool could be replaced with pgrep -af <proc-name>, and “murder” with pkill <proc-name>. Both pgrep and pkill should be available on most Linux and Mac machines.
pgrep -af <proc-name>
Like the author, I’ve built up some aliases over the years (I see the first commit was early 2014). It’s always fun to discover blind spots when you have to polyfill a script to share it (nix can help with this sometimes).
My dotfiles are something of an aesthetic endeavour. They’re a place where I can make whatever hacky helper I like, a reminder of computers as an outlet for whatever I feel like (following up on curious thoughts, being able to do something that might feel nonsensical).
The end result is often delightfully horrifically hacky. It’s home, broken in the ways I expect, instead of in ways that surprise me.
This is really interesting to read; my setup feels against the grain here (and I’m going to need to try some of these…). I split up my dotfiles where possible into multiple sourced files (e.g. zsh, tmux, vim, …). Each of these has a separate repo, so their histories are localized.
I find that many programs let you use conditionals for different environments, and that is how I handle different OS types (mostly keyed off uname output).
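A sketch of what that looks like in a sourced shell file (the aliases are just examples):

```shell
# Branch on the kernel name reported by uname(1)
case "$(uname -s)" in
  Darwin) alias ls='ls -G' ;;           # BSD/macOS colour flag
  Linux)  alias ls='ls --color=auto' ;; # GNU coreutils colour flag
esac
```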
I wrote a little shell script that parallelizes updating these repos (if they are present) with xargs. At home, Saltstack drives my setup (linux, bsd, win, darwin). Work is much more restrictive, so I have a shell script driving the initial setup there.
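The xargs part can be sketched like this (the function name and the one-path-per-line list format are my own invention, not necessarily the actual script):

```shell
# Pull each repo listed in a file, four at a time, skipping paths
# that aren't checked out on this machine.
update_repos() {
  xargs -n 1 -P 4 sh -c '[ -d "$0/.git" ] || exit 0
    git -C "$0" pull --ff-only' < "$1"
}
```

Each line of the list file becomes $0 of a small sh -c script, and -P 4 runs up to four pulls in parallel.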