I have a desktop and a laptop and I’d like to keep some code in sync between the two. How do you handle this?
I use git, but I feel bad when I do a lame commit just to pick it up on another computer.
That’s what a WIP branch is for!
+1 on the guilty feelings about doing a commit just to move to another computer.
You can directly use git between two computers with ssh access between them without pushing to your origin.
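A minimal sketch of that direct-peer setup (hostname, user, and path are made up for illustration):

```shell
# on the laptop: add the desktop as a remote over ssh
git remote add desktop ssh://me@desktop.local/home/me/src/project

# grab the desktop's work without anything touching origin
git fetch desktop
git merge desktop/main
```

Note that pushing into a branch that's checked out on the other machine is refused by default (`receive.denyCurrentBranch`), so fetching/pulling from each side is the smoother direction.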
Another vote for syncthing.
It gives you a nice degree of control of what to sync (.stignore) and it’s reliable.
It’s also pretty lightweight (compared to, say Dropbox) and fairly good at resolving conflicts.
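For reference, a small .stignore sketch (the patterns are just examples; first match wins, so `!` exceptions go before the broader patterns they override):

```
// keep this one file even though *.log is ignored below
!build/important.log
// (?d) lets Syncthing delete these if they'd block removing a directory
(?d)node_modules
(?d).venv
*.log
*.pyc
```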
How I use git:
I do this for ALL changes and it works well. I have a shell script with 6 lines in it that automates some of it.
It’s annoying to forget to push as well. One nice thing about rebasing is you can clean that kind of stuff up later on, but it does feel a bit like “I am having to do this for incidental reasons”
Given how the internet is nowadays, I bet a lot of people could get away with working via NFS or something for their code
I use Syncthing, too. I love it because it’s super reliable and easy to set up. I have to sync some stuff between Linux, macOS and Windows machines and I just can’t describe how happy I was the day I found out about Syncthing and could forget all the little gimmicks and flags that Samba needs so that files named “like this, brø” don’t end up called “ï»¿Ø§Ù„Ø¥Ø¹Ù„Ø§Ù† Ø§ÙØ¹Ø§Ù„” when I copy them.
Fingers crossed I’m not summoning little Bobby Tables here…
I use Syncthing for basically everything other than code, where I use the git-based workflow that I mentioned elsewhere in this thread. I don’t think I have any principled reason for this, other than expecting to work with code using git-based tools and not use those with other sorts of files.
Why? Committing every time you have any reason to of any kind is good! There’s a reason git commits aren’t forever and you can amend or rebase or reset later.
Just going to drop this here in case it interests anyone https://github.com/Emiller88/dotfiles/blob/master/modules/services/syncthing.nix
git or rsync. Or, if I don’t have either of those two on the target for whatever reason: git archive HEAD | ssh foo@somehost tar x
I don’t think committing partial work is a bad thing. Commit early, commit often. I’m uncomfortable with it because of years of socialization, but I know it’s the proper attitude. Yeah, I’m also the guy who thinks “fix fuckup” is a fine commit message, though maybe not in published repos. Or maybe it is? I think being commit-shy and wanting to have the oh-so-perfect jewel of a commit has caused me to lose work over the years more often than not.
I don’t think committing partial work is a bad thing
It also isn’t a good thing. But then there’s branches, and this is a good reason to use them.
I moved my development environment into a VM on a server and exposed it via ssh. I do all my development remotely, whether on my laptop or desktop, inside VS Code using the Remote SSH extension. No code to sync!
The VM is only exposed on a private ZeroTier network, so I can even open ports if I don’t want to use ssh port forwarding for some reason. This works VERY well and I’m happy with it; the only thing that isn’t perfect is the wake-from-sleep reconnect, which can take a while.
I used to just have everything in git but that was a nightmare.
I do something very similar. My code lives in a dev VM. Whether that’s on the computer in front of me or a remote one doesn’t matter much, except that it means I’m always building on the 10-core Xeon, even when I’m using my laptop.
At some point, I will probably move a lot of this to cloud VMs, because there’s no real advantage in having it local half the time and so there’s no point having a powerful desktop in my office.
Do you have any special setup to handle secrets (ssh keys, tokens, etc)?
Nothing other than ssh-agent.
I have done something similar for a while: just ssh’ing into a server and using a terminal editor like vim/kakoune/etc. This worked better than expected, but you really need a stable internet connection for it. Sometimes I work while traveling by train, and the occasional lag was bad enough that I switched back to syncing with git.
I use git. I’m not perfectly happy with it, but it seems like the best compromise.
Oh, it’s OCaml and GPL3. That’s neat.
That’s also what I use! I use it in a star topology where my desktop/laptops/etc. each sync with a cloud VPS.
Unison all the way. Has worked brilliantly for years. 600,000+ files totalling over 100GiB
90% of the time: git commit on a WIP branch (100% if it’s a private repo).
The other 10%, when for some reason I really don’t want it to be public: rsync.
But as I saw this later:
but because I’m switching between the two in the same house
Very often I just have the code on my home server and then I edit on that machine anyway… be it via ssh or VS Code remote, so the answer would be: I don’t keep it in sync between commits.
git doesn’t require or encourage anything to be public and will work basically anywhere rsync will.
Yes, but that’s not the point. This was not a recommendation, but a description of a workflow.
I use Sourcehut myself; specifically, I pay for an account on git.sr.ht.
On the very rare occasion I’ve wanted to work on code on two different machines without wanting to commit to Git, I’ll just SSH from one to the other.
Instead of syncing pain, I’ve replaced my desktop with a laptop and a thunderbolt dock.
I run a personal Gitea instance with most of my personal projects code running on a very cheap VPS. So I just clone repos I’m working on from that instance and git push/git pull as needed.
Syncthing, with some filtering to exclude files that include local paths (e.g. virtual envs, PyCharm config files).
In my case, I also sync to a NAS, which is always on, so the desktop and laptop don’t have to be on at the same time. This is useful if I work on the desktop then I have to leave home, because I can turn off the desktop and then turn on the laptop at my destination and I will still get the updates.
I had this question on IRC just last week for the exact same reasons.
syncthing seems to be what solves this issue, but I’ll tell you what’s been working great for me: sshfs. Give it a try.
Filesystem-level git push/pull to a directory managed by syncthing (previous dropbox).
Edit: I don’t work in that dir, it’s a --bare repo used only to push/pull to/from.
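A sketch of that layout (paths and branch name are made up):

```shell
# one-time setup: a bare repo inside the Syncthing-managed folder
git init --bare ~/Sync/project.git

# in the working repo, which lives outside the synced folder
git remote add sync ~/Sync/project.git
git push sync main     # Syncthing then replicates the bare repo's files
git pull sync main     # run this on the other machine to pick the work up
```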
Same setup here. I’ve heard arguments against it, so take care not to push conflicting changes into the bare repo from both machines at once.
Git. GitLab running on Docker Compose on openSUSE Tumbleweed on Hyper-V on my workstation. VM stays off, usually.
I add a /etc/hosts line so gitlab.mydomain.com maps to 192.168.&c to make that part easy. SSH key setup isn’t too bad, once you remember how to do it :P .
Similar setup: I have a Windows computer and a Mac laptop and just use Git.
I was using a remote desktop app called Parsec for a while and that worked well but now it doesn’t work for some reason related to the network…
And for non-code things I use Creative Cloud’s synced directory feature.
Dropbox, all day every day.
I also use Git, but since it’s a distributed version control system I just add all the machines I care about as remotes and push / pull whenever appropriate.
eg, if I’m refactoring my NixOS config and want to test the changes on some machine other than the one I’m writing the code on, I’ll commit the work-in-progress and push directly to that machine. When I’ve done all the testing I want, I’ll clean up the history, push to a branch on GitHub, open a PR, do a final sanity-check review, and then merge it.
Git. Either via the project’s repo, or an additional remote that connects directly to the other machine by ssh.
Fossil. The built in UI and simplified commit workflow are great for small personal use. While the git add / commit / push dance may have value for composing clean commits on a large repo, it’s really annoying for my own stuff. Fossil commit does it all in one go (with autosync on, the default).
I also have systemd timers that run fossil update on all my servers hourly.
I have these aliases:
alias fdc='f diff | colordiff | less -R'
alias ftl='f timeline -F "%n[%d] %c%n" -n 50 -t ci -v | less'
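An hourly timer like that could be sketched as a pair of systemd units (unit names and the checkout path here are assumptions, not the author's actual setup):

```ini
# fossil-update.service
[Unit]
Description=Update the local fossil checkout

[Service]
Type=oneshot
WorkingDirectory=/srv/checkout
ExecStart=/usr/bin/fossil update

# fossil-update.timer (a separate file alongside the service)
[Timer]
OnCalendar=hourly
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now fossil-update.timer`.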
I use fossil too!
I do suspect that outside the lobste.rs demographic most people just use the built-in ‘cloud drive’ from their operating system. (Some use Dropbox or g drive - but you wouldn’t recommend them)
Dropbox. I probably shouldn’t be putting git repos in there.
Dropbox does get confused. When it does, it makes a copy of the directory it’s confused about, puts a date on it, and says which machine it’s from. It does generally work, though, and I’ve not lost anything; I just sometimes had to figure out myself which version I wanted. I stopped using it to sync git repos that I actively use, though!
Use git. In-progress commits are fine. If you’re worried about the commit history, you can always squash it after the fact.
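For example, a non-interactive way to squash (the commit count and message are placeholders):

```shell
# collapse the last three WIP commits into one
git reset --soft HEAD~3
git commit -m "one tidy commit"
# or, interactively: git rebase -i HEAD~3
```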
If it’s actual code and not some super secret personal files, just get a free private repo on for example GitLab (no affiliation).
This would work, but because I’m switching between the two in the same house fairly often, pushing and pulling git repos is a bit more ceremony than I’d like.
No need to push, set your computers up as remotes for each other and just pull
I’ll have to try this out!
How often are we talking? I have the same setup and I don’t find it too bothersome. It probably takes about 30s each to push and pull; plus I get the added bonus of knowing my changes are synced with a remote too (sometimes small justifications like these are enough to trick me into not complaining).
If I may suggest a kind of out-there alternate: you can ssh into the desktop. This was born out of necessity for me since my laptop is old and I was building a Big Repo. On a local network I don’t notice any lag, although there is the added caveat of ensuring your editor plays nicely with ssh.
The frequency can be random and switches must sometimes be done quickly to avoid children banging on my keyboard 😂
I run a Gitea instance on a NixOS VPS.
git add .
git commit -m "tmp"
git push
# then, on the other machine:
git fetch
git checkout origin/branch
git reset HEAD^
Just make sure you also create a bare repo on the laptop or desktop. Then push to or pull from the bare repo from either machine. Probably put the bare repo on the desktop; if you’re working on the desktop, you don’t have to keep your laptop on if you don’t want to.
Code on git, everything else that needs syncing in Dropbox (e.g. paperwork, org-mode files, work docs, papers). The only weird middle ground for me is dotfiles: I use chezmoi but am not too happy with it; I use it so rarely that I never remember what to do, so most of the time I just cp/ln from Dropbox.
PS: Beorg, and sometimes organice, for org on the phone.
I have a lil home server which (among other things) hosts NFS, which I keep most of my code, documents, music etc on. If I’m away from home I’ll either just copy stuff to/from the NFS mount ahead-of-time or VPN into my home network and continue using the NFS, depending on whether I’m expecting to have decent internet access.
rsync for local machines and rclone for cloud storage like Google Drive or Mega.
For me, it depends what I am synchronizing:
I used to use Magic Wormhole, but Croc had some features (directory transfer and resumable transfers) that sold me on it, although it doesn’t preserve timestamps (I believe there’s a GitHub issue open for this), so I’ve pretty much switched over. I keep wormhole for the ssh invite feature, which croc doesn’t have.
Gitea (locally), GitHub/Lab, and if it’s not git, I try to use rsync or a cloud file store (Gdrive, S3, etc.)
I used to have two machines, with the intent of doing everything on the desktop Linux box, locally or from remote, but of course it never really worked out that way and I ended up having code/projects on both machines.
I learned a long time ago, one machine for everything. I got a new laptop (and repurposed the desktop to be a SteamOS machine) and am using it for everything. When I want that desktop experience I just dock it.
I used to use various additional tools for syncing between machines on top of whatever VCS I was using at the time. They mostly worked, but introduced complexity that eroded my confidence in my workflow. For example, I occasionally ran into conflicts when I tried to edit files in the midst of a big sync. One day it dawned on me that I was using two tools with mostly overlapping purposes on the same files.
I’ve long since gotten into the habit of pushing WIP commits to feature branches and squashing them before opening a PR or merging. One less piece of software.
Unison. I tried syncthing, but it wasn’t able to handle the workspace I had at my previous job. My current one… maybe; I’m working with way less code now.
I use unison to sync most files nightly. I also have a script https://github.com/nolanl/rp to sync things in a given git repo while I’m working, so I can use my slow laptop to edit files locally, and then run builds and tests on my much faster desktop. Works pretty well.
It would be cool if there was an easy way to send your git stash to another machine.
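There sort of is: a stash entry is just a commit under `refs/stash`, so you can push it to a ref and apply it on the other side (the remote and branch names here are made up):

```shell
# on machine A: publish the newest stash entry as a branch
git push origin 'stash@{0}':refs/heads/stash-transfer

# on machine B: fetch it and apply it like a stash
git fetch origin stash-transfer
git stash apply FETCH_HEAD
```

`git stash apply` accepts any commit that looks like one created by `git stash`, which is what makes this work.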
It’s for precisely this reason that I favor nested branches:
git checkout -b [feature]/[what I might otherwise put in a short stash message]
…over stashes. It leaves open the option to push whereas stashes do not.
I don’t think I’ve ever used stash messages. I only use stash/pop when I’m fixing something.
I have a Raspberry Pi that I use for all my sync needs.
Sync code? Git, of course.
For anything non versioned, rsync.
And for backups, b0rg.
I use a thumb drive for my home dir