Adding remote ssh access isn’t any more difficult. It just works. user@host:git-repo.git accesses a git repo in the (home of user)/git-repo.git directory.
And with a remote ssh repo the chance of losing work/commits goes to basically zero. (unless you don’t make backups, obviously)
Syncthing + Git seem a little bit overlapping, don’t you think? Instead I would just create a git account and use it via SSH.
You don’t even need a “git” user à la gitolite if you only need one user. You can ssh as yourself and it all works.
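A minimal sketch of that single-user setup (the hostname `host` and the repo name are placeholders):

```shell
# On the remote machine: create a bare repository in your home directory.
ssh user@host 'git init --bare git-repo.git'

# Locally: add it as a remote and push. The path after the colon
# is relative to the remote user's home directory.
git remote add origin user@host:git-repo.git
git push -u origin main
```

No server-side software beyond sshd and git is involved.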
while i also have remotes i access via ssh, that method depends on being able to reach a device via ssh. what if it’s offline?
syncthing provides local copies everywhere. there are definitely big caveats to that, but i think it’s an advantage overall.
Maybe I’m confused, but isn’t the point of Git being distributed that you have a copy of the data even if the remote is down? You shouldn’t need access to the remote to access your data.
you are not confused.
i guess if the remote is down and workstation a is not up-to-date, you could pull from workstation b if it is up-to-date.
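for example (hostname and path are made up), pulling straight from the other workstation instead of the shared remote:

```shell
# hypothetical hostname/path; workstation-b just needs sshd and git.
git remote add wsb user@workstation-b:projects/foo.git
git fetch wsb
git merge wsb/main
```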
maybe i just get use out of this because i am lazy. but i find it comforting to know that there’s this directory being synced, that is not my working directory, and it lives in many places and is hard to destroy.
That makes sense. I might just enable SSH on every machine and push to all of the others when I have a change, but your solution certainly works (and is probably more automated, and handles downtime more gracefully). It’s nice that there’s a variety of ways to achieve this with different trade-offs.
If you’re working with trusted staff (i.e. you don’t need to prevent malicious/stupid acts) you don’t need a git user account for collaboration either. Filesystem ACLs solve the ‘files created with the wrong permissions’ issue.
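One way to sketch that (the `devs` group name and path are made up; `--shared=group` makes git itself keep objects group-writable, while the default ACLs cover anything created outside git, regardless of umask):

```shell
# Bare repo that git keeps group-readable/writable (assumes a 'devs' group).
git init --bare --shared=group /srv/git/project.git
chgrp -R devs /srv/git/project.git

# Default ACLs so files created later inherit group rw as well.
setfacl -R -m g:devs:rwX /srv/git/project.git
setfacl -R -d -m g:devs:rwX /srv/git/project.git
```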
I’ve been a bit busy and haven’t had as much time as I’d like to clean it up, but I’ve got a reasonably tiny, fairly easy to use git remote called gitdir with no deps beyond its own binary and a git binary.
If anyone’s worried about gitolite (or similar solutions) because it needs to hook into your main ssh server, or about hosting git repos in Dropbox because it’s unknown how that interacts with syncing (I believe #git on freenode has a warning about this exact scenario), this may be a good alternative option.
Looked at mutagen at all? https://mutagen.io/
This is appealing largely due to the low buy-in.
I have a set of repos on my laptop. I have a set on my desktop. They are different sets with some intersection. I have no discoverability from one into the repos on the other. I only have backups or duplicated data if I make it explicit.
My desktop and laptop share a folder. It’s even shared to my NAS (with different retention policies). Instead of building the right git hosting, I might as well use this mechanism and solve all my problems. Now I don’t have to try and set up my git repos, they are already there for me.
Hmm, slightly related:
Does anyone have something preconfigured to just pull all git repos from a github/bitbucket user/org for backup? Especially the “grab all repo urls via api” part, of course the rest is just a loop. But maybe there’s some dedicated small tool that’s better than a bash script?
For pulling all repos from a GitHub organization, there’s no easy built-in way, but last time I needed to, this blog post helped me out. IIRC GitHub has changed their per-page repo limit to ~100 since that was written, so double check you don’t miss any of them.
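The “grab all repo urls via api” part can be a short loop. A sketch assuming curl and jq, a placeholder org name, and GitHub’s 100-per-page maximum:

```shell
org=myorg   # placeholder org name
page=1
while :; do
  urls=$(curl -s "https://api.github.com/orgs/$org/repos?per_page=100&page=$page" \
    | jq -r '.[].ssh_url')
  [ -z "$urls" ] && break
  for u in $urls; do
    dir=$(basename "$u")                       # e.g. repo.git
    git clone --mirror "$u" "$dir" 2>/dev/null || git -C "$dir" fetch --prune
  done
  page=$((page + 1))
done
```

Private repos additionally need an auth token header on the curl call.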
It would be nice if there were a more universal utility that did this for any Git server, but you’d need to implement each service’s API, since I don’t think there’s any way to get this info from within Git itself. The universal way would be to allow a real shell over SSH (so you could ls the home directory that holds all the Git bare repos), which no provider is going to allow for security reasons.
Once you have the list of repositories you need, you can use myrepos to keep them all in sync. Bonus points because myrepos supports VCSes other than Git.
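A minimal `.mrconfig` sketch (section paths, hostnames, and URLs are all placeholders; sections are checkout directories relative to the config file, and the second entry just illustrates the non-Git support):

```
[src/dotfiles]
checkout = git clone 'git@host:dotfiles.git' 'dotfiles'

[src/project]
checkout = hg clone 'https://example.com/project' 'project'
```

After registering repos like this, `mr update` pulls them all in one go.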