1. 7

    The Essence of Backups in Linux (linux, pljung.de)

  2. 11

    This is terrible advice.

    If you want to go DIY, look into snapshots with rsync and hard links. It’ll be much quicker and take much less space. The cp approach in the article duplicates everything, every time.
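
    A minimal sketch of that approach, assuming the source and destination paths are only placeholders:

    ```sh
    # Hard-link snapshots with rsync (illustrative paths).
    # Unchanged files are hard-linked against the previous snapshot,
    # so each new snapshot only costs space for files that changed.
    SRC=/home/user/
    DEST=/mnt/backup
    NEW="$DEST/$(date +%Y-%m-%d_%H%M)"
    LAST="$DEST/latest"

    rsync -a --delete --link-dest="$LAST" "$SRC" "$NEW"
    ln -sfn "$NEW" "$LAST"   # point "latest" at the snapshot just made
    ```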

    Not to be too mean, but I’m not likely to take ops advice from someone who flips the table and blows away their Linux install twice a year.

    1. 7

      Slightly less DIY: https://rsnapshot.org/

      1. 2

        Not to be too mean, but I’m not likely to take ops advice from someone who flips the table and blows away their Linux install twice a year.

        My reading of the OP wasn’t that he was intentionally blowing it away, but that some part of the upgrade process inevitably went wrong (ignoring which direction the blame fingers might point), resulting in the need for a fresh start. That’s why I make backups!

        If, on the other hand, he’s intentionally starting off with a clean slate twice a year, that’s just “modern”. There was a time when a machine’s uptime was something to brag about, but these days it’s become much less common to update a system in place; best practice is to just roll out a new version and discard the old.

        1. 4

          If, on the other hand, he’s intentionally starting off with a clean slate twice a year, that’s just “modern”. There was a time when a machine’s uptime was something to brag about, but these days it’s become much less common to update a system in place; best practice is to just roll out a new version and discard the old.

          Uh, I have been using the same Arch Linux install since 2010 on my desktop. It has survived 4 moves to completely different systems, and was originally installed on a spinning-rust SATA drive but now lives on NVMe. Having to reinstall the OS for updates is not ‘modern’, it’s ‘archaic’ at best.

        2. 1

          This is terrible advice. If you want to go DIY, look into snapshots with rsync and hard links. It’ll be much quicker and take much less space. The cp approach in the article duplicates everything, every time.

          In the paragraph below I discuss why using cp is a poor solution. I advise readers to use Borg.

        3. 4

          The Tao Of Backup comes to mind:

          The novice asked the backup master which files he should backup. The master said: “Even as a shepherd watches over all the sheep in his flock, and the lioness watches over all her cubs, so must you backup every file in your care, no matter how lowly. For even the smallest file can take days to recreate.”

          The novice said: “I will save my working files, but not my system and application files, as they can always be reinstalled from their distribution disks.”

          The master made no reply.

          The next day, the novice’s disk crashed. Three days later, the novice was still reinstalling software.

          I’ve burned myself a few times making assumptions about what parts of my filesystem I really need backups of. In the end, simply backing up almost everything has left me with considerable peace of mind. I still exclude some things from my backups (think /var/cache, /home/*/.cache and the like), but between blacklisting and whitelisting, the former seems preferable for backups. I’d rather have a few gigabytes of extraneous data in my backups than lose even a kilobyte of important data.
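
          With Borg (which the article recommends), a blacklist approach looks roughly like this; the repository path and exclude patterns are just examples:

          ```sh
          # Back up (almost) everything, blacklisting the caches.
          # Repository path and archive name are placeholders.
          borg create --stats \
              --exclude '/var/cache' \
              --exclude '/var/tmp' \
              --exclude '/home/*/.cache' \
              /mnt/backup/repo::'{hostname}-{now}' \
              /
          ```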

          1. 4

            About every 5 months I completely break my operating system.

            Wow. I haven’t managed to do that since Windows 95.

            In general, I’d use rsync over cp and this should be good enough already. I use backup2l for all my servers, and have never had problems restoring.

            To the author: Why not put /home on a different partition and reuse that when reinstalling? Not that backups aren’t useful, but this would at least take some of the burden off.

            1. 3

              About every 5 months I completely break my operating system.

              Indeed, I was surprised. But then I read the footnote mentioning that OP uses a lot of external repos, which in my experience often don’t get scrutinized at all.

              Personally, I’ve had the same operating system (Fedora) for years, and I systematically upgrade. I’ve never needed to reinstall it.

              It’s true, however, that I keep it minimal and rely almost exclusively on terminal-based applications, so perhaps it’s just the simplicity of my setup.

              1. 1

                I’m not 100% sold on “upgrade all the time” but my x230 hasn’t been reinstalled, only upgraded with Debian since 2013. My other ThinkPad as well, but since early 2017. My current work ThinkPad I think I reinstalled when switching from Ubuntu 16.04 to Kubuntu 18.04.

                Also, what is “a lot”? I used to have at least 5-10 external apt repos and not once did I see one break anything, but it was mostly end-user software like Spotify or Firefox and not something deeply integrated into the system. Maybe I’m just not doing weird things, but I really find it very hard to break a Linux install these days.

              2. 1

                To the author: Why not put /home on a different partition and reuse that when reinstalling?

                I do not keep /home on a different partition, but I like the idea. As you mentioned, it doesn’t replace a backup, but it saves me a few minutes of copying files.
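
                For anyone curious, a separate /home is essentially one extra partition plus one line in /etc/fstab; the UUID and filesystem type below are placeholders:

                ```
                # Hypothetical /etc/fstab entry for a separate /home partition.
                # Replace the UUID with the real one reported by `blkid`.
                UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  ext4  defaults  0  2
                ```

                At reinstall time you recreate the other partitions and leave that one untouched.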

                1. 1

                  It happened to me when I used an application Ubuntu didn’t have a clean package for. There’s a built-in software manager, apt-get for stuff it doesn’t have, some things with different versions, and some apps that do their own custom setup. In some cases, stepping too far away from the first two options, or from the second with a current version, can just clobber the system to the point that all sorts of things fail. I just re-installed.

                  It was using Windows and Linux that sold me on OpenVMS’s concept of a versioned filesystem, at least for apps or important stuff. Today, the Nix concept of just fixing package management once and for all… with rollbacks just in case… looks like the best option. I haven’t moved yet, since these days I like OSes I don’t have to tinker with and which have tons of software ready to go. Idk where Nix setups are on that. I just do one or two installs a year, they’re fast, and I can use my smartphone instead of waiting on them. (shrugs)

                2. 3

                  My sister asked about backups, and I thought I’d share here my recommended solution, for basically everyone, because it’s hard to get wrong.

                  Buy 2 drives the same size as the drive you want to back up. I.e., if the drive you care about is 1TB, buy two 1TB external drives.

                  1. Every day, do a backup of your computer to one of the external drives; this is your daily copy.

                  2. Every month, take the daily copy offsite (to a friend, to the office, to a safe deposit box, whatever); this is your offsite copy.

                  3. Every month, bring the drive that was offsite back home to be your new daily backup drive.

                  4. Repeat step 1.

                  The one offsite is in case your house goes away, and is a backup of your backup.

                  If you don’t care about daily backups, you can make this a weekly thing. whatever.

                  If you are very paranoid and need backups more often, then you want to buy 3 identical computers: 1 for use, and 2 for backups. Have software like rsync (https://rsync.samba.org/) copy your data across to the 2 backup machines every 5 minutes. 1 will be at home, 1 will be offsite, and 1 will be your daily use machine. It’s the more expensive, more complete version of the example above. Or you can mix/match with the instructions above.

                  More advanced versions of either of the above are possible, including not using identical computers (which is what I do), but they require some expertise.

                  macOS: use Time Machine (built in). Windows: use Windows Backup (included with Windows) or rsync. Unix/Linux: use rsync.

                  Also, in general you should not “schedule” it, because nobody will ever verify it actually happened; then your backup will fail for some stupid reason, nobody will notice, and you will be upset. Just make running it one button you push, and then you can push it as often as you want.
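
                  The “one button” can be as small as a script you run by hand that complains loudly when something goes wrong; the paths here are only placeholders:

                  ```sh
                  #!/bin/sh
                  # backup-now: push-button backup to an external drive.
                  # Source and mount point are examples; adjust for your setup.
                  DEST=/mnt/backup-drive

                  if rsync -a --delete /home/ "$DEST/home/"; then
                      echo "backup OK: $(date)"
                  else
                      echo "BACKUP FAILED" >&2
                      exit 1
                  fi
                  ```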

                  1. 4

                    As a general rule it’s frowned upon for most of your posts on this site to be self-promotion. So far you’ve submitted three stories, all from your blog. I’d encourage you to try to submit less often from your own blog and more often from other sources.

                    1. 11

                      I disagree. This is real technical content, and while it’s not especially interesting to me personally, I’m glad that it was submitted. I regard your use of the spam flag as inappropriate, because the post is not promoting any commercial product or service. I’ve upvoted the story to compensate, and I suggest you have a look at the guidelines.

                      1. 7

                        I flagged it as spam because:

                        1. this user is clearly promoting their blog (not contributing to lobste.rs in any other meaningful way other than, heh, spamming links to their own articles)

                        2. this is a very low-effort post (“TIL about cp” and “go read about BorgBackup!”, but no actual info about how to configure/use it).

                        I suggest you have a look at the guidelines.

                        Unfortunately there’s no clear ‘guideline’ for this type of crap. So ‘spam’ is (IMHO) the best fit.

                        1. 3

                          I didn’t flag this as spam, someone else did.

                          1. 0

                            Mea culpa. I do appreciate your having commented with your opinion. I think there are a few Lobsters users who flag stories they don’t like as ‘spam’, and it’s hard to respond to them if they don’t comment.

                          2. 2

                            As technical content it’s bad. It’s bad advice that for the most part should not be followed. At best it serves as a polemic.

                            1. 1

                              Having looked a little closer at the content, I agree – I just think that a brief comment (like yours!) is a better way to respond, both to the poster and to other readers, than a flag.

                        2. 2

                          For a complete DR (disaster recovery) backup I would suggest ReaR:

                          https://github.com/rear/rear

                          It is built around a two-step backup concept.

                          In the first step it creates a minimal bootable image from your running system, which can be used to recover the complete system layout with partitions, LVM volumes, etc. That’s pretty cool, because during recovery you basically have the same system running that you used before, with the right toolset, kernel version, etc. You can also use it to migrate your system to new hardware, which works in most cases.

                          In the second step it creates regular backups via various backends, which can then be used to recover the data on the volumes.
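
                          A minimal setup, assuming an NFS share as the backup target (the URL is just an example), is roughly:

                          ```sh
                          # /etc/rear/local.conf (illustrative; BACKUP_URL is a placeholder)
                          OUTPUT=ISO                                # step 1: bootable rescue image
                          BACKUP=NETFS                              # step 2: file-level backup
                          BACKUP_URL=nfs://backup-server/srv/rear

                          # Then run:
                          #   rear mkbackup   # create the rescue image plus the backup
                          #   rear recover    # boot the rescue image on the target machine and restore
                          ```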