My Approach To Ubuntu Desktop Backups

What I’ve Been Using To Keep My Desktop Safe From Accidental Deletion, Disk Failure, and Other Forms of Destruction

Daniel Rosehill
Oct 8, 2020 · 10 min read
My workstation in all its glory. The desk was custom made and is two meters wide.

I recently celebrated my Linux desktop’s third “uptime” birthday (geek-to-normal-person translation: the operating system has now run for three years without needing to be reinstalled).

In light of both Linux’s overall bugginess (at least compared to Windows) and my fondness for making changes to it, that’s a bigger feat than it might at first sound.

And without any question, the key to my desktop’s relative longevity has been the implementation of a backup strategy.

A little over three years ago, I promised myself that I would never again face the time-consuming annoyance of having to install Linux from scratch. At least not out of necessity. And so far, so good.

Being able to focus on running my business without the periodic annoyance of having to start my operating system from scratch has been nothing short of game-changing. Two steps forward … and … just two steps forward.

Although I’ve written plenty about Linux backups before, I thought I would take the chance to provide another quick overview of what works for me. But this time with a shopping list, so that everything I use to protect my system from accidental deletion, power surges, and OS-bricking updates is here in one place.

I don’t pretend that this is the best Linux backup approach that’s ever been documented. I don’t doubt or deny that there are other approaches. But this is one that has worked for me. So I thought I would share.

Here’s to stable Linux desktops!

(Although I’m a desktop fan, this approach could easily be modified for laptops.)

Shopping List

My hot spare in its storage case

HARDWARE

  • 1 x extra hard drive / SSD for storing Timeshift restore points. If you’re using a laptop, you could store the backups on the primary drive itself, but best practice is to store them on some other physical medium. An external SSD would work, although you’d need to connect it regularly. In terms of capacity, it would be reasonable to make this drive the same size as your primary drive or greater. Storage is (relatively) cheap, and if you’re prepared to invest in protecting your system there’s no reason to be parsimonious about the size of the drive you buy. Approximate cost: $75.
  • 1 x extra hard drive / SSD for storing Clonezilla disk images. Yes, you could simply cram these onto the same extra hard drive as above. But using two disks decouples the backup drives and by extension the backup approaches. In other words, if one goes down, the other backups will live (without even having to resort to restoring from offsite copies). Ditto regarding the storage capacity and the cost. $75.
  • 1 x extra hard drive / SSD to keep a hot spare on. Remember what I said about not economizing on backups? The purpose of the hot spare is to have something to keep you going in the event that you experience disk failure. That’s because provisioning new storage is going to take time: you’ll probably need to visit your local computer shop to buy a new SSD at a minimum (wait, should I be keeping a spare hard drive and a spare SSD on hand?). To tide you over while you get that sorted, the hot spare is a usable and reasonably up-to-date copy of your system that you can stick in your PC and run. I don’t envision it will ever be needed, so mine’s a hard drive (even though my system runs on an SSD). I duplicate my “production” (daily use) drive onto the hot spare using Clonezilla approximately once a year. This is a low-priority backup administration task, but one that ideally shouldn’t be skipped, to keep recovery as seamless as possible. $75.
  • 1 x external SSD for storing in an offsite location. To be thorough about this (and 3–2–1 compliant) we’re going to need to mirror one copy of our backup offsite. Full disk images, which is what I offsite, are heavy. Too heavy for me to move up to the cloud with my rudimentary upload speed. So my offsite is an external SSD stored in the boot of my car. I update and rotate the backup copy monthly.
  • 1 x hard drive enclosure or NAS. You’re going to need something to write the data to the disks with. I used to keep my backup drives internal — connected to the motherboard inside my PC. These days, only my Timeshift drive is internal. I reckon that keeping the last-resort backup, the Clonezilla drive, isolated from power (and hence protected against power surges) makes more sense. At the expense of proactive disk health monitoring (which you’d get by using a NAS for this purpose), I get power surge protection (it’s a cold backup, not connected to power at all). Alternatively, you could write the Clonezilla images to a NAS — I use the Synology DS920+ — but that will be a lot pricier. Let’s factor in $50 for the enclosure.
  • 1 x USB drive. You’re also going to need a USB drive to write Clonezilla to. $5.
If you’re weird enough to be using Linux, you’re weird enough to start keeping backup tapes in your car

You’ll probably also want a couple of shockproof antistatic cases within which to store the hot spare and the onsite Clonezilla copies. The cost of these is negligible so I’m not factoring them into my calculation.

TOTAL HARDWARE COSTS:

1 TB HDDs x 3 = $225

1 x external SSD = $75

HDD enclosure x 1 = $50

USB drive x 1 = $5

$355

Note, of course, that this is a one-time capital expenditure (CapEx) rather than an ongoing operational cost (OpEx). To me, it’s a small price to pay for keeping your desktop well protected at all times.

SOFTWARE:

  • Timeshift for daily system restore backups (note: Timeshift doesn’t support encryption, but Deja Dup would be a good alternative — it backs up incrementally to an offsite target).
  • Clonezilla for disk imaging / bare metal backups (note: Clonezilla backups can be encrypted).

TOTAL COSTS: $0

Implementation

A schematic of the backup approach

Set Up Timeshift (Backup System 1)

Onsite, Incremental, Daily

Now that you’ve equipped yourself with everything needed to implement this backup approach — all for under $360 (less if you’re in the US!) — it’s time to set everything up.

Firstly, install the Timeshift drive in your desktop.

Timeshift isn’t designed to work with remote targets. So the most effective way to use this amazing program is to run the incremental backups onto either a folder on your primary drive or a secondary drive (as I’ve opted for).

I could fit more restore points on my drive, but I’ve gone with as many as I thought it made sense to store.

These are:

  • 1 x daily restore point
  • 2 x weekly restore points
  • 1 x monthly restore point

You can also configure restore points hourly and on system boot. But daily has been more than enough for me so far. A recovery point objective (RPO) of a few hours of computing is more than acceptable for me, as I don’t add new programs and configurations to my desktop that often. But if you do, then an hourly snapshot might make sense for you.
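As a side note, Timeshift also ships with a command-line interface, which is handy for grabbing an extra on-demand restore point just before a risky change. A minimal sketch (the comment text is my own; on a real system you’d drop the dry-run guard and run these as root):

```shell
# Sketch: take an on-demand Timeshift snapshot before risky changes.
# Timeshift's CLI supports --create, --comments, and --list.
# DRY_RUN=1 only echoes the commands; drop it on a real system.
DRY_RUN=1

create_cmd='timeshift --create --comments "pre-upgrade checkpoint"'
list_cmd='timeshift --list'

if [ "$DRY_RUN" = "1" ]; then
    echo "Would run: sudo $create_cmd"   # take an on-demand snapshot
    echo "Would run: sudo $list_cmd"     # list existing restore points
else
    sudo sh -c "$create_cmd"
    sudo sh -c "$list_cmd"
fi
```

Scheduled snapshots are better configured through Timeshift’s own settings; the CLI is just a convenient extra lever.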

Incremental Cloud Backup (Optional)

If you want, you can also set up a daily incremental backup to move a full disk backup offsite to the cloud.

This can be a block-level (vs. file based) backup.

If you run this every day, you’ll only be moving up the changes since you last ran the backup. Using a tool like MSP360 you can save this directly to cloud storage such as S3, B2, or Wasabi. Encryption can be enabled.

I tend to skip over this (fourth!) backup because, in the event a restore were needed from offsite, it would probably be easier to do so from a bare metal image. Restoring from MSP360 involves provisioning the operating system again.
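For what it’s worth, if you’d rather script this kind of incremental offsite yourself rather than use MSP360, a cron-driven rclone sync could do a similar job. This is my own sketch, not part of the setup above — the remote name `b2remote` and the paths are placeholders you’d configure first with `rclone config`:

```shell
# Hypothetical crontab entry: nightly incremental sync of the local backup
# drive to a Backblaze B2 bucket via rclone. "b2remote" and both paths are
# placeholders; rclone only uploads files changed since the last run.
# m  h  dom mon dow   command
30   2  *   *   *     rclone sync /mnt/backups b2remote:desktop-backups --log-file /var/log/rclone-backup.log
```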

Set up Clonezilla (Backup System 2)

Onsite, Full, Weekly

Firstly, we’re going to need to write Clonezilla onto a live USB.
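Writing the live USB can be done with any imaging tool (balenaEtcher, GNOME Disks) or, at the command line, with dd. A cautious sketch of the dd route — the ISO filename and target device below are placeholders, and the dry-run guard is there because dd will happily overwrite the wrong disk:

```shell
# Sketch: write the Clonezilla live ISO to a USB stick with dd.
# Check the target device with `lsblk` first; both values below are
# placeholders, and DRY_RUN=1 only prints the command.
ISO="clonezilla-live.iso"   # placeholder filename
DEV="/dev/sdX"              # placeholder device -- verify with lsblk!
DRY_RUN=1

dd_cmd="dd if=$ISO of=$DEV bs=4M status=progress oflag=sync"

if [ "$DRY_RUN" = "1" ]; then
    echo "Would run: sudo $dd_cmd"
else
    sudo $dd_cmd
fi
```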

Once that’s been taken care of, it’s time to take our first backups!

Next, we’ll need to format the Clonezilla drive that we’re going to keep in the enclosure. I typically use ext4, as all my systems run Linux.

After it’s been formatted, take the first disk image / bare metal backup from the desktop.

Disk imaging is a full backup approach: it copies entire partitions, or drives, at a time. I use the disk-to-image option. It takes about 10–20 minutes to run. The result is a folder on the target medium containing compressed images. You can use Clonezilla to restore this onto the source you backed up from or onto a new target entirely (e.g. after disk failure).
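Clonezilla saves each image as a timestamped directory on the target drive, so a few lines of shell are enough to check how stale your newest image is getting. This is my own sketch — the mount point is a placeholder:

```shell
# Sketch: report the newest Clonezilla image directory on the backup
# drive and its age in days. BACKUP_ROOT is a placeholder mount point.
BACKUP_ROOT="${BACKUP_ROOT:-/mnt/clonezilla-backups}"

# Newest entry first; empty if the drive isn't mounted or has no images.
newest=$(ls -1t "$BACKUP_ROOT" 2>/dev/null | head -n 1)

if [ -z "$newest" ]; then
    echo "No images found under $BACKUP_ROOT"
else
    # Age based on the image directory's modification time.
    age_days=$(( ( $(date +%s) - $(stat -c %Y "$BACKUP_ROOT/$newest") ) / 86400 ))
    echo "Newest image: $newest (${age_days} days old)"
fi
```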

Note: You can choose to encrypt your backup.

Clonezilla Backup 2

Offsite, Full, Monthly

Next we’re going to need to repeat the Clonezilla backup but onto an external SSD that we’re going to keep offsite.

The details here matter less. We could use another internal HDD/SSD. We could back up straight to the cloud if we had a good enough upload speed. We could store a physical backup in the trunk of our car. Or in a friend’s house. Or we could mail it to a relative (if you’re doing that, it’s better to rotate two disks: you post an updated backup and your relative posts the previous one back to you). Whatever option you go for, the essential thing here is to make sure that we’re storing one copy offsite.

Storing a backup tape in the car

This ensures that if things go really, really wrong (the house gets flooded or scooped up in a tornado), we’ll still have a copy of our computer to restore from.

We’d just … probably have bigger fish to fry first.

But once we’ve fried those fish, I think there’s a good chance we’d be thankful to have a relatively recent offsite copy of our desktop to restore from.

I take the offsite copy less regularly. So the RPO here is typically one month.

Set Calendar Appointments

The downside of this backup approach is that it’s pretty manual.

It’s hard to take bare metal / disk image backups through fully automated means, although I’m sure it’s possible.

If the above isn’t quite to your liking, you could relatively easily automate an incremental offsite using Cloudberry (now MSP360), which I have used successfully to back up my system incrementally to AWS S3 and B2.

Personally, when I’m thinking about backups, I’m more concerned with robustness than convenience. This approach sacrifices some ease of administration in favor of robust data protection.

And I can’t think of a more solid mechanism for protecting a Linux desktop than Clonezilla. Clonezilla, in theory at least, can get you back from total disk failure without a RAID setup (and this is essentially why I use it in parallel with Timeshift). Buy new storage of the same size as your old drive, run a restore, and you should be back in action.

Yes, my manual approach requires a bit of legwork. But you get very reliable onsite and offsite backup protection for your Linux desktop. There’s no real magic to remembering when to take the various backups.

To keep on top of the backups that I need to initiate by hand (onsite Clonezilla every two weeks, offsite copy once a month) I set up a separate Google Calendar just for backup-related reminders:

And I set up appointments for the various jobs that I need to run.

Timeshift runs itself automatically whenever I use my computer. And the Clonezilla jobs are penciled into my diary:
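If you’d rather not rely on a calendar, the same reminders could live in cron. A hypothetical sketch using desktop notifications — the schedule mirrors mine (onsite image on the 1st and 15th, offsite rotation on the 1st), and note that notify-send needs a running desktop session with the right DISPLAY/DBUS environment, which varies by distro:

```shell
# Hypothetical crontab entries mirroring the backup calendar:
# desktop notifications at 9:00 on the relevant days.
# m h dom  mon dow  command
0  9  1,15 *   *    DISPLAY=:0 notify-send "Backup reminder" "Take the onsite Clonezilla image"
0  9  1    *   *    DISPLAY=:0 notify-send "Backup reminder" "Rotate the offsite SSD"
```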

Summary of Approaches

A summary of the approaches documented here with the fourth possible offsite included too

Feedback Welcome

This backup strategy is continuously evolving — and so I document it here and on GitHub whenever there are any significant changes to report.

The latest changes: keeping the onsite Clonezilla backup on ‘cold’ (non-power-connected) media, and storing the offsite copy on an SSD in a location I have physical access to (rather than in the cloud).

Daniel’s Tech World

Technically-minded business ghostwriter Daniel Rosehill offers some how-tos, opinions, and general geekery. Particular interests: Linux, multimonitor computing, GPUs, cloud computing.

Daniel Rosehill

Written by

Marketing communications consultant interested in tech, Linux, ADHD, beer, async, and remote work (in no particular order). RosehillMarcom.com
