

What is your backup tool of choice?


I don't mean system files, but your personal and work files. I have been using Mint for a few years; I use Timeshift for system backups, but I've archived my personal files by hand. This got me curious to see what other people use. When you daily drive Linux, what are your preferred tools for keeping backups? I have thousands of pictures, family movies, documents, personal PDFs, etc. that I don't want to lose. Some are backed up to the cloud, but rather haphazardly. I would like to take a more systematic approach and use a tool that is user-friendly and easy to set up and schedule.
in reply to dustyData

Timeshift is nice to make things easy. I simply use good old-fashioned rsync tied to a cron job.
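
For anyone curious, a minimal sketch of that kind of setup (the source, destination, and schedule below are just examples, not anyone's actual config):

    # Mirror the home directory to a backup drive; -a preserves permissions and
    # timestamps, --delete makes the destination an exact mirror.
    # Add --dry-run first to see what would change without touching anything.
    rsync -a --delete /home/user/ /mnt/backup/home/

    # Then schedule it, e.g. nightly at 02:00, via `crontab -e`:
    0 2 * * * rsync -a --delete /home/user/ /mnt/backup/home/ >> "$HOME/backup.log" 2>&1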
in reply to Meow.tar.gz

This is the way. A few test runs with non-critical files are always a good idea to make sure you've got your syntax right.

in reply to Meow.tar.gz

So, just today actually, I wiped Ubuntu and installed Pop!_OS with Btrfs, basically using this walkthrough, and set up Timeshift to manage snapshots.

https://mutschler.dev/linux/pop-os-btrfs-22-04/

But that's not really a backup.

I have a backup box I use for files, with rsync and the like. I still need to figure out a full backup method to my backup location, though.

Might just set up an Ansible deployment and call it a day.
in reply to Freeman

I have to say that I used to be a Timeshift fan, but I've started moving to snapper instead. Both are very similar, but with snapper you can have multiple configs, one per subvolume, each with different settings. I like having separate root and home schedules set up, which means I can restore one or the other independently. Works a treat.
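
Roughly what that looks like, for anyone curious (the subvolume paths and snapshot number below are just examples):

    # One config per Btrfs subvolume, each with its own schedule/retention
    # stored under /etc/snapper/configs/<name>
    snapper -c root create-config /
    snapper -c home create-config /home

    # Snapshots are then listed and rolled back per config, independently
    snapper -c home list
    snapper -c home undochange 42..0 /home/user/some-file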
in reply to Lemmyin

Nice. I'll check it out for sure. The post I followed also has a link to the author's scripts to run a Btrfs snapshot before apt runs.

Frankly I just moved some configs over before I did the wipe. My Linux desktops aren’t too customized.

I had to work around his how-to a bit, since I use NVMe and a pre-partitioned disk that I had to pre-format for LVM (he used a default install run to pre-format the disks).
in reply to dustyData

I use Back In Time. It's served me well for quite a few years.
in reply to dustyData

I used to use mostly Restic, but I've since moved over to Kopia. Having the central server on the NAS and shipping those files to B2 is easy enough for my level of laziness.

in reply to dustyData

For personal files I use Borg (with Vorta) and/or Restic
in reply to dustyData

I s3 sync everything to a versioned S3 bucket out on the internets.
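
In case it helps anyone, the gist of that approach (the bucket name here is made up):

    # With versioning enabled, overwritten or deleted objects keep their old versions
    aws s3api put-bucket-versioning --bucket my-backup-bucket \
        --versioning-configuration Status=Enabled

    # Mirror local folders up; --delete mirrors deletions, but versioning keeps history
    aws s3 sync ~/Documents s3://my-backup-bucket/documents --delete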
in reply to dustyData

Restic and borg are the best I’ve tried for remote, encrypted backups.

I personally use Restic for my remote backups and rsync for my local.

Restic beats out Borg for me because there are a lot more compatible storage options.
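
To illustrate the storage flexibility (the repository strings below are just examples):

    # The workflow is identical regardless of backend; only -r changes
    restic -r /mnt/backup/restic init                     # local disk
    restic -r sftp:user@nas:/srv/restic init              # plain SFTP, no server-side software
    restic -r s3:s3.amazonaws.com/my-bucket/restic init   # any S3-compatible storage

    # Encrypted, deduplicated backup of chosen folders
    restic -r sftp:user@nas:/srv/restic backup ~/Documents ~/Pictures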
in reply to SymbolicLink

Switched to Restic because then I don’t need any extra software on the server (Synology NAS in my case).
in reply to dustyData

Kopia repo on a separate disk dedicated to backups. I have Kopia on my servers as well, sending to my local S3 gateway and a second copy to Wasabi.
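
For reference, the rough shape of that kind of setup (paths, bucket, and endpoint are assumptions; credential flags omitted):

    # Repository on the dedicated backup disk
    kopia repository create filesystem --path /mnt/backup/kopia

    # Or an S3-compatible target instead
    kopia repository create s3 --bucket my-backups --endpoint s3.us-west-1.wasabisys.com

    # Snapshot whatever needs protecting into the connected repository
    kopia snapshot create /home/user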
in reply to sneakyninjapants

Wholly off topic.

I feel like you should know about this if you don't already.

in reply to dismalnow

Not trying to out myself, but I may be one of the few people that actually owned that shirt lol
in reply to sneakyninjapants

Here we are.. fuzzing the scrapers.

I may have one on now.
in reply to dustyData

I have no relevant data locally. My Documents is a symlink to a Nextcloud directory running on my Synology NAS on a RAID1 that backs up to cloud storage via one of their tools (I forget which one).

I never liked having to back up working machines. If it breaks, I'm fine with having to install again. I won't lose data, though.
in reply to dustyData

I do 2 backups

Veeam system image daily; this is a fully bootable image of every drive on my system, kept for things like hardware failure or "oops" moments. It just goes to my NAS for fast local storage.

Online backup of important files daily; this has changed a few times, I was using Restic to B2, then Duplicati to Wasabi S3, now I'm using iDrive to see how that is.

My favorite tools are definitely Veeam and Duplicati, because they both have a good UI and are easy to use, both automatically run in the background and handle scheduling entirely on their own. Browsing snapshots is easy and finding the files you want at a specific date/time is quick.

I've used Restic and Kopia as well; they're much harder to use, especially for restores, and finding files is a nightmare via CLI. Scheduling is a pretty involved step, and you have to figure out how to run them in the background yourself. Both also performed really slowly for me on my ~3TB backup set of about 50k files, compared to Veeam and Duplicati, which are very fast.
in reply to MangoPenguin

I’ve found Restic great once dialed in. I have a systemd service run backups automatically. Super fast thanks to only backing up diffs; only the initial backup is slow.

Yes making a script and service isn’t for everyone.

Finding files in the backup is easy… you just mount the backup and search any way you want, just like any other directory. Not sure why that’s hard?
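
A minimal sketch of that kind of setup, for anyone who wants to copy the idea (unit names, repository, and paths are placeholders):

    # ~/.config/systemd/user/restic-backup.service
    [Unit]
    Description=Restic backup

    [Service]
    Type=oneshot
    Environment=RESTIC_REPOSITORY=sftp:user@nas:/srv/restic
    Environment=RESTIC_PASSWORD_FILE=%h/.config/restic/password
    ExecStart=/usr/bin/restic backup %h/Documents %h/Pictures

    # ~/.config/systemd/user/restic-backup.timer
    [Unit]
    Description=Run restic backup daily

    [Timer]
    OnCalendar=daily
    Persistent=true

    [Install]
    WantedBy=timers.target

Enable it with `systemctl --user enable --now restic-backup.timer`, and with RESTIC_REPOSITORY set, browsing really is just `restic mount /mnt/restic` followed by ordinary find/grep.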
in reply to MangoPenguin

+1 for Veeam. I am a backup administrator and this is our tool of choice. I use it for my home machines as well and it works great.

Just remember, you don’t have a backup unless you have tested it.
in reply to dustyData

I just use MegaSync, which backs up my config folder and documents folder.

On my phone, I use Syncthing to back up to my home server (I never knew Syncthing could sync over WAN), which is then synced to MegaSync. I also keep all the MegaSync files on my server, just in case MegaSync suddenly goes down one day.
in reply to dustyData

TrueNAS on an inexpensive server with RAID. I have several computers in different rooms of the house that I like to make music on, and on these PCs my network drives all have the same drive letters for the sample libraries, recordings, projects, and backups. So my projects can run from any computer without missing files. I always save locally and on the TrueNAS.
in reply to dustyData

I just map my documents, pictures and other important home folders to subfolders inside Dropbox. This propagates all of my files across all of my computers and makes everything accessible from my phone as well.

I don't worry about backing up my operating system, though important configuration file locations are also mapped into Dropbox for easily setting things up again. Complete portable apps are also located in Dropbox.
in reply to dustyData

I almost never see rdiff-backup in threads like this, so I am bringing it up now. I really like how it works: it provides incremental backups while keeping the folder structure and files directly accessible. Works well enough for me.
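
The basic usage, for anyone who hasn't seen it (paths are examples):

    # Each run leaves a browsable mirror of the latest state, plus reverse
    # diffs under rdiff-backup-data/ for the history
    rdiff-backup /home/user /mnt/backup/home

    # See which increments exist
    rdiff-backup --list-increments /mnt/backup/home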
in reply to OptimisticPrime

I love rdiff-backup.

I use it to back up a 30 TB array, and it completes in about 20 minutes if there are no changes.
in reply to average650

There's dozens of us! I started using it while I wrote my thesis, running a backup like every hour while writing.
in reply to OptimisticPrime

Absolutely - rdiff-backup onto a local mirror set of disks. As you say, the big advantage is that the last "current" entry in the backup is available just by browsing, but I have a full history just a command away.
Backups are no use if you can't access them, and people really underrate ease of access when evaluating their backup strategy.
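
That "command away" part looks roughly like this (the file path and ages are made up):

    # Restore a file as it was 7 days ago, straight out of the history
    rdiff-backup -r 7D /mnt/backup/home/Documents/report.odt ~/report-7d-ago.odt

    # And prune history you no longer need
    rdiff-backup --remove-older-than 1Y /mnt/backup/home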
in reply to dustyData

KDE user, so for my personal files I back up with Kup and bup (install both); you get the choice of a full clone or only changed files, with the option to go back in time. It integrates into the KDE taskbar/system settings.

For redundancy, I back up my main sync folder on the desktop to my laptop using Syncthing over my WiFi/network.
in reply to dustyData

Duplicity over SSH to my backup NAS, which then backs up to the iDrive cloud service weekly.

My phone and tablet are both Samsung, which use OneDrive for backups.
in reply to dustyData

I use boring old zfs snapshot + zfs send -i.
It's not pretty, but it's reliable.
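
Not pretty, but the whole thing fits in a couple of lines (pool, dataset, snapshot names, and host below are just examples):

    # Take today's snapshot, then ship only the delta since the previous one
    zfs snapshot tank/home@2024-06-01
    zfs send -i tank/home@2024-05-01 tank/home@2024-06-01 | ssh backupbox zfs receive backup/home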
in reply to dustyData

Déjà Dup is neat if you need a GUI. But TBH, you really don't need a GUI; restic will work just fine as long as you target a few folders. It mostly boils down to file/folder hygiene.

in reply to dustyData

I've used a combination of
  • Managing ZFS snapshots with pyznap
  • Plain old rsync to copy important files that happen not to be on ZFS filesystems to ZFS.
If I were doing this over today, I'd probably consider https://zrepl.github.io/ instead of pyznap, as pyznap is no longer under active development.

In the past I've used rdiff-backup, which is great but it's hard to beat copy-on-write snapshots for speed and being lightweight.
in reply to dustyData

Well, it was Duplicati, until it pulled this bullshit on me. I had a critical local failure of my data a month ago, 2.8TB lost. I pulled the backup off AWS S3 with my Linux server, asked Duplicati to restore it, and it has failed 4 times for random reasons, taking a week to get there each time. Once I can finally get this backup to restore, I'm moving over to Duplicity.
in reply to kunic

Stuff like that is why I ditched duplicati. I had to rebuild the local db that would randomly corrupt itself one too many times.
in reply to 𝓢𝓮𝓮𝓙𝓪𝔂𝓔𝓶𝓶

Exactly where my failure is. It's corrupting mid-way through the rebuild for no apparent reason.
in reply to dustyData

+1 rsync, to an external hard drive. Super fast. Also useful in case I need a backup of a single file that I changed or deleted by mistake. Work files are also backed up to the cloud on mega.nz, which is also very useful for cross-computer sync. But I don't trust personal files to the cloud.
in reply to stravanasu

Don't forget that a local backup is as bad as no backup at all in the case of a fire or other disaster. Not trusting the cloud is fine (though strong encryption can make it very safe), but looking into some kind of off-site backup is important. It could be as simple as a second hard drive that you swap out weekly and store in a safe deposit box, or a NAS at a trusted friend's house.
in reply to omeara4pheonix

Completely agree! I didn't mention it, but I keep the backup hard drive in another apartment.

This reminds me of a story from a university in England: they had two backups of a server in two different locations. One day one backup drive failed, and the second failed the day after. Apparently they were the same brand & model. The moral: also use different backup hardware brands or media!
in reply to stravanasu

3-2-1:
3 copies of your data
2 different media
1 off-site

Haven't seen that not be a good move yet.
in reply to dustyData

I use Timeshift for local backups, then Duplicati for backing up to Amazon Glacier monthly.
in reply to dustyData

Timeshift with rsync, and on occasion I Clonezilla the drive and save it to my NAS.
in reply to joel_feila

I use this, and then every 2 weeks rsync to my cold storage. For some data I also use rclone bisync to back up to the cloud, in case I need it badly while I'm on the road.
Unknown parent

Quazatron
BorgBackup is backup done right. Compressed, deduplicated, encrypted. After the initial backup, it takes only a few minutes to do a new backup. Need a specific file you deleted last week? Just mount a previous backup and get the file back. It is that simple. Love it.
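
A rough sketch of that workflow (the repo path, archive name, and file are just examples):

    # Each run creates a new deduplicated, compressed archive in the repo
    borg create --compression zstd /mnt/backup/borgrepo::'home-{now:%Y-%m-%d}' ~/Documents ~/Pictures

    # Need last week's version of a file? Mount that archive and copy it out
    borg mount /mnt/backup/borgrepo::home-2024-05-25 /mnt/borg
    cp /mnt/borg/home/user/Documents/notes.txt ~/notes.txt
    borg umount /mnt/borg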
in reply to dustyData

I almost never see FreeFileSync mentioned in these threads. It's the only GUI-based app I know of that also gives you the option not to propagate file deletions, for example. It can also be automated with crontab. Backups are not fragmented or repackaged, so you can browse them just fine. Encryption can be done with VeraCrypt.
in reply to dustyData

grsync, it's easy to use.
in reply to dustyData

External hard drive, drag & drop.
in reply to dustyData

I've recently started using proxmox-backup-client. Works well. It goes to my backup server along with my VM image backups, and works nicely with full deduplication and such. Quite good savings if you are backing up multiple machines.

I then rsync this up to the cloud once a day.
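
For context, the client side is basically one command (the repository string below is an example, not a real server):

    # Back up the root filesystem as a .pxar archive to a Proxmox Backup Server datastore
    proxmox-backup-client backup root.pxar:/ --repository backupuser@pbs@pbs.example.lan:datastore1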
in reply to philipstorry

⬆️ for rdiff-backup since it keeps the last backup easily readable.

I used to have (and I think I'll set it up again) a snapshot-capable filesystem that I rsynced my stuff to, then once a day took a snapshot of the backups. It has the advantage that all the backups are easily readable, as long as your backup filesystem is intact and your kernel can mount it.
in reply to dustyData

I have been using Borg for years. So far, the tool has not let me down. I store the backups on external hard drives that are only used for backups. In addition, I save really important data at rsync.net and at Hetzner in a Storage Box. Which is not a problem, because Borg automatically encrypts locally, and for decryption in my case you need a password and a key file.

Generally speaking, you should always test whether you can restore data from a backup, no matter which tool you use. Only then do you have a real backup. And an up-to-date backup should always additionally be stored off-site (cloud, at a friend's or relative's house, etc.), because if the house burns down, the external hard drive with the backups next to the computer is not much use.

By the way, I would advise against using just rsync because, as the name suggests, rsync only synchronizes, so you don't have multiple versions of a file. Having versions can be useful if you only notice later that a file became corrupted at some point.
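
That password + key file combination sounds like Borg's keyfile encryption mode; roughly (the repo path is an example):

    # Key is stored outside the repo (under ~/.config/borg/keys), plus a passphrase on top
    borg init --encryption=keyfile /mnt/backup/borgrepo

    # Export a copy of the key and keep it somewhere safe, or the backup is unreadable
    borg key export /mnt/backup/borgrepo ~/borg-key-backup.txt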
in reply to dustyData

GNOME Disk Utility for backing up the whole hard drive. Otherwise, I use BackInTime.
in reply to dustyData

At this moment I use too many tools.

For user data on my PC and on my home server I mostly use Duplicacy. It is fast and efficient.
All data is backed up locally to a NAS box over SFTP, and a subset of that data is backed up to S3 cloud storage.

I have a Mac; it uses Time Machine, storing data on the NAS, which is then synced to S3 cloud storage once a day.

And on top of that, VMs and containers from the home server are backed up to the NAS by Proxmox's built-in tool. These mostly exclude user data.
in reply to dustyData

An external hard drive works 100%, plus relying on my dotfiles to get everything back.

...I mean, it takes like less than 3 minutes to redownload everything and 5 to reconfigure it manually, so eh.
in reply to dustyData

Restic in the homelab and Veeam at work. I’m pretty happy with both!
in reply to dustyData

Git for projects, NAS for 3D printing stuff, mods for games and unofficial game translations, Google Photos for photos (looking to migrate away from that when I have time). I don't much care about anything else.
in reply to ErwinLottemann

Git for projects

I assume the original comment meant code-based projects, for which git, if the repo is pushed to a remote, is a very sane choice.
in reply to bellsDoSing

Yeah, git without LFS isn't optimal for non-text files.
in reply to bellsDoSing

Yep, that's what I meant. If it's a public project, it's on my GitHub, if it's a private one, it's on my private GitLab instance.
in reply to ErwinLottemann

Meaning that as long as you're regularly committing your work to Github/Gitlab/wherever, you don't need to backup your source directory.
in reply to dustyData

I use a Raspberry Pi 4 with an external HDD attached and Nextcloud installed.
in reply to dustyData

Syncthing. I don't want to invest in a NAS and add to my already greedy power bill, so I chose something decentralized. Syncthing really just works like torrents, but for your personal files: whatever happens on the computer also happens on the phone and on the laptop. Each has about 1TB of space and 3x redundancy? Hell yeah buddy, dig in.

in reply to denny

I just found out about Syncthing yesterday and it really is superb; it's so easy to use, even cross-platform. Unison is another syncing tool that I like; I find it better for bidirectional syncing.
in reply to denny

But that's not really backup, is it? It just synchronizes folders.
in reply to nis

Yes, but it is an automated backup solution if you want it to be. I just put important stuff in the Syncthing folder and rest assured it's also on the phone, in case the computer's SSD catches fire.
in reply to denny

I think you are confusing synchronizing with backup. If you delete a file in your Syncthing folder and the deletion gets synchronized, that file is lost. If you do the same in a folder backed up by, say, Borg, you can roll back the deletion and restore the file.

I may be wrong about Syncthing, though. I haven't used it yet, but will probably use it in the future. Just not for backup 😀
in reply to nis

This is true if you leave it at the defaults, but I make use of file versioning. When you flick that on, files that would otherwise be replaced or deleted are instead moved to a local .stversions folder that doesn't get synced. That is very vital, I must say, in case a host catches some ransomware, eheh.
in reply to denny

I didn't know that was a possibility. Still, it doesn't seem like what Syncthing is really intended for. I mean, they even state it in their FAQ:
No. Syncthing is not a great backup application because all changes to your files (modifications, deletions, etc.) will be propagated to all your devices. You can enable versioning, but we encourage you to use other tools to keep your data safe from your (or our) mistakes.
in reply to dustyData

I have hundreds of thousands of files that need to be backed up locally and in the cloud. I use either Vorta or Pika; both are interfaces for Borg. They're easy to use, and their deduplication manages to save a lot of disk space. I tried so many backup solutions and none worked as reliably.

Unknown parent

ScottE
I do this as well. Easy and inexpensive.
in reply to dustyData

Restic (local repo), which I sync onto a Hetzner Storage Box using rclone.
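
For reference, the sync step is a one-liner (the remote name "storagebox" is whatever you called it in `rclone config`, and the paths are examples):

    # Push the local restic repository to the Storage Box remote
    rclone sync /mnt/backup/restic-repo storagebox:backups/restic-repo --progress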
in reply to dustyData

Borg backup (via the Pika Backup frontend, a Libadwaita GNOME app) to one of my physical drives and also to borgbase.com (free tier, 10 GB free).
in reply to dustyData

Vorta (Borg GUI). It's simple to use.
in reply to dustyData

rsync (laptop -> external HDD, workstation -> dedicated backup HDD)
Syncthing (laptop <-> desktop)