submitted 1 year ago by privsecfoss@feddit.dk to c/foss@beehaw.org

I am using duplicati and thinking of switching to Borg. What do you use and why?

[-] Fryboyter@discuss.tchncs.de 11 points 1 year ago

There is no such thing as the objectively best solution. Each tool has advantages and disadvantages. And every user has different preferences and requirements.

Personally, I have been using Borg for years. And I have had to restore data several times, which has worked every time.

In addition to Borg, you can also look at Borgmatic. This wrapper extends the functionality and makes some things easier.

And if you want to use a graphical user interface, you can have a look at Vorta or Pika.
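
For anyone who hasn't seen it, the core Borg workflow is only a few commands. A minimal sketch — the repo path and source directories here are made up:

```bash
# One-time setup of an encrypted repository (path is an example).
borg init --encryption=repokey /mnt/backup/borg-repo
# Create an archive named after the host and date.
borg create --stats --compression zstd \
    /mnt/backup/borg-repo::'{hostname}-{now:%Y-%m-%d}' ~/Documents ~/Pictures
# Thin out old archives.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backup/borg-repo
```

Borgmatic essentially wraps these steps in a config file so you don't have to script them yourself.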

[-] privsecfoss@feddit.dk 2 points 1 year ago

Agree. Should say 'best for you'. Cool, thanks. I know of Vorta, which I intended to use. Gonna read up on the other ones.

[-] lawliot@beehaw.org 8 points 1 year ago

I use restic. For local backups, Timeshift.

[-] ira@beehaw.org 2 points 1 year ago

Seconded. I use restic with remote blob storage and it works nicely.

[-] CjkOvPDwQW@lemmy.pt 7 points 1 year ago

I use Borg backup, mainly because there are some nice frontends for the GNOME ecosystem (when I am using GNOME, I love to use GNOME apps), and it has a nice command line for scripting when I'm using something else (I use it on servers).

[-] sudoreboot@beehaw.org 2 points 1 year ago

And there is a nice graphical frontend for it too: Vorta

[-] CjkOvPDwQW@lemmy.pt 2 points 1 year ago

Personally I'm more of a Pika Backup user ;)

[-] flux@beehaw.org 6 points 1 year ago

Kopia has served me great. I back up to my local Ceph S3 storage and then keep a second clone of that on a RAID.

Kopia has good performance, and multiple hosts can back up to it concurrently while preserving deduplication -- unlike borgbackup.
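
For context, the multi-host flow is roughly this — a sketch only, with the bucket, endpoint, and credentials as placeholders:

```bash
# Create (or, on other hosts, connect to) a repo on S3-compatible storage.
kopia repository create s3 --bucket=backups \
    --endpoint=ceph.example.local:7480 \
    --access-key=EXAMPLEKEY --secret-access-key=EXAMPLESECRET
kopia snapshot create ~/   # each host snapshots into the shared repo
kopia snapshot list        # contents are deduplicated across all of them
```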

[-] mrmanager@lemmy.today 5 points 1 year ago

I don't have backups. :/

And I will regret it some day.

[-] exu@feditown.com 3 points 1 year ago

There are two kinds of people.
Those who make backups and those who will.

[-] yote_zip@pawb.social 2 points 1 year ago

You very much will. It's easier than you'd think.

[-] derek@lemmy.one 4 points 1 year ago
  • Btrfs for local system backups based on snapshots
  • Photoprism for photos
  • Syncthing for other media
[-] flux@beehaw.org 2 points 1 year ago

You will reconsider calling that strategy a backup should the filesystem get corrupted for whatever reason.

I've tested my full system backup restore once with btrfs. Worked out fine.

[-] professed@beehaw.org 4 points 1 year ago

I started using Timeshift when it was included with a distro I was using and haven't had reason to shift away from it. Have already used it once to do a full restore.

[-] karce@wizanons.dev 4 points 1 year ago* (last edited 1 year ago)

I use btrfs snapshots and btrbk.

btrfs is a great filesystem and btrbk complements it nicely. Switching between snapshots is also really easy if something goes wrong and you need to restore.

Archwiki docs for btrfs: https://wiki.archlinux.org/title/Btrfs#Incremental_backup_to_external_drive

Of course you'd still want a remote location to back up to. You can use an encrypted volume with cloud storage, so Google Drive etc. all work.
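
If it helps, day-to-day btrbk usage is tiny once the config is in place — a sketch assuming /etc/btrbk/btrbk.conf already defines your volumes and targets:

```bash
sudo btrbk dryrun        # preview what would be snapshotted/transferred
sudo btrbk run           # take snapshots and send them to the target
sudo btrbk list backups  # see what you have
```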

[-] CoffeeBot@lemmy.ca 4 points 1 year ago

Oh interesting! I might take a look at btrbk

[-] privsecfoss@feddit.dk 2 points 1 year ago* (last edited 1 year ago)

Thanks. Heard a lot about it. Will check it.

[-] kamin@lemmy.kghorvath.com 2 points 1 year ago

This is what I do: btrfs snapshots, plus send/receive to my NAS.
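
The send/receive part, sketched out (subvolume paths and the NAS hostname are assumptions):

```bash
OLD=/home/.snapshots/home-old   # previous snapshot, already on the NAS
NEW=/home/.snapshots/home-new
sudo btrfs subvolume snapshot -r /home "$NEW"
# Incremental send: only the delta against the parent goes over the wire.
sudo btrfs send -p "$OLD" "$NEW" | ssh root@nas 'btrfs receive /backup/home'
```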

[-] brandhout@feddit.nl 2 points 1 year ago

This is the way!

[-] I_Am_Jacks_____@beehaw.org 3 points 1 year ago

I've been using restic. It has built-in dedup & encryption and supports both local and remote storage. I'm using it to back up to a local restic-server (pointing to a USB drive) and Backblaze B2.

Restores of single files or small sets of files are easy: restic -r $REPO mount /mnt. Then browse through the filesystem view of your snapshots and copy just like on any other filesystem.
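
For anyone wanting to try it, the whole setup is only a handful of commands — a sketch with a made-up bucket, host label, and paths:

```bash
export B2_ACCOUNT_ID=000xxxxxxxx B2_ACCOUNT_KEY=K000yyyyyyyy  # placeholders
export RESTIC_REPOSITORY=b2:my-bucket:my-host
export RESTIC_PASSWORD_FILE=~/.config/restic/password
restic init                         # once: creates the encrypted repo
restic backup ~/Documents ~/Photos  # deduplicated, encrypted snapshot
restic snapshots                    # list what's in the repo
restic mount /mnt                   # browse snapshots like a filesystem
```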

[-] Ekis@beehaw.org 3 points 1 year ago

I just use rsync to back up my home folder to my NAS.
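
The whole job can be a one-liner — paths and hostname here are examples:

```bash
# -a archive, -H hard links, -A ACLs, -X xattrs; --delete mirrors removals.
rsync -aHAX --delete ~/ nas:/backups/home/
```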

[-] TDCN@feddit.dk 3 points 1 year ago

Rsync is great, but if you want snapshots and file history, rsnapshot works pretty well. It's based on rsync, but on every sync it creates hard links for unchanged files and only copies changed and new files. That saves space and stays transparent to the user. FreeFileSync is also amazing.
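
Day to day it's just this (assuming /etc/rsnapshot.conf defines a "daily" retain level):

```bash
rsnapshot configtest   # validate the config first (fields are tab-separated)
rsnapshot daily        # rotate daily.0..N; unchanged files become hard links
```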

[-] esm@beehaw.org 3 points 1 year ago* (last edited 1 year ago)

What problem are you trying to solve? Please think about that, and about your backup strategy, before you decide on any specific tools.

For example, here are several scenarios that I guard against in my backup strategy:

  • Accidentally delete a file, I want to recover it quickly (snapshots);
  • Entire drive goes kablooie, I want my system to continue running without downtime (RAID)
  • User data drive goes kablooie, I want to recover (many many options)
  • Root drive goes kablooie, I want to recover (baremetal recovery tools)
  • House burns down or computer is damaged/stolen (offsite backups)
[-] furrowsofar@beehaw.org 3 points 1 year ago

I am old school. I just use GNU Tar with the pax format and multiple external, detachable, encrypted hard drives. The reason is that it's simple and a well-known, very common tool with a standard archive format.

[-] GnomeComedy@beehaw.org 2 points 1 year ago* (last edited 1 year ago)

I'm curious - how much data are you backing up with that method, and how frequently are you doing your backups? It doesn't sound like it would scale well, but I'm also wondering if maybe this is perfect and I've just been overthinking it.

[-] furrowsofar@beehaw.org 3 points 1 year ago

There is no size limit. A lot of these other methods actually use GNU Tar behind the scenes anyway. More than that, GNU Tar has been used for decades for this purpose. Pull out any Unix book from two decades ago and you will see "tar", "cpio", and "dump/restore" as the way. The newer tool out there is Pax, and in fact GNU Tar supports the new "pax" format. Moreover, GNU Tar with the pax format can back up almost the full disk structure, including hard links, ACLs, and extended attributes, which a lot of tools do not handle. It is still useful to archive some things at a lower level, like your partition table and boot blocks, of course. You also have to decide what runlevel (such as rescue) you want to archive in, and/or what services you should stop or provide separate file system dumps for, depending on your system. Databases and things like ecryptfs take some special thought (though they do for any tool). It is also good to do test restores to verify your disaster plan.

I use tar on many systems. My workstation has about 1TB of data. A backup takes about 11 hours, though I think it could be faster if I disabled compression (I currently use the standard gzip compression, which is not optimal). I think the process is CPU-bound by the compression at the moment. Going uncompressed or using parallel gzip at level 2 is probably the fastest you can do and should speed things up by 4X or more. I have played with this some for my wife, and her raw backup is a lot faster now. My wife uses USB 3 external drives plugged specifically into USB 3 ports (the ones with the SS symbol and the blue interior), with a USB 3 rated cable. I use 6TB bare SATA drives that I insert into a hot-mount enclosure and store in storage boxes. My backup system can theoretically do incrementals too, but it has some issues since I moved to BTRFS, so I do not use them at the moment. I always did before. I have an idea of how to fix it, but I need to debug and test incrementals first.

How often: I back up monthly. When my incrementals were working I used to do it weekly, or whenever I got nervous. Another option for the BTRFS file systems would be to use their native backup tools. I'm not sure, though; I like to use generic stuff. There is a lot to be said for generic.

The big downside of tar is the mind-numbing man page. Getting the options correct takes some real thought. You also have to be comfortable with the shell and Bash scripting. The big upside is that you can customize exactly what you want.
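
To give a flavor of it, the kind of invocation I mean looks something like this — a hedged sketch, with the mount point and source path made up, using pigz for the parallel-gzip idea above:

```bash
sudo tar --create --format=pax --acls --xattrs --one-file-system \
    --use-compress-program='pigz -2' \
    --file=/mnt/backupdrive/home-$(date +%F).tar.gz /home
# Test your restores: at the very least, list the archive back out.
tar --list --use-compress-program=pigz \
    --file=/mnt/backupdrive/home-$(date +%F).tar.gz | head
```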

[-] davefischer@beehaw.org 2 points 1 year ago

tar dates all the way back to the 70s.

[-] furrowsofar@beehaw.org 3 points 1 year ago

Yes, I actually did not know it went that far back, thanks. Wikipedia seems to say 1979. I know my system admin book dated 1992 talks about it, and it was common then. I think my brother used to use it in the early 1980s for his job, and maybe I did too a few times. Wikipedia says GNU Tar is newer and traces back to 1987. The formats have changed some, and there are several. The pax format is much newer; I think it was standardized in 2001, but GNU Tar would have taken time to implement it. I do not know that date.

People seem to forget that tar worked well back then and still does.

[-] davefischer@beehaw.org 2 points 1 year ago

I had the chance to play with late 70s Unix for a bit a few years ago. (Hardware on loan from a museum.) VERY minimal, but still recognizable. (Well, my Unix reflexes are old - I started in the mid 80s.)

[-] furrowsofar@beehaw.org 3 points 1 year ago

Interesting. About then I was using a VAX. Somehow I spent most of my time on other stuff until I switched to Linux around 2000.

[-] davefischer@beehaw.org 2 points 1 year ago

My first Unix was 4.3BSD on a VAX-11/750. (There was another 11/750 running VMS, but I didn't like that nearly as much.)

[-] GadgeteerZA@beehaw.org 2 points 1 year ago

I've tried alternatives but I've stuck with LuckyBackup even though there have not been any updates for a while:

  1. It's rsync-based - and rsync itself is still updated.
  2. It has masses of GUI options, including various include/exclude options, pre- and post-commands, etc.
  3. It's simple - I can browse inside the backed-up files and see what is going on, or just restore one or two files.
  4. It updates cron itself.
[-] JohannesOliver@beehaw.org 2 points 1 year ago

Multiple. Locally I have Timeshift doing btrfs snapshots every so often. This is mostly to roll back to a snapshot if something breaks. I've never had to use it (and probably should).

I use Pika backup every once in a while for a local backup to an external drive. Mostly because it's easy to restore quickly.

I have duplicacy doing backups to a cloud provider. I used to use duplicati for this, and it was fine, although I didn't like that it seems to be forever in beta. I like that duplicacy can do deduplication between backups of different machines, which most other solutions I've seen cannot. I also like its selection of cloud providers compared to Borg/Vorta and some others.

[-] scott@lem.free.as 2 points 1 year ago

ZFS snapshots and Borg(matic).

[-] ipkpjersi@lemmy.one 2 points 1 year ago* (last edited 1 year ago)

I use my own scripts with rsync etc. I don't back up the OS itself, since installing it is automated with scripts as well; I just back up the specific things I need.

> automated with scripts

Would you like to share those, or do you have references for creating such scripts? This has been on my to-do list for years, but I always struggle with where to begin.

[-] ipkpjersi@lemmy.one 2 points 1 year ago* (last edited 1 year ago)

They're very personalized to my setup, so they're not particularly useful in a general sense - I'd recommend something like this guide instead, which seems to be pretty good: https://jumpcloud.com/blog/how-to-use-rsync-remote-backup-linux-system

Learning bash has been great for me; it's helped a ton being able to automate so many different things, even just installing and configuring specific applications to work the way I want, etc.

I think a script to run by hand for manual backups, plus a different script scheduled via cronjob for automatic backups, is a great way to go.
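
Not my actual script, but the general shape of such a thing looks like this — every path here is an assumption:

```bash
#!/usr/bin/env bash
set -euo pipefail

DEST="nas:/backups/$(hostname)"
LOG="$HOME/.local/state/backup.log"

{
  date  # timestamp each run so the log shows what ran and when
  rsync -aHAX --delete --exclude='.cache/' "$HOME/" "$DEST/home/"
} >>"$LOG" 2>&1
```

The automatic variant is then just a crontab entry pointing at the same script, e.g. `0 3 * * * /usr/local/bin/backup.sh`.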

[-] LordChaos82@discuss.tchncs.de 2 points 1 year ago

For my Ubuntu desktop, I use the builtin backup tool to take backups on my NAS. For my homelab, I have everything running on Proxmox and my Proxmox backup server takes care of the homelab backups.

[-] ISOmorph@feddit.de 2 points 1 year ago

I use FreeFileSync. It's the only GUI tool I've found that lets me sync folders while omitting file deletions. It lets you create batch files from the GUI, which I execute with crontab multiple times per day.
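
The crontab side is a single line — note the binary location varies by install, and the batch file path is made up:

```
# Run the saved batch job every six hours:
0 */6 * * * /opt/FreeFileSync/FreeFileSync /home/user/backup.ffs_batch
```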

[-] Klaymore@sh.itjust.works 2 points 1 year ago

I use NixOS, so all my system configuration is already saved in my NixOS configs, which I keep on GitHub. For dotfiles that aren't managed by NixOS I use Syncthing to sync them between devices, but there's no real backup because I can just remake them if I need to. Things like my Neovim and VSCode configs are managed by my NixOS configs, so they're backed up as well.

[-] Holzkohlen@feddit.de 2 points 1 year ago* (last edited 1 year ago)

I just use a script on a systemd timer. Well, two scripts on two timers really - one running daily, one weekly, for different data. It's just a bunch of rsync commands copying folders to an HDD in my system, and I reroute the output into a simple log file, mainly to verify that it ran at all. I am a bit paranoid about that. I can also run it manually whenever I want. Oh, and some of the data I also rsync again to an SMB cloud drive from Hetzner. I do not keep multiple versions, and I delete remote files that have been deleted locally. It's just a 1:1 copy.
Oh, and I use openSUSE Tumbleweed, so I have auto-configured btrfs snapshots. Though I have not needed them yet and could not even say how to use them. I'll figure that out once I need them.
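
The core of such a script is just rsync with its output appended to a log — a sketch with example paths, plus a check that the timers actually fired (the unit name pattern is an assumption):

```bash
rsync -a --delete /home/user/documents/ /mnt/backupdisk/documents/ \
    >> /home/user/backup.log 2>&1
systemctl list-timers 'backup-*'   # verify the timers have been running
```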
