this post was submitted on 13 Jun 2023
127 points (97.7% liked)

Selfhosted


I see many posts asking about what other lemmings are hosting, but I'm curious about your backups.

I'm using duplicity myself, but I'm considering switching to borgbackup once 2.0 is stable. I've had some problems with duplicity: the initial sync took incredibly long, and once a few directories became corrupted (gpg could no longer decrypt them).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use SyncThing to share some files between my phone and other devices, so those get picked up by duplicity on those devices.
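A daily incremental run like this can be sketched as a small cron script. The paths, GPG key ID, and storage-box URL below are hypothetical placeholders, and the command is echoed rather than executed so the sketch is safe to run:

```shell
#!/bin/sh
# Daily duplicity job (sketch). SRC, DEST, and GPG_KEY are placeholders.
SRC="/srv/data"
DEST="sftp://u123@storagebox.example.com/backups"
GPG_KEY="0xDEADBEEF"

# duplicity defaults to an incremental backup once a full chain exists;
# --full-if-older-than starts a fresh full chain periodically.
CMD="duplicity --encrypt-key $GPG_KEY --full-if-older-than 30D $SRC $DEST"
echo "$CMD"   # replace the echo with the bare command to actually run it
```

Running it from a daily cron job or systemd timer matches the setup described.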

[–] davad@lemmy.world 3 points 2 years ago (1 children)

Restic, with resticprofile for scheduling and configuration. I do frequent backups to my NAS and have a second schedule that pushes to Backblaze B2.
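A minimal resticprofile configuration for that kind of dual-target setup might look like this; the repository paths, bucket name, and schedules are assumptions for illustration, not the commenter's actual config:

```toml
# profiles.toml (sketch): one profile for the NAS, one inheriting it for B2
version = "1"

[default]
repository = "/mnt/nas/restic-repo"
password-file = "restic-password.txt"

[default.backup]
source = ["/home", "/etc"]
schedule = "hourly"

[b2]
inherit = "default"
repository = "b2:my-backup-bucket:host1"

[b2.backup]
schedule = "daily"
```

resticprofile drives plain restic underneath, so restores work with either tool.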

[–] fbartels@lemmy.one 1 points 2 years ago

Another +1 for restic. To simplify backups, however, I am using https://autorestic.vercel.app/, which is triggered from systemd timers for automated backups.

[–] Oli@fedia.io 1 points 2 years ago (1 children)

In the process of moving stuff over to Backblaze. Home PCs, a few client PCs, and client websites are all pointing at it now; happy with the service and price. Two Unraid instances push the most important data to an Azure storage account, but I imagine I'll move that to BB soon as well.
Docker backups are similar to the post above: tarball the whole thing weekly as a get-out-of-jail card. This isn't ideal, but it works for now until I can give it some more attention.

*I have no link to BB other than being a customer who wanted to reduce reliance on scripts and move stuff out of Azure for cost reasons.

[–] qwacko@lemmy.nz 1 points 2 years ago (1 children)

Would I be correct to assume you are using Backblaze PC backup rather than B2?

[–] JASN_DE@feddit.de 1 points 2 years ago

A kind of "extended" 3-2-1, more a 4-3-2. As nearly everything I host runs on Docker, I usually pause the stack, .tar.bz everything and back that up on several devices (NAS, off-site machine, external HDD).

The neat thing about keeping every database in its own container is the resulting backup "package", which can easily be restored as a whole without having to mess with db dumps, permissions, etc.
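The pause, archive, unpause cycle described above can be sketched like this; the stack directory and archive path are hypothetical, and the commands are echoed via a stub rather than executed:

```shell
#!/bin/sh
# Sketch of a per-stack Docker backup. Swap the echo stub for a real
# runner (run() { "$@"; }) to execute the commands.
run() { echo "+ $*"; }

STACK_DIR="/opt/stacks/myapp"                        # hypothetical stack
ARCHIVE="/mnt/nas/backups/myapp-$(date +%F).tar.bz2"

run docker compose --project-directory "$STACK_DIR" pause
run tar -cjf "$ARCHIVE" -C "$STACK_DIR" .            # .tar.bz2 the whole stack
run docker compose --project-directory "$STACK_DIR" unpause
# then copy "$ARCHIVE" to the off-site machine and the external HDD
```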

[–] KitchenNo2246@lemmy.world 1 points 2 years ago (2 children)

I use borgbackup + zabbix for monitoring.

At home, I have all my files get backed up to rsync.net since the price is lower for borg repos.

At work, I have a dedicated backup server running borgbackup that pulls backups from my servers and stores them locally, as well as uploading to rsync.net. The local backup means restoring is faster, unless of course that dies.

[–] gabe565@lemmy.cook.gg 1 points 2 years ago

+1 for Borg! I use Borgmatic to backup files and databases to BorgBase. It costs me $80/yr for 1TB of backups which I think is sensible. I also selfhost an instance of Healthchecks.io for monitoring.
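A borgmatic config for that combination (BorgBase repo plus a monitoring ping) could look roughly like the sketch below, using the older section-style schema; the repo URL, source paths, and ping URL are placeholders:

```yaml
# /etc/borgmatic/config.yaml (sketch; values are placeholders)
location:
    source_directories:
        - /home
        - /var/lib/myservice
    repositories:
        - ssh://abc123@abc123.repo.borgbase.com/./repo
storage:
    encryption_passcommand: cat /etc/borgmatic/passphrase
retention:
    keep_daily: 7
    keep_weekly: 4
    keep_monthly: 6
hooks:
    healthchecks: https://healthchecks.example.org/ping/your-uuid
```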

[–] trashographer@vlemmy.net 0 points 2 years ago* (last edited 2 years ago)

Moved from borg to restic. Hourly backups of a 6 TB mail server are just fine.

[–] OutrageousUmpire@lemmy.world 1 points 2 years ago* (last edited 2 years ago) (1 children)

I realized at one point that the amount of data that is truly irreplaceable to me amounts to only ~500 GB. So I back this important data up to my NAS, then from there to Backblaze. I also create M-Discs: two sets, one for home and one I keep at a friend's place. Then, because "why not" and I already had them sitting around, I also back up to two SD cards, one kept on-site and one off-site.

I also back up my other data (TV/movies/music/etc.), but the sheer volume of it leaves me one option: a couple of USB hard drives I back up to from my NAS.

[–] conrad82@lemmy.world 1 points 2 years ago

I use Syncthing to sync files between my phone, PC and server.

The server runs Proxmox, with Proxmox Backup Server in a VM. A Raspberry Pi pulls the backups to a USB SSD and also rclones them to Backblaze.

Syncthing is nice. I don't back up my PC, as that is handled by the server. Reinstalling the PC requires almost no preparation; just set up Syncthing again.

[–] paco@fedia.io 1 points 2 years ago

3-2-1 strategy: 3 copies of everything important, 2 on-site, 1 in the cloud. I have a TrueNAS Scale NAS running RAID5 on ZFS. All the laptops, desktops, etc. back up to the NAS (mostly Macs, so we use Time Machine over the network). So the original laptop/desktop is 1 copy, the NAS is a second copy on-site, and then TrueNAS has lots of cloud options. I use Amazon S3 myself, but there are lots of choices.

Prior to this I had a Synology NAS. It was "small" (6TB), so it had a RAID mirror of 6TB drives and a single 6TB external USB drive holding a backup of the mirrored pair (second copy on-site). Then I also used Synology's software to back up to S3.

For my Internet-facing VMs, they all run in xcp-ng and I use Xen Orchestra to manage them. I run regular snapshots nightly, and then use NFS to copy them to a cloud server. That's sloppy, and sometimes doesn't work. So the in-the-house stuff is backed up well. The VMs are mostly relying on Xen snapshots and RAID 5.

[–] Faceman2K23@discuss.tchncs.de 1 points 2 years ago

I back up everything to my home server... then I run out of money and cross my fingers that it doesn't fail.

Honestly though my important data is backed up on a couple of places, including a cloud service. 90% of my data is replaceable, so the 10% is easy to keep safe.

[–] tomhellier@lemmy.ml 1 points 2 years ago

Cross my fingers 🤞

[–] dead@keylog.zip 1 points 2 years ago

What's my what lmao?

[–] Elbullazul@lem.elbullazul.com 1 points 2 years ago* (last edited 2 years ago) (3 children)

I run a restic backup to a local backup server that syncs most of the data (except the movie collection because it's too big). I also keep compressed config/db backups on the live server.

I eventually want to add a cloud platform to the mix, but for now this setup works fine

[–] Sekoia@lemmy.blahaj.zone 1 points 2 years ago (2 children)

Backups? What backups?

Ik it's bad but I can't be bothered.

[–] palitu@lemmy.perthchat.org 1 points 2 years ago (1 children)
[–] xavier666@lemm.ee 0 points 2 years ago

Exactly! I pray every morning.

[–] huojtkeg@lemmy.world 1 points 2 years ago (2 children)
[–] The_Traveller101@feddit.de 1 points 2 years ago

Restic is so awesome and in combination with backblaze it’s probably the most cost effective solution.

[–] tgxn@lemmy.tgxn.net 0 points 2 years ago

Aha yeah, basically the same as me, B2 is cheap as, I get a bill for less than a dollar each month. 👍

[–] minimar@fedia.io 1 points 2 years ago

I just use duplicity and upload to Google Drive.

I'm paying Google for their enterprise G Suite plan, which is still "unlimited", and I use rclone's encrypted drive target to back up everything. I have a couple of scripts that make tarballs of each service's files, and I do a full backup daily.
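The encrypted drive target can be defined as a crypt remote layered over the Google Drive remote in rclone.conf. The remote names here are made up, and the password shown would normally be generated (obscured) by `rclone config` rather than written by hand:

```ini
# rclone.conf (sketch): "gcrypt" encrypts everything pushed to gdrive:backup
[gdrive]
type = drive
scope = drive

[gcrypt]
type = crypt
remote = gdrive:backup
password = <obscured-by-rclone-config>
```

A nightly `rclone copy /srv/tarballs gcrypt:host1/` would then upload the service tarballs encrypted.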

It's probably excessive, but nobody was ever mad about the fact they had too many backups if they needed them, so whatever.

[–] linearchaos@lemmy.world 1 points 2 years ago

Irreplaceable media: NAS -> Backblaze, and NAS -> JBOD via Duplicacy for versioning.

Large ISOs that can be downloaded again: NAS -> JBOD and/or NAS -> offline disks.

Stuff that's critical leaves the house; stuff that would just cost me a hell of a lot of personal time to rebuild just gets a copy or two.

[–] bbbutch@feddit.de 0 points 2 years ago (1 children)

I back up locally to a second NAS (daily).

I use rclone crypt to back up to the cloud (Hetzner storage box, weekly).

The most important stuff I also back up to an external hard disk (from time to time, whenever I'm in the mood / have some spare time).

[–] steven@feddit.nl 0 points 2 years ago (1 children)

You basically described my backup strategy, although I back up to the Hetzner box daily too (I'm on 1 Gbit symmetric fiber, so why not).

[–] bbbutch@feddit.de 0 points 2 years ago

If I had a better upload connection than my current 10 Mbit I would also do it daily :D

[–] skimdankish2@lemmy.world 0 points 2 years ago* (last edited 2 years ago)

I use:

  • Timeshift -> local backup onto my RAID array
  • borgbackup -> BorgBase online backup
  • GlusterFS -> experimenting with replicating certain apps across 2 Raspberry Pis
[–] thegpfury@fedia.io 0 points 2 years ago

Veeam community for me. Cross backup locally between my 2 servers at home, and then a copy job to an offsite NAS.

Have had to do restorations before, and never had any issues.

[–] aucubin@lemmy.aucubin.de 0 points 2 years ago

As I have all my data on my home server in VMs, it's currently only daily backups to the NAS with Proxmox, but I should really add a remote NAS to have everything backed up in case my local NAS breaks down.

[–] Totendax@feddit.de 0 points 2 years ago

I back up an encrypted and heavily compressed archive to my local NAS and to Google Drive every night. The NAS keeps the version from the first of every month plus 7 days of prior history, and Google Drive just the latest.
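The archive step can be sketched as below; a temp directory stands in for the real data, and the gpg/rclone steps are left as comments since keys and remote names are machine-specific assumptions:

```shell
#!/bin/sh
# Nightly compressed-archive job (sketch with stand-in data).
SRC=$(mktemp -d)
echo "hello" > "$SRC/file.txt"        # stand-in for the real data
OUT=$(mktemp -d)
ARCHIVE="$OUT/backup-$(date +%F).tar.gz"

tar -czf "$ARCHIVE" -C "$SRC" .       # compress (xz -9 for heavier compression)
# gpg --encrypt --recipient backup@example.org "$ARCHIVE"   # then encrypt
# rclone copy "$OUT" gdrive:backups/                        # and upload
```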

[–] hal@sopuli.xyz 0 points 2 years ago

restic + rclone crypt + whatever storage server/service is good enough. Currently using a Hetzner storage box for my backups, because they have automatic snapshots on top of my backups.

I also use this setup for backups on servers, not only at home

[–] Bright5park@lemmy.world 0 points 2 years ago

I use rsnapshot to make incremental backups to an external hard drive, and (I know it's not a backup) run my two RAIDs (one for media, one for general data) in mirrored mode.

When I eventually upgrade my home server, I will upgrade from 2x2 2TB drives in RAID1 to four 8TB drives in either RAID5 or 6 - I am still undecided if I am willing to sacrifice 4TB of capacity to the redundancy gods and get an extra harddrive that can fail without data loss in return.

[–] ipipip@iusearchlinux.fyi 0 points 2 years ago (1 children)

I don't back up my personal files since they are all more or less contained in Proton Drive. I do run a handful of small databases, which I back up to ... Telegram.

[–] ilikedatsyuk@lemmy.world 0 points 2 years ago (1 children)

Ah, yes, the ole' "backup a database to telegram" trick. Who hasn't used that one?!?

[–] trashographer@vlemmy.net 0 points 2 years ago (1 children)

I did. Split a PGP-encrypted tarball into 2 GB files and uploaded 600 GB to Saved Messages.
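The split-and-reassemble round trip works like the sketch below; a 5 KB stand-in file and 1 KiB chunks replace the real tarball and its 2 GB parts:

```shell
#!/bin/sh
# Split a (stand-in) tarball into fixed-size chunks and reassemble it.
work=$(mktemp -d)
head -c 5000 /dev/urandom > "$work/backup.tar.gpg"    # stand-in tarball

split -b 1k -d "$work/backup.tar.gpg" "$work/part-"   # -b 2G in practice
cat "$work"/part-* > "$work/restored.tar.gpg"         # reassembly is just cat

cmp -s "$work/backup.tar.gpg" "$work/restored.tar.gpg" && echo "round trip OK"
```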

load more comments (1 replies)
[–] raphael@lemmy.mira.pm 0 points 2 years ago* (last edited 2 years ago)

I back up locally to my NAS with Synology's Drive software, and the NAS keeps a 10-day rolling snapshot of the backup folder. At first I also had Hyper Backup set up to do a versioned backup from the NAS to a cloud provider.

But I got scared of the thought that corruption could propagate through the whole backup chain. So now I do an additional backup of the most important stuff directly from my PC with restic + resticprofile to a Hetzner storage box. I know they don't make any promises about data reliability, but I think the chances of the local and remote backups breaking at the same time are pretty slim.

Restic sends a fail/done ping to an Uptime Kuma instance I host myself to monitor the backups, which then notifies me via ntfy if backups fail or are missed for a couple of days.
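Uptime Kuma's push monitors accept a simple HTTP GET with a status parameter, so the fail/done ping reduces to something like the sketch below; the URL token is a placeholder, and the backup and curl calls are stubbed with echo so the control flow can be read (and run) safely:

```shell
#!/bin/sh
# Fail/done ping to an Uptime Kuma push monitor (sketch).
PUSH_URL="https://status.example.org/api/push/abcd1234"   # placeholder token

backup()   { echo "restic backup /srv/data"; }  # stand-in for the real job
ping_url() { echo "curl -fsS $1"; }             # stand-in for curl

if backup; then
    ping_url "$PUSH_URL?status=up&msg=OK"
else
    ping_url "$PUSH_URL?status=down&msg=backup+failed"
fi
```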

[–] wpuckering@lm.williampuckering.com 0 points 2 years ago (3 children)

I run all of my services in containers, and intentionally leave my Docker host as barebones as possible so that it's disposable (I don't backup anything aside from data to do with the services themselves, the host can be launched into the sun without any backups and it wouldn't matter). I like to keep things simple yet practical, so I just run a nightly cron job that spins down all my stacks, creates archives of everything as-is at that time, and uploads them to Wasabi, AWS S3, and Backblaze B2. Then everything just spins back up, rinse and repeat the next night. I use lifecycle policies to keep the last 90 days worth of backups.

[–] palitu@lemmy.perthchat.org 1 points 2 years ago (1 children)

I like the cut of your jib!

Any details on the scripts?

[–] TheWiz@lemm.ee 0 points 2 years ago

I basically do the exact same thing. A few minutes of downtime at 5am isn't gonna bother me.

I've tested a few times nuking my containers and volumes and kicking off my ansible playbooks which redeploy and restore from s3 backups.

Haven't had to redeploy from scratch yet though

[–] sascamooch@lemmy.sascamooch.com 0 points 2 years ago

Fuck it, we ball.

I just wanted to say that I appreciate the input from everyone here. I really need to work on my backup solution and this will be helpful.

[–] thatsnothowyoudoit@lemmy.ca 0 points 2 years ago* (last edited 2 years ago)

Large/important volumes on SAN -> B2.

Desktop Macs -> Time Machine on SAN & Backblaze (for a few).

Borgbackup is great and what we used for all our servers when they were pets. It's a great tool, very easy to script and use.

[–] knaak@lemmy.world 0 points 2 years ago

I have a Raspberry Pi with an external drive and scripts that rsync each morning. Then I have S3 Glacier Deep Archive backups for off-site.

[–] matt@matts.digital 0 points 2 years ago (1 children)

All my servers use ZFS for data storage and have VPNs between each other (just /30 point-to-point links). I use zfs-snapshot to take snapshots every 15 minutes, and nightly jobs do a ZFS send to dump everything to another machine with some storage.
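The snapshot-plus-nightly-send flow can be sketched as follows; the pool/dataset names and peer host are hypothetical, and the zfs/ssh commands are echoed rather than executed:

```shell
#!/bin/sh
# ZFS snapshot + incremental send (sketch). Swap the echo stub for a
# real runner (run() { "$@"; }) to execute.
run() { echo "+ $*"; }

DS="tank/data"
SNAP="$DS@$(date +%Y%m%d-%H%M)"
PREV="$DS@last-sent"                  # snapshot from the previous send

run zfs snapshot "$SNAP"              # every 15 minutes from cron
# nightly: incremental send over the VPN to the peer's storage
run zfs send -i "$PREV" "$SNAP"
# ...in practice piped into: ssh peer zfs receive -F backup/tank-data
```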
