this post was submitted on 04 Jul 2023
143 points (98.0% liked)

Selfhosted


I have a home server that I use to host my files. I'm worried about it breaking and losing access to the files. So what method do you use to back up everything?

[–] Anon819450514@lemmy.ca 38 points 1 year ago* (last edited 1 year ago) (4 children)

Backblaze B2. $0.005 per GB per month; you pay only for the storage you use, plus download fees when you need to retrieve your backup.

On my TrueNAS server, it's easy as pie to set up and easy as 🥧 to restore a backup when needed.

[–] andrew@lemmy.stuart.fun 7 points 1 year ago* (last edited 1 year ago)

I'll add to this that restic works amazingly with Backblaze. Plus a dozen or so other backup options.
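
A minimal sketch of restic pointed at B2 (the bucket name, paths, and key values are placeholders):

```bash
# B2 credentials and repository location (all values are placeholders)
export B2_ACCOUNT_ID="your-key-id"
export B2_ACCOUNT_KEY="your-application-key"
export RESTIC_REPOSITORY="b2:my-backup-bucket:server1"
export RESTIC_PASSWORD_FILE="/root/.restic-pass"

restic init                  # run once to create the repository
restic backup /srv/data     # incremental, deduplicated snapshot
restic forget --keep-daily 7 --keep-weekly 4 --prune   # retention
```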

[–] Wxfisch@lemmy.world 5 points 1 year ago

I also recommend B2. It's an S3-compatible service, so any backup software/scripts/plugins that work with S3 should work with Backblaze.

[–] anonymoose@lemmy.ca 4 points 1 year ago

B2 is awesome. I have Duplicati set up on OpenMediaVault to back up my OS nightly to B2 (as well as a local copy to the HDD).

[–] burndown@sh.itjust.works 3 points 1 year ago (1 children)

Maybe I'm stupid, but what is B2? A Backblaze product?

[–] anteaters@feddit.de 3 points 1 year ago (1 children)

Yes, it's their cloud storage.

[–] burndown@sh.itjust.works 2 points 1 year ago (2 children)

I didn't realize they did anything other than that!

[–] anteaters@feddit.de 2 points 1 year ago

I think they had some form of cloud computing at some point but they now focus on B2 and some backup tools that utilize B2.

[–] TheWozardOfIz@sh.itjust.works 21 points 1 year ago (5 children)
[–] dustojnikhummer@lemmy.world 4 points 1 year ago

The "small to medium business" route I see!

[–] President_Pyrus@feddit.dk 2 points 1 year ago

And using the fact that RAID is a backup!

[–] knoland@kbin.social 15 points 1 year ago* (last edited 1 year ago) (2 children)

You guys back up your server?

[–] JubilantJaguar@lemmy.world 9 points 1 year ago (1 children)

ITT: lots of the usual paranoid overkill. If you do rsync with the --backup switch to a remote box or a VPS, that will cover all bases in the real world. The probability of losing anything is close to 0.

The more serious risk is discovering that something broke 3 weeks ago and the backups were not happening. So you need to make sure you are getting some kind of notification when the script completes successfully.
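
For example, something like this in cron, with a dead-man's-switch ping that only fires on success (the remote host and ping URL are placeholders):

```bash
#!/bin/sh
# Push to a remote box; keep replaced/deleted files in a dated directory.
set -e
rsync -a --delete \
      --backup --backup-dir="deleted-$(date +%F)" \
      /srv/data/ user@remote:/backups/data/
# Only reached if rsync succeeded; a monitor alerts you if the ping stops.
curl -fsS https://ping.example/your-check-id > /dev/null
```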

[–] cnk@kbin.dk 8 points 1 year ago

Cron jobs with rsync to a Synology NAS, and then to Synology's cloud backup.
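
Something like this crontab entry, assuming SSH access to the NAS (host and paths are placeholders):

```bash
# Nightly at 02:30: mirror the data share to the Synology over SSH
30 2 * * * rsync -a --delete /srv/data/ backup@synology:/volume1/backup/server/ >> /var/log/backup.log 2>&1
```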

[–] cyborg@lemmy.dbzer0.com 8 points 1 year ago

If you need to back up less than 10 GB, you can back up your data to Backblaze B2 Cloud Storage for free with their b2 sync command. I use this in a cron job daily or hourly, depending on the data being backed up.
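
The sync itself is a one-liner; an hourly cron entry might look like this (bucket name and paths are placeholders):

```bash
# Hourly, on the hour: mirror the directory into the B2 bucket
0 * * * * b2 sync --delete /srv/important b2://my-bucket/important >> /var/log/b2sync.log 2>&1
```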

[–] satanmat@lemmy.world 7 points 1 year ago (1 children)

3-2-1

Three copies. The first is the live data on your server.

  1. Buy a giant external drive and back up to that.

  2. Off-site: Backblaze is very nice.

How to get your data around? FreeFileSync is nice.

Veeam's Community Edition may help you too.

[–] wgs@lemmy.sdf.org 8 points 1 year ago (1 children)

I'm not sure how you understand the 3-2-1 rule given how you explained it. You're stating the right stuff, but the numbered list is confusing, so just for reference for people reading this: it means your backups need to exist as

  • 3 copies of the data
  • on 2 different media
  • with 1 copy offsite
[–] jason@lemmy.weiser.social 6 points 1 year ago (1 children)

Proxmox Backup Server. It's life-changing. I back up every night and I can't tell you the number of times I've completely messed something up only to revert it in a matter of minutes to the nightly backup. You need a separate machine running it--something that kept me from doing it for the longest time--but it is 100% worth it.

I back that up to Backblaze B2 (using Duplicati currently, but I'm going to switch to Kopia), but thankfully I haven't had to use that, yet.

[–] dustojnikhummer@lemmy.world 2 points 1 year ago (4 children)

PBS backs up the host as well, right? Shame Veeam won't add Proxmox support. I really only back up my VMs and some basic configs.

[–] DemonSlayerB@lemmy.world 2 points 1 year ago

Veeam has been pretty good for my Hyper-V VMs, but I do wish I could find something a bit better. I've been hearing a lot about Proxmox lately and wonder if it's worth switching to. I'm an MS guy myself, so I just used what I know.

[–] Curious_Canid@lemmy.ca 5 points 1 year ago (2 children)

My server runs Plex and has almost 50 TB of video on it. After looking at all the commercial backup options I gave up on backing up that part of the data. :-(

I do back up my personal data, which is less than a terabyte at this point. I worked out an arrangement with a friend who also runs a server. We each have a drive in the other's server that we use for backup. Every night cron runs a simple rsync script to do an incremental backup of everything new to the other machine.

This approach costs nothing beyond getting the drives. And we will still have our data even if one of the servers is physically destroyed and unrecoverable.

[–] lom@sh.itjust.works 2 points 1 year ago (1 children)

Oh, that arrangement with the friend's server is a good idea. Mutual benefit at little extra cost.

[–] Wxfisch@lemmy.world 2 points 1 year ago (1 children)

I also have a decent amount of video data for Plex (not nearly 50 TB, but more than I want to pay to back up). I figure if worst comes to worst I can rip my DVDs/Blu-rays again (though I'd rather not), so I only back up the file storage on my NAS that my laptops and desktop back up to. It's just not worth the cost to back up data that's fairly easy to replace.

[–] Hizeh@hizeh.com 5 points 1 year ago

Restic to multiple repositories, local and remote.

[–] BigDev@lemmy.world 5 points 1 year ago

I am lucky enough to have a second physical location to store a second computer, with effectively free internet access (as long as the data volume stays low, under about 1 TB/month).

I use the ZFS file system for my storage pool, so backups are as easy as a few commands in a script, triggered every few hours, that takes a ZFS snapshot and tosses it to my second computer via SSH.
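
A rough sketch of that script, assuming the initial full send has already been done (pool/dataset names and the host are placeholders):

```bash
#!/bin/sh
# Snapshot-and-ship: send everything since the previous snapshot offsite.
set -e
NOW=$(date +%Y%m%d-%H%M)
# newest existing snapshot of the dataset (sorted by creation time)
PREV=$(zfs list -H -t snapshot -o name -s creation tank/data | tail -1)
zfs snapshot "tank/data@$NOW"
# incremental stream from PREV to NOW, received unmounted on the far side
zfs send -i "$PREV" "tank/data@$NOW" | ssh backup@offsite zfs receive -u backup/data
```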

[–] Ferawyn@lemmy.world 5 points 1 year ago

Various different ways for various different types of files.

Anything important is shared between my desktop PCs, servers, and my phone through Syncthing. Those Syncthing folders are all also shared with two separate servers (in two separate locations) with hourly, daily, weekly, and monthly volume snapshotting. Think your financial records, work files, anything you produce or write, your main music collection, etc. It's also a great way to keep your music in sync between your desktop PC and your phone.

Servers have their configuration files, /etc, /var/log, /root, etc. rsynced every 15 minutes to the same two backup servers, also to snapshotted volumes. That way, should any one server burn down, I can rebuild it in a trivial amount of time. The same goes for user profiles, document directories, ProgramData, and anything non-synced on Windows PCs.

Specific data sets, like database backups and repositories, are also generally rsynced regularly, some to snapshotted volumes, some to regular ones, depending on the size and volatility of the data.

Bigger file shares, like movies, TV shows, etc., I don't back up, but they're stored on a distributed GlusterFS, so losing any one server doesn't lose me everything just yet.

Hardware will fail, sooner or later. You should see any one device as essentially disposable, and have anything of worth synced and archived automatically.

[–] mariom@lemmy.world 5 points 1 year ago (1 children)

Autorestic, a nice wrapper for restic.

Data goes from one server to a second server, and vice versa (different provider, different geolocation), and also to Backblaze B2, which is as far as I know the cheapest S3-like storage.

[–] JessMarie@lemmy.world 5 points 1 year ago

So what method do you use to back up everything?

Depends on what OS that server is running. Windows, Unraid, Linux, NAS (like Synology or QNAP), etc.

There are a bazillion different ways to back up your data but it almost always starts with "how is your data being hosted/served?"

[–] greenhan3le@lemmy.world 5 points 1 year ago

Borgbackup to BorgBase.

Rock solid for years.

Also Borgbackup to cloud storage on rsync.net.

[–] redcalcium@c.calciumlabs.com 4 points 1 year ago

The simplicity of containerized setup:

  • docker-compose and Kubernetes YAML files are preserved in a git repo
  • nightly cron to create database dumps
  • nightly cron to run rsync to backup data volumes and database dumps to rsync.net
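
A minimal sketch of those nightly jobs (the container, database, and account names are placeholders):

```bash
#!/bin/sh
# Nightly: dump the database, then ship volumes and dumps to rsync.net.
set -e
docker exec postgres pg_dump -U app appdb | gzip > "/srv/dumps/appdb-$(date +%F).sql.gz"
rsync -a --delete /srv/volumes/ user@user.rsync.net:backups/volumes/
rsync -a /srv/dumps/ user@user.rsync.net:backups/dumps/
```
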
[–] TheWiz@lemm.ee 4 points 1 year ago* (last edited 1 year ago)

I run everything in Docker. I have an Ansible playbook that backs up all the Docker volumes to a MinIO server I'm running on a separate machine. I periodically upload backups to iDrive e2 with the same playbook.

Hourly backups with Borg, nightly syncs to B2. I've been playing around with ZFS snapshots too, but I don't rely on them yet.

[–] humancrayon@sh.itjust.works 4 points 1 year ago* (last edited 1 year ago)

I have everything in its own VM, and Proxmox has a pretty awesome built-in backup feature. Three different backup targets (one night to my NAS, the next night to an on-site external drive, the next to an external drive that's swapped weekly with one kept at work). I don't back up the Proxmox host, because reinstalling it should it die completely is not a big deal. The VMs are the important part.
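
The built-in scheduler drives vzdump under the hood; the CLI equivalent looks roughly like this (the VM ID and storage name are placeholders):

```bash
# Snapshot-mode backup of VM 101 to a storage target named nas-backups
vzdump 101 --mode snapshot --compress zstd --storage nas-backups
```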

I have a mini PC I use to spot-check VM backups once a month (full restore on its own network, check it's working, delete the VM after).

For my Plex NAS, I only back up the movies I really care about (everything else I can "re-rip from my DVD collection").

[–] conrad82@lemmy.world 3 points 1 year ago

I use a Proxmox server and Proxmox Backup Server (in a VM 🫣) to do encrypted backups.

A Raspberry Pi has SSH access to PBS; it rsyncs all the files and then uploads them to Backblaze using rclone.

https://2.5admins.com/ recommended "pull" backups, so if someone hacks your server they don't have access to your backups. If the Pi is hacked it can mess with everything, but the idea is that it has a smaller attack surface (just SSH).

PS: if you rclone a lot of files to Backblaze, use https://rclone.org/docs/#fast-list or it will get expensive.
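
The Pi's job might look something like this (the hosts, paths, and remote names are placeholders):

```bash
#!/bin/sh
# Pull from PBS, then push to B2; the server never holds backup credentials.
set -e
rsync -a pbs@pbs-host:/backups/ /mnt/staging/
# --fast-list batches listings and cuts down class-C transaction costs
rclone sync /mnt/staging/ b2:my-bucket/pbs --fast-list
```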

[–] hank_and_deans@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

Borgbackup, using borgmatic as a frontend, to a storage VPS. I back up dozens of machines this way. I simply add a user account for each machine on the VPS, and each machine backs up over SSH to its own account.
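
Per machine, borgmatic is driving Borg commands roughly like these (the host, repo path, and retention numbers are placeholders):

```bash
# One-time repository setup on that machine's VPS account
borg init --encryption=repokey host1@vps.example:backups

# What each run boils down to: a dated archive plus retention
borg create --stats --compression zstd \
    host1@vps.example:backups::'{hostname}-{now}' /etc /home /srv
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 host1@vps.example:backups
```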

[–] tj@kbin.chat 3 points 1 year ago

restic backup to Azure and Backblaze

[–] nnullzz@kbin.social 3 points 1 year ago

Running a Duplicacy container backing up to Google Drive for some stuff and Backblaze for most other data. Been using it for a couple of years with no issues. The GUI and scheduling are really nice too.

[–] wgs@lemmy.sdf.org 3 points 1 year ago* (last edited 1 year ago)

For config files, I use tarsnap. Each server has its own private key and an /etc/tarsnap.list file which lists the files/directories to back up on it. Then a cron job runs every week to run tarsnap on them. It's very simple to back up and restore, as your backups are simply tar archives. The only caveat is that you cannot "browse" them without restoring them somewhere, but for config files that's pretty quick and cheap.
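
The weekly job amounts to something like this (the key path and archive naming are placeholders):

```bash
# Create a dated archive from the per-server file list
tarsnap --keyfile /root/tarsnap.key \
        -c -f "$(hostname)-$(date +%Y%m%d)" \
        -T /etc/tarsnap.list
# Restoring is just tar in reverse, e.g.:
#   tarsnap -x -f "$(hostname)-20240101" -C /
```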

For actual data, I use a combination of rclone and dedup (because I was involved in the project at some point, but it's similar to Borg). I sync it to backblaze because that's the cheapest storage I could find. I use dedup to encrypt the backup before sending it to backblaze though. Restoration is very similar to tarsnap:

dup-unpack -k keyfile snapshot-yyyymmdd | tar -C / -x [files..]

Most importantly, I keep a note on how to back up/restore: Backup 101

[–] please_lemmy_out@lemmy.world 3 points 1 year ago (1 children)

I use Duplicati and back up my server to both another PC and the cloud. Unlike a lot of data hoarders, I take a pretty minimalist approach, backing up only the core (mostly Docker) configs and the OS installation.

I have media lists but to me all that content is ephemeral and easily re-acquired so I don't include it.

[–] bookworm@feddit.nl 3 points 1 year ago

Duplicati is great in many ways, but it's still considered beta by its developers. I would not trust it if the data you back up is extremely important to you.

[–] lp0101@lemmy.world 3 points 1 year ago (2 children)

Rsnapshot on a second server, keeping 7 daily backups, 4 weekly backups, and 6 monthly backups.

[–] Bill@lemm.ee 3 points 1 year ago

Veeam Agent going to a NAS on-site and the NAS is backed up nightly to IDrive because it's the cheapest cloud backup service I could find with Linux support. It's a bit slow, very CPU-bound, but it's robust and their support is pretty responsive.

[–] f1g4@feddit.it 2 points 1 year ago (2 children)

A simple script using duplicity to FTP data to my private website with unlimited storage. I can't say if it's good or not; it's my first time doing it.

[–] NSA_Server_04@lemmy.world 2 points 1 year ago

Using ESXi as a hypervisor, so I rely on Veeam. I have copy jobs to take it from local to an external drive, plus a copy up to the cloud.

[–] hitagi@ani.social 2 points 1 year ago

Cronjobs and rclone have been enough for me for the past year or so. Interestingly, I've only needed to restore from a backup once after a broken update. It felt great fixing that problem so easily.
