Serious question:
How do you guys handle backups and how often do you do it?
I know I’m not doing particularly well. Once in a blue moon I’ll copy files from my main drive onto my secondary drive. But I’m not doing anything fancy - I literally copy Documents and a few other folders and that’s it. I’m not compressing anything. I’m still keeping that secondary drive connected to my PC, so if I got a virus, all that data could be infected too. I also store some files on my Gdrive and OneDrive, but those have long since filled up and I rarely bother to go through them to delete what I don’t need anymore.
I feel whatever backup tools Windows has built in are probably worthless, but then again, I could be totally wrong on that.
Curious how real people handle this.
You are headed for disaster. Trust me, I do this for a living. One day you’re going to have a horrible surprise. I once had a guy get fired right there on a support call with me; he lost years’ worth of data because he wasn’t following good archival processes.
For consumer stuff:
1. Buy a cheap NAS; there are plenty out there. Even one with just two drives is better than nothing (that’s what I do). Splurge and get one that does RAID-5, you’ll thank me one day. By the way, I’ve used WD gear for a long time and it’s been the most reliable in my experience, even though their customer service is a shit show to deal with.
1a. A cheaper but less effective option: just buy two drives and see if your BIOS supports RAID (most modern motherboards do). If not, you can do it in your OS too, but hardware RAID is always better.
2. Subscribe to a service like Google Drive, OneDrive, Dropbox, or whatever you prefer. If you’re uncomfortable putting stuff in the cloud, encrypt it first (VeraCrypt, GPG4Win, even password-protected ZIP files); a quick sketch of the GPG route is below.
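If you go the encrypt-before-upload route, the GPG version is roughly this (just a sketch; the archive name and folders are placeholders, and GPG4Win gives you the same gpg command on Windows):

    # bundle the folders you care about into one archive
    tar -czf documents-backup.tar.gz ~/Documents ~/Pictures

    # encrypt it symmetrically with AES-256 (you'll be prompted for a passphrase)
    gpg --symmetric --cipher-algo AES256 documents-backup.tar.gz

    # upload documents-backup.tar.gz.gpg to your cloud of choice,
    # then get rid of the unencrypted archive
    rm documents-backup.tar.gz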
If you are running a business, definitely go with a good NAS, AND buy a tape library and get into a routine of rotating out the tapes and storing them off site (tapes are no use to you if your building gets broken into or burns down). And use cloud storage too.
Just a point of clarification: don’t use RAID 5 for more than 2-4 TB. A rebuild has to read so much data that the drives’ unrecoverable-read-error rate basically guarantees a read error somewhere during the rebuild, which may cause the controller to trash the array.
That and rebuilding that much data might push one of the drives over the edge anyway.
UNRAID for larger amounts?
Use a service like backblaze
Checked it out and (at least on mobile), you can’t even see what the pricing is like. Seems aimed more at businesses as well.
They do offer personal backup, and it can be fully encrypted too. The price used to be $6/month and is now $9/month, but if you prepay for 12 months they tend to cut you more of a deal.
You can complain about the price, but the fact is hard drives also cost money, and with the service you basically never have to worry about losing your stuff. Even if you have hard drives, they could all be wiped out in a fire. Cloud has its advantages.
My Internet is fast enough that I think cloud storage would be reasonable.
Basically, you run their program and it automatically backs up any new changes on your machine
I’m a pro photographer and Backblaze has saved my butt multiple times.
I email everything to myself like an old lady, and pay Google for extra storage. 🤫
All my important stuff is stored on my NAS, which runs TrueNAS. Several times a day the data is replicated to another TrueNAS box in the same rack, and at night the data is rclone’d to Wasabi object storage.
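That nightly push is nothing fancy, conceptually just a cron entry like this (a sketch; the remote name, bucket, and dataset path are made up, and the real remote gets set up beforehand with rclone config):

    # crontab: sync the important dataset to Wasabi every night at 02:00
    0 2 * * * rclone sync /mnt/tank/important wasabi-backup:my-backup-bucket --transfers 8 --log-file /var/log/rclone-nightly.log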
For stuff on my PC, like my Firefox and Thunderbird profiles, I use Macrium, and those backups get stored on my NAS.
I have a rack in my garage, all my servers and my gaming rig are in it - fiber cables run through my attic to my office to connect the front end of my gaming rig (monitor and usb c hub with peripherals).
My domain and all of my services are either VMs on VMware or XCP-ng, or Docker containers on a VM. It’s all VMs, except for the gaming rig and the Veeam backup server; those are bare metal.
VMware VMs are backed up with Veeam to a bare-metal Windows Server 2019 backup server (because it’s the backup target, it’s the only bare-metal server in the rack). The first copy goes to an internal RAID5 array. The second copy goes to an iSCSI target on a Netgear NAS in RAID5. The third copy goes to Wasabi.
Most Docker data is on the Docker VMs themselves, but any data I need to mess with on the container lives on a RAID5 Synology NAS. That gets backed up to Wasabi via Duplicati. The system backup for the bare-metal backup server also gets sent to Wasabi with Duplicati, to make recovery from a site-level disaster a little bit easier.
Everything on XCP-ng gets its first backup copy sent to an Unraid NAS with 2 parity disks, then copied to Wasabi. I will eventually add a second local target; I have some promising candidates in the shop, I just need time to diagnose and repair them.
Every Windows server and my gaming rig gets backed up direct to cloud with Redstor, since I have extra space in the NFR bucket where I work and I manage that product. The gaming rig has no other backup, so that’s nice (most game installs aren’t backed up, but everything else is). The other Windows servers are backed up either via the Veeam chain or the XCP-ng chain, but it’s nice to have a secondary backup. I would do the Linux ones as well, but Redstor sucks with Linux right now and it’s not worth the hassle.
Plex media is on another UnRaid box with 2 parity drives, but no other backup. This content can all be re-downloaded via automated systems if the array ever fails. I have had the hardware that data is on fail several times but it’s always been recoverable, knock on wood.
I am slowly working on phasing out VMware/Veeam in favor of XCP-NG, but it’s a long process.
So later this week?
I’m using Genius Scan+ on my phone and bought the cloud backup option for like $3 one-off, which enables automated exports to Dropbox, Google Drive, and a bunch of other services. Every document I receive is scanned and properly named right away, then automatically exported to both Google Drive and Dropbox.
The Dropbox client also runs on my laptop and desktop and automatically syncs new files to the local folders, so I have the original scan on my phone, plus two cloud backups, plus local copies of the cloud backups on another two devices.
The original documents are kept in physical folders, neatly stored at home.
If an important document exists only as a digital copy, I export it from my mailbox directly to Dropbox and Google Drive, so it’s the same as above minus the copy on my phone. Depending on how important it is, I might also print a copy for safekeeping and/or forward it to a secondary email in case I ever lose access to my primary.
I am terrible at it too, but slowly getting better. I recently set up restic to automatically run daily backups from the DigitalOcean droplet where I host my website to my home NAS. It’s a great tool: it can encrypt data, send it to a remote location, and it does incremental backups, so it doesn’t take up that much space.
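The daily job boils down to something like this (a sketch, assuming the NAS is reachable over SFTP; the host, paths, and password file are placeholders):

    #!/bin/sh
    # run daily from cron on the droplet; the repo was created once with `restic init`
    export RESTIC_REPOSITORY=sftp:backup@home-nas.example.com:/volume1/restic/website
    export RESTIC_PASSWORD_FILE=/root/.restic-pass

    # encrypted, incremental backup of the site data
    restic backup /var/www /etc/nginx

    # keep a bounded history so the repo doesn't grow forever
    restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune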
Long term, my plan is to do a similar setup to back up important data to a cloud (probably a separate digital ocean server). Of course, I won’t be backing up my Linux distros, only the important stuff like documents, photos, personal projects.
I’m also considering using blu-ray discs. They are convenient, can store large amounts of data (well… for me 25GB is a lot), are offline physical discs that don’t degrade that easily if properly stored. Also, once burned they are read only which is great for backups.
I sync specific locations on my PC and laptop with my Nextcloud server. This means I’ll have 3 copies. The Nextcloud server also keeps snapshots in case the wrong data is synced to all devices.
My important stuff gets backed up to a personal S3 bucket. Stuff I use regularly goes to my Google Drive as well. I’ve got a personal server that has 80TB of RAID space, but that’s data I can afford to lose.
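If you want to do the same, the simplest version is the AWS CLI on a schedule, roughly like this (a sketch; the bucket name and folders are placeholders):

    # mirror the important folders into the bucket; Infrequent Access keeps the storage cost down
    aws s3 sync ~/Documents s3://my-personal-backup/documents --storage-class STANDARD_IA
    aws s3 sync ~/Photos s3://my-personal-backup/photos --storage-class STANDARD_IA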
I bought a Synology NAS with 4 bays and set up RAID 6, which gives 2-drive failure protection. All files on my computer are automatically synced to the NAS via Synology’s self-hosted cloud drive service, which has the added benefit of file version history. The NAS is backed up to a single large drive on a regular basis, and that drive is stored off site.
I’m using Vorta with BorgBackup. It’s a set-and-forget solution with very reasonable pricing. Saved my ass a few times already.
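Vorta is basically a GUI and scheduler on top of BorgBackup, so what a run does under the hood is roughly this (a sketch; the repo URL and folders are placeholders for wherever you keep your repo):

    # one time: create an encrypted repo (here on a remote host over SSH)
    borg init --encryption=repokey-blake2 ssh://user@repo.example.com/./backups

    # each scheduled run creates a new deduplicated, encrypted archive
    borg create --stats --compression zstd \
        ssh://user@repo.example.com/./backups::'{hostname}-{now}' \
        ~/Documents ~/Pictures

    # prune old archives so the repo stays a reasonable size
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
        ssh://user@repo.example.com/./backups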
Windows’ built-in backup is actually OK; you can specify which folders are backed up and how often.
I use a NAS, just an old computer running Linux in my laundry room with a bunch of drives that I save everything to, and I back everything up monthly to a drive I keep in my drawer and a second drive I keep in my locker at work.
The easiest option is probably to get an external drive and copy everything to it periodically, keeping it unplugged so you’re safe from viruses, accidental deletion, a power supply failing and frying it, lightning, things like that.
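On Linux that periodic copy can be a single rsync when the drive is plugged in (a sketch; the mount point and folders are whatever yours are):

    # mirror the folders you care about onto the external drive;
    # --delete removes files from the copy that no longer exist on the source
    rsync -a --delete ~/Documents ~/Pictures /mnt/external-backup/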
I’m a Linux guy.

My media library lives on spinning-rust drives in my NAS. The original CDs/DVDs/Blu-rays etc. are one backup, and a copy on a USB HDD serves as another.

Most personal files on my computer are synced to my laptop and/or phone via Syncthing. This isn’t strictly speaking a “backup”, but it is redundancy that can survive some failures. I back up my desktop nightly via BackInTime (a front-end for rsync) to one of three external USB hard disks. I cycle through these drives weekly; the freshest one goes to my parents’ house as an offsite backup (I store my father’s off-site backups as well).

For applications and software, I use a tool provided by my distro that essentially keeps a list of the installed packages, which can then be bulk-installed by my distro’s package manager. I don’t really bother trying to archive my applications or take full image backups of my system, because I’ve found that doing a fresh install and then restoring a backup isn’t much slower or more complicated.
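The package-list trick is distro-specific; on a Debian/Ubuntu-style system it boils down to something like this (a sketch, assuming apt; other distros have their own equivalents):

    # save the list of explicitly installed packages next to the backups
    apt-mark showmanual > ~/backups/package-list.txt

    # after a fresh install, reinstall everything on that list
    xargs -a ~/backups/package-list.txt sudo apt-get install -y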
ALL of my backups are done locally. I use no cloud services, and the hard drives my father and I swap as off-site backups are transported physically.
I have a few different drives that I mirror my documents folder to, then I upload the most important stuff to the cloud and a thumb drive too.