On Tue, 12 Nov 2024 10:09:01 +0100 Roger Oberholtzer wrote:
On Mon, Nov 11, 2024 at 10:34 PM David C. Rankin <drankinatty@gmail.com> wrote:
On 11/11/24 2:43 PM, Dave Howorth wrote:
I just read a newspaper article about house fires, and it struck me that one of the things that might well be lost in such a mishap would be my financial records, etc., that are stored on my computer.
Having just lived through this scenario, where my office was lost, offsite backups are critical.
For 20 years I had a simple cron job that called rsync to sync servers between home and work. As the firemen were up on top of the 2nd floor with a 60-foot hook and ladder, chainsaws running to cut holes in the roof and the 900 HP turbo-pumps blasting 1200 gal/min down the hole, I was glad to have the data mirrored.
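A setup like that can be as small as one rsync call fired from cron. Here is a minimal sketch in Python of such a mirror job; the hostname, paths, and crontab schedule are made-up examples, not the actual configuration:

    #!/usr/bin/env python3
    """Minimal rsync mirror meant to be run from cron, e.g.:
        30 2 * * *  /usr/local/bin/mirror_home_work.py
    The hostname and paths below are examples only."""
    import subprocess
    import sys

    SRC = "/srv/data/"                                  # local tree to mirror (example path)
    DST = "backup@work.example.com:/srv/mirror/home/"   # remote target (example host)

    def main():
        # -a: preserve permissions/times/links, -z: compress, --delete: keep a true mirror
        result = subprocess.run(["rsync", "-az", "--delete", SRC, DST])
        sys.exit(result.returncode)

    if __name__ == "__main__":
        main()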
I'm a big fan of rsync. My worry in this case is that if something goes wrong on the main system (pre-fire), those bad changes could conceivably overwrite the good copies on the backup. Incremental backups to a new file each time are perhaps safest in that respect.
Having said that, I also use rsync to back up from one work office to another, so that should disaster strike I'm more prepared. But the potential for rsync to propagate bad data worries me. I'm planning on switching to a backup system that does occasional full backups and regular incremental ones. My data to back up (on local RAID storage) is currently ~250 GB of core business data (must be safe!) and ~1 TB of makes-life-easier-if-it-exists data, so an efficient backup solution is needed. rsync does work well. Except for my concern...
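One common way to get that incremental behaviour while keeping rsync is its --link-dest option: each run writes into a fresh dated directory and hard-links unchanged files to the previous run, so a bad sync never overwrites the only good copy. A rough sketch; the source path, backup root, and "latest" symlink layout are examples only, not a drop-in solution:

    #!/usr/bin/env python3
    """Dated rsync snapshots using --link-dest, so a bad sync never
    overwrites the previous copy. Paths are examples only."""
    import datetime
    import pathlib
    import subprocess

    SRC = "/srv/business-data/"                      # example source
    BACKUP_ROOT = pathlib.Path("/backup/snapshots")  # example destination root

    def main():
        dest = BACKUP_ROOT / datetime.date.today().isoformat()
        latest = BACKUP_ROOT / "latest"   # symlink to the most recent snapshot

        cmd = ["rsync", "-a", "--delete", SRC, str(dest)]
        if latest.exists():
            # Unchanged files become hard links into the previous snapshot,
            # so each dated directory is a full tree but costs little space.
            cmd.insert(2, f"--link-dest={latest.resolve()}")
        subprocess.run(cmd, check=True)

        # Point "latest" at the snapshot we just made.
        if latest.is_symlink():
            latest.unlink()
        latest.symlink_to(dest)

    if __name__ == "__main__":
        main()

Restoring an older version is then just copying the file out of the dated directory where it was last good, and disk usage stays close to one full copy plus the accumulated changes.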
I run a daily python backup script to a btrfs-formatted, externally mounted hard drive. The initial backup is done by rsync, followed by a btrfs snapshot of that backup, followed by a btrfs delete of the oldest snapshot (currently the one from 182 days ago). I think this offers some protection against the rsync problem of copying errors to the backup. It also provides incremental backups for restoring older versions of files.
Bob
--
Bob Williams
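For illustration, a daily rsync-plus-btrfs-snapshot job of this kind might look roughly like the sketch below; the mount points, subvolume layout, and 182-day retention are placeholders, not Bob's actual script:

    #!/usr/bin/env python3
    """Sketch of a daily backup: rsync into a btrfs subvolume, take a
    read-only snapshot, then prune snapshots older than the retention
    window. Paths, subvolume names, and retention are examples only."""
    import datetime
    import pathlib
    import subprocess

    SRC = "/home/"                                   # example source tree
    BACKUP = pathlib.Path("/mnt/backup/current")     # assumed to already exist as a
                                                     # btrfs subvolume (btrfs subvolume create)
    SNAPDIR = pathlib.Path("/mnt/backup/snapshots")  # where dated snapshots live
    KEEP_DAYS = 182                                  # retention window

    def main():
        today = datetime.date.today()
        SNAPDIR.mkdir(parents=True, exist_ok=True)

        # 1. Mirror the live data into the "current" subvolume.
        subprocess.run(["rsync", "-a", "--delete", SRC, str(BACKUP)], check=True)

        # 2. Freeze today's state as a read-only btrfs snapshot.
        snap = SNAPDIR / today.isoformat()
        subprocess.run(["btrfs", "subvolume", "snapshot", "-r",
                        str(BACKUP), str(snap)], check=True)

        # 3. Delete snapshots older than the retention window.
        cutoff = today - datetime.timedelta(days=KEEP_DAYS)
        for old in SNAPDIR.iterdir():
            try:
                if datetime.date.fromisoformat(old.name) < cutoff:
                    subprocess.run(["btrfs", "subvolume", "delete", str(old)],
                                   check=True)
            except ValueError:
                pass  # skip anything that isn't a dated snapshot

    if __name__ == "__main__":
        main()

Because the snapshots are read-only, an error that rsync copies into the live mirror today can still be recovered from an earlier snapshot until it ages out of the retention window.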