Hi Folks,

BLUF: How do you back up large data stores?

Say I've got 300 TB of on-line data. Most of it is fairly static, while some of it changes frequently on a daily basis. I do daily incremental backups of the dynamic areas to disk filesystems on separate computers, with a monthly full disk image stored off-line at an off-site location. But I'm worried about the bulk of the data that isn't being backed up.

Most of the data consists of lots of binary files stored on multiple hardware RAID-6 arrays. The arrays have hot-spare disks, and I've got spares on the shelf. But as the graybeards know, reliable RAID isn't backup!

One technology being considered is LTO tape. LTO-8 is due out any day now and claims to store 12 TB native. A drive or two with a tape library might fill the requirement.

Does anyone have any thoughts/advice? What do clouds like Google and AWS do for backups?

Thanks,
Lew
-- 
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse+owner@opensuse.org
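As a rough sanity check on the LTO idea, here is a minimal sizing sketch. It assumes the vendor figures for LTO-8 (12 TB native per cartridge, roughly 360 MB/s native streaming rate for a full-height drive); real-world throughput depends on keeping the drive streaming, and compression gains on already-compressed binary data may be negligible, so these numbers are best-case.

```python
import math

# Figures from the post and LTO-8 vendor specs (assumptions, not measurements)
TOTAL_TB = 300         # on-line data to protect
LTO8_NATIVE_TB = 12    # advertised native capacity per cartridge
DRIVE_MB_S = 360       # advertised native streaming rate, full-height drive

# Cartridges needed for one full copy at native capacity
tapes_full = math.ceil(TOTAL_TB / LTO8_NATIVE_TB)
print(f"Full backup: {tapes_full} LTO-8 cartridges at native capacity")

# Wall-clock time for one drive streaming continuously
# (tape vendors count decimal units: 1 TB = 1e6 MB)
seconds = TOTAL_TB * 1e6 / DRIVE_MB_S
print(f"Single-drive streaming time: {seconds / 3600:.0f} hours "
      f"({seconds / 86400:.1f} days)")
```

At best-case speeds a single drive needs well over a week for one full pass, which is why a library with two or more drives (and a scheme that only re-reads the static bulk occasionally) tends to be the practical shape of such a setup.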