Sandy Drobic wrote:
> Per Jessen wrote:
>> Just thinking out loud:
>> when a drive in a 2x500GB RAID1 array fails and is replaced, how long does it take for the array to recover? I can't help worrying about the length of time in degraded mode - the longer it lasts, the higher the risk of a second drive failure. What do people do to overcome this - run 3- or 4-drive RAID1?
> How long it takes for the RAID to recover depends a lot on the load of the server. If it is not too busy, it shouldn't take more than 4-8 hours. Granted, within that span of time the disks are under heavy load and the RAID is in degraded mode.
I'm just now running a recovery of a 2 x 1TB RAID1 - the estimated time to completion is currently 420 minutes, i.e. seven hours. That's _much_ too long to be in degraded mode, IMHO.
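As a sanity check on that number, rebuild time is essentially capacity divided by sustained rebuild rate. A quick sketch (the ~40 MB/s rate is an assumed figure for a loaded SATA mirror, not a measurement from this box):

```python
# Rough RAID1 rebuild-time estimate: capacity / sustained rebuild rate.
# The 40 MB/s rate below is an assumption, not a measured value.

def rebuild_minutes(capacity_gb, rate_mb_per_s):
    """Estimated rebuild time in minutes for one RAID1 member."""
    total_mb = capacity_gb * 1000  # decimal units, as drive vendors use
    return total_mb / rate_mb_per_s / 60

# A 1 TB mirror resyncing at ~40 MB/s:
print(round(rebuild_minutes(1000, 40)))  # 417 - close to the 420 min above
```

So the 420-minute estimate is about what you'd expect if the resync is only getting ~40 MB/s; with no competing I/O a modern drive could roughly double that rate and halve the window.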
> Our new mailserver is indeed running on a RAID10: 4 x RAID1, to reduce the striping overhead and maximize I/O throughput. Disks are cheap, so why not take advantage of that?
Absolutely - but a RAID10 is still vulnerable to a dual-drive failure, and that's where I see the bigger risk when a RAID1 takes 7 hours to recover. If you don't have a hot spare, add to that whatever time it takes to replace the failed drive.
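To put that degraded-window risk in numbers, here's a back-of-envelope sketch. The 3% annualized failure rate is an assumed figure, and the model treats failures as independent - correlated failures (same batch, same enclosure, rebuild stress) would push the real risk higher:

```python
import math

# Probability that a surviving mirror member fails during the rebuild
# window, modelling failures as a Poisson process with a given annualized
# failure rate (AFR). The 3% AFR is an assumption; correlated failures
# are ignored here and make the real-world risk worse.

def p_second_failure(afr, window_hours, surviving_drives=1):
    lam = -math.log(1.0 - afr)        # failure rate per drive-year
    t = window_hours / (24 * 365)     # window expressed in years
    return 1.0 - math.exp(-lam * t * surviving_drives)

# A 7-hour rebuild vs. a hypothetical 1-hour rebuild:
print(p_second_failure(0.03, 7))   # ~2.4e-5
print(p_second_failure(0.03, 1))   # ~3.5e-6
```

The risk scales roughly linearly with time spent degraded, which is exactly the concern with a 7-hour resync - and a 3- or 4-way mirror changes the failure mode from "one more failure loses data" to "two more failures lose data".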
> And of course: don't mistake RAID for a backup solution. (^-^)
Backup is not always an option. Back to the dual-drive failure scenario - is anyone running 3- or 4-drive RAID1s?

-- 
/Per Jessen, Zürich