On Monday 2007-05-07 at 13:55 +0100, David Bolt wrote:
> I'd use something like this:
>
> par2 c -s1024000 -c235 -l <basename.for.par2.archives> *
>
> c is to create the recovery files,
> -s1024000 gives a recovery block size of a little under 1MB,
> -c235 says to create 235 recovery blocks,
> -l limits the size of the par2 recovery files to just a bit bigger than the largest file.
>
> That should create a few recovery files which, with the par2 overheads, occupy about 235MB and leave around 15MB free.
I had already done one run using simply the default options (i.e., none, meaning 5% redundancy, except for a -m500). Even though I compiled with "-O3 -march=pentium4" it is terribly slow: almost one hour. Right now I'm running it like this (275M free):

  time nice par2 c -m500 -s1024000 -c250 -l recovery *avi
  ...
  Block size: 1024000
  Source file count: 11
  Source block count: 4305
  Recovery block count: 250
  Recovery file count: 8
  ...

Memory usage is currently:

  NI  VIRT  RES   SHR  S  %CPU  %MEM  TIME+    COMMAND
  10  250m  247m  664  R  53.3  24.4  2:26.62  par2
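For what it's worth, par2 can also work out the block count from a redundancy percentage instead of you picking -c by hand; a rough, untested sketch (the basename and file list are only examples):

  # let par2 choose the count for ~5% redundancy, ~1MB blocks,
  # recovery files no larger than the biggest source file
  par2 create -r5 -s1024000 -l recovery.par2 *.avi

  # roughly the same thing done by hand: 4305 source blocks x 5% ~= 215 recovery blocks
  par2 create -s1024000 -c215 -l recovery.par2 *.avi

With -s1024000 and -c250 the recovery data itself comes to 250 x 1024000 bytes ~= 244MiB before the par2 headers, so it should still squeeze into the 275M you have free.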
> Once it's finished, and if you're the sort of person that just has to be sure, you can verify the freshly created files using:
>
> par2 v <basename.for.par2.archives>
Yep, I did that on the DVD after burning it. It took 15 minutes.
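If you ever want to re-check the disc later, verification can be run straight off the read-only mount, since par2 only reads during a verify; a minimal sketch, assuming the DVD is mounted at /media/dvd and the "recovery" basename from above:

  cd /media/dvd
  par2 verify recovery.par2

A repair is different: the files have to be copied somewhere writable first, as described below.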
> And, if there's a failure after the contents have been burnt, copy the contents off the DVD using either dd or ddrescue, and then use:
>
> par2 r <basename.for.par2.archives>
>
> Now, the bad news is that for a dozen files, totalling a bit over 4GB, you may not be able to rebuild a broken file with only 235 blocks without rescuing as much data as possible from the DVD. My guess is that the files are around 350MB[0], which means you'd need at least 350-360 recovery blocks to rebuild a completely missing file. As long as only one file is broken, and you manage to recover more than a third of the data, there _should_ be enough to rebuild it.
Well, I assume I would be able to recover from damage of less than 5%, i.e., about 210MiB. If the damage affects only some sectors and I can read the damaged file with errors ignored (supposedly, dd_rescue does that), it should be repairable. And, in any case, it's better than nothing ;-) Also, my usual practice is to burn two copies on DVD (i.e., 2 DVDs), so if a file is damaged I can probably recover it from the other copy.
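For the record, the rescue-and-repair sequence would look something like this (untested sketch; the mount point and the name film3.avi are made up, and it assumes the .par2 files were burnt onto the same DVD):

  mkdir ~/rescue && cd ~/rescue
  cp /media/dvd/*.par2 .                            # the recovery files
  cp /media/dvd/film1.avi /media/dvd/film2.avi .    # whatever still copies cleanly
  dd_rescue /media/dvd/film3.avi film3.avi          # the damaged one: grab what is readable, skip bad sectors
  par2 repair recovery.par2

par2 then combines the readable blocks of film3.avi with the recovery blocks to rebuild the file, provided enough blocks survive.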
> There are ways to reduce this problem, and the one I chose was to limit the size of files to 100MB[1]. That, combined with my using 535 blocks, means I can have 5 completely unreadable files before I am unable to recover. And when I want to recombine the split files, I just use cat :-)
A possibility, yes...
> [0] After rounding to the nearest MB:
>     4.35GB - 250MB = 4.1GB
>     4.1GB / 12 = 350MB
Yes, your assumption is correct.
> [1] split -b 100M -a 3 -d <filename> <filename>.
>     The trailing "." on the output prefix, together with -a 3 -d, is to allow creation
>     of names in the format filename.000, filename.001, etc.
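Re-joining the pieces afterwards is just concatenation in order; a minimal sketch, assuming the numeric suffixes produced by the split command above:

  cat filename.??? > filename     # 000, 001, ... sort lexically in the right order
  md5sum filename                 # optional, if you kept a checksum of the original

Because -a 3 -d gives zero-padded numeric suffixes, the shell's glob ordering matches the numeric order, so a plain cat of the glob is enough.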
Yep, I use the basename "recovery", making it obvious what they are.

--
Cheers,
Carlos E. R.