Greg, et al --
...and then greg.freemyer@gmail.com said...
%
% On May 26, 2015 2:41:43 PM EDT, David T-G wrote:
% >
...
% > for F in ceiling ( $FREESPACE / 32G )
% > do
% > gzip -dc prepared.dev-zero.32G-bigfile.gz >/vol/tmp/BIGFILE.$F
% > done
% > rm /vol/tmp/BIGFILE.?
%
% Nifty trick but I suspect this is a better way to actually create the
% big zero filled files:
%
% dd if=/dev/zero of=/vol/tmp/BIGFILE.$F bs=10MB count=3200
Well, "better" depends on your point of view, I suppose. For me, it's much
cheaper to spend a few CPU cycles to gunzip a 140M file than to wait for
/dev/zero to spew out 32G of nulls. It's even faster than reading 32G
back from the target drive to write it again for the N>1 passes.
I'm quite happy to burn the tiny bit of extra space holding the file
in reserve, too; 140M out of 750G of scratch is not a bother.
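For the archives, the pseudocode above can be fleshed out into a runnable
sketch. This is only an illustration, not the exact script from the thread:
sizes are scaled down (1M chunks instead of 32G, capped at 3 copies), and
the target volume `/tmp/wipe-demo` and the chunk-count arithmetic are my
own assumptions about how the ceiling ( $FREESPACE / 32G ) step would look
in POSIX sh:

```shell
#!/bin/sh
set -e
# Hypothetical scratch volume; the thread's target is /vol/tmp on a
# 750G volume, with 32G chunks.
VOL=/tmp/wipe-demo
mkdir -p "$VOL"
CHUNK_MB=1

# Prepare the compressed chunk of zeros once; a file of nulls gzips
# down to almost nothing, so it is cheap to keep around.
dd if=/dev/zero bs=1M count=$CHUNK_MB 2>/dev/null \
    | gzip -c > "$VOL/prepared.zero.gz"

# Ceiling division: how many chunks are needed to cover the free space.
FREE_MB=$(df -Pm "$VOL" | awk 'NR==2 {print $4}')
N=$(( (FREE_MB + CHUNK_MB - 1) / CHUNK_MB ))
echo "need $N chunk(s) to cover ${FREE_MB}M of free space"

# Fill pass: expand copies of the prepared file, then delete them.
# (Capped at 3 copies so the demo does not actually fill the disk.)
[ "$N" -gt 3 ] && N=3
F=1
while [ "$F" -le "$N" ]; do
    gzip -dc "$VOL/prepared.zero.gz" > "$VOL/BIGFILE.$F"
    F=$(( F + 1 ))
done
rm "$VOL"/BIGFILE.*
```

The point of the design is that gunzip only has to read the tiny prepared
file per pass, instead of pulling every zero byte through /dev/zero again.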
%
% Why are you preparing a dev-zero file in advance?
Because the time saving is enormous :-)
%
% Greg
HAND
:-D
--
David T-G
See http://justpickone.org/davidtg/email/
See http://justpickone.org/davidtg/tofu.txt
--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse+owner@opensuse.org