Mailinglist Archive: opensuse (3349 mails)

Re: [SLE] OT: Archive to multiple volumes
  • From: Greg Freemyer <greg.freemyer@xxxxxxxxx>
  • Date: Mon, 16 May 2005 13:32:18 -0400
  • Message-id: <87f94c370505161032758e9d37@xxxxxxxxxxxxxx>
On 5/16/05, Joe Tseng wrote:
> I need to archive a directory that happens to be >2GB in size... I already
> tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an
> error I suspect due to the filesystem being unable to handle such a large
> archive. Is there a way for me to create a series of sequential archival
> files of more manageable size (say ~700MB)?
> --
> Joe Tseng

2 GB should not be a problem for any of the native Linux FS types.

You're not trying to use a FAT FS to hold your archive, are you?

Regardless, you can do something like:

tar cf - <dir> | split -b 1000m -

to create a bunch of 1 GB files that you can then compress. You will
need to pay attention to the directory you're in, since split writes
its pieces (xaa, xab, ...) into the current directory.

Maybe this will work better:

tar cf - <dir> | ( mkdir dest_dir; cd dest_dir; split -b 1000m - )
cd dest_dir
gzip *

then to expand:
cd dest_dir
gunzip *
cat * | tar xf -
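
The split-then-gzip steps above can be combined into one round-trip
sketch. The /tmp paths and test file are illustrative, not from the
original post; split's default output names (xaa, xab, ...) sort
correctly, so a plain glob restores the pieces in order:

```shell
#!/bin/sh
set -e
# Illustrative source directory with a sample file.
mkdir -p /tmp/rt_src
echo "hello" > /tmp/rt_src/a.txt

# Create: stream tar into split inside the destination directory,
# then compress the resulting pieces.
tar cf - -C /tmp rt_src | ( mkdir -p /tmp/rt_dest; cd /tmp/rt_dest; split -b 1000m - )
( cd /tmp/rt_dest && gzip * )

# Expand: gunzip the pieces, then cat them in order back into tar.
( cd /tmp/rt_dest && gunzip * )
mkdir -p /tmp/rt_out
cat /tmp/rt_dest/* | tar xf - -C /tmp/rt_out
```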

The only trouble with the above is you have to have enough free space
on your archive device to hold the full uncompressed archive.
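
One way around that limitation (a sketch, not from the original post;
the /tmp paths are illustrative) is to gzip the stream *before*
splitting, so only the compressed ~700 MB pieces ever hit disk and no
full-size intermediate is needed:

```shell
#!/bin/sh
set -e
# Illustrative source and destination directories.
mkdir -p /tmp/demo_src /tmp/demo_pieces /tmp/demo_restore
echo "sample data" > /tmp/demo_src/file.txt

# Create: tar -> gzip -> split into ~700 MB pieces with a name prefix
# (backup.tgz.aa, backup.tgz.ab, ...).
tar cf - -C /tmp demo_src | gzip | split -b 700m - /tmp/demo_pieces/backup.tgz.

# Restore: concatenate the pieces in order, decompress, and unpack,
# all in one pipeline.
cat /tmp/demo_pieces/backup.tgz.* | gunzip | tar xf - -C /tmp/demo_restore
```

The trade-off is that you can no longer decompress an individual piece
on its own; the pieces are only meaningful concatenated in order.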

A truly different approach is to use rdiff-backup. It works very well
if you're comfortable with the command line and potentially
hard-to-read error messages. The only real bug I have seen recently is
with individual files in excess of 25 GB.

Greg Freemyer
