On 5/16/05, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
--
Joe Tseng
2 GB should not be a problem for any of the native Linux FS types. You're not trying to use a FAT FS to hold your archive, are you?

Regardless, you can do something like:

   tar cf - <dir> | split -b 1000m -

to create a bunch of 1 GB files that you can then compress; you will need to pay attention to the directory you're in, etc. Maybe this will work better:

   tar cf - <dir> | ( mkdir dest_dir; cd dest_dir; split -b 1000m - )
   cd dest_dir
   gzip *

then to expand:

   cd dest_dir
   gunzip *
   cat * | tar xf -

The only trouble with the above is you have to have enough free space on your archive device to hold the full uncompressed archive (see the P.S. below for a way around that).

A truly different approach is to use rdiff-backup. It works very well if you're comfortable with the command line and potentially difficult-to-read error messages. The only real bug I have seen recently is with individual files in excess of 25 GB.

Greg
--
Greg Freemyer
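
P.S. For the ~700 MB pieces you asked about, you can also compress before you split, which avoids needing room for the full uncompressed archive on disk. A rough sketch (the "backup.tar.gz." prefix is just an example name, use whatever you like):

   tar cf - <dir> | gzip | split -b 700m - backup.tar.gz.

split will name the pieces backup.tar.gz.aa, backup.tar.gz.ab, and so on. To restore, concatenate them back into tar:

   cat backup.tar.gz.* | gunzip | tar xf -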