OT: Archive to multiple volumes
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?

-- 
Joe Tseng
"I can be Googled. Therefore I am."
On 5/16/05, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
--
Joe Tseng
2 GB should not be a problem for any of the native Linux FS types. You're not trying to use a FAT FS to hold your archive, are you?

Regardless, you can do something like:

tar cf - <dir> | split -b 1000m -

to create a bunch of 1 GB files that you can then compress; you will need to pay attention to the directory you're in, etc.

Maybe this will work better:

tar cf - <dir> | ( mkdir dest_dir; cd dest_dir; split -b 1000m - )
cd dest_dir
gzip *

then to expand:

cd dest_dir
gunzip *
cat * | tar .....

The only trouble with the above is that you have to have enough free space on your archive device to hold the full uncompressed archive.

A truly different approach is to use rdiff-backup. It works very well if you're comfortable with the command line and potentially difficult-to-read error messages. The only real bug I have seen recently is with individual files in excess of 25 GB.

Greg
-- 
Greg Freemyer
On Mon, 2005-05-16 at 13:32 -0400, Greg Freemyer wrote:
On 5/16/05, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
--
Joe Tseng
2 GB should not be a problem for any of the native Linux FS types.
You're not trying to use a FAT FS to hold your archive, are you?
Regardless, you can do something like:
tar cf - <dir> | split -b 1000m -
to create a bunch of 1 GB files that you can then compress; you will need to pay attention to the directory you're in, etc.
Maybe this will work better
tar cf - <dir> | ( mkdir dest_dir; cd dest_dir; split -b 1000m - )
cd dest_dir
gzip *
then to expand:
cd dest_dir
gunzip *
cat * | tar .....
The only trouble with the above is you have to have enough free space on your archive device to hold the full uncompressed archive.
A truly different approach is to use rdiff-backup. It works very well if you're comfortable with the command line and potentially difficult-to-read error messages. The only real bug I have seen recently is with individual files in excess of 25 GB.
Look into using mondo for the backup; you can set the size of the archive files it creates. It is included in SuSE 9.3.

-- 
Ken Schneider
UNIX since 1989, linux since 1994, SuSE since 1998
"The day Microsoft makes something that doesn't suck is probably the day they start making vacuum cleaners." -Ernst Jan Plugge
That's what I'm doing now and it seems to be doing what I need. However,
going forward, how will someone who will need to get to this be able to
access the data? I saw that the files generated were labeled xaa, xab, xac,
etc. Are these just tar files? Can they just be copied to a directory and
untarred with "tar xf xa*"?
On 5/16/05, Greg Freemyer
Regardless, you can do something like:
tar cf - <dir> | split -b 1000m -
to create a bunch of 1GB files that you can then compress, you will need to pay attention to the directory your in, etc.
Joe Tseng wrote:
That's what I'm doing now and it seems to be doing what I need. However, going forward, how will someone who will need to get to this be able to access the data?
Remember - tar is Tape ARchive - it was built to deal with archives spanning multiple volumes. When you start untar'ing at file #1, tar will simply ask for the next "volume" when it is ready.

-- 
/Per Jessen, Zürich
I'm not saving the archive to tape; I'm leaving it on my hard drive until
told otherwise. So on top of splitting up the archive, I also need to give
each file unique names. So far "split" is doing the trick...
On 5/16/05, Per Jessen
Joe Tseng wrote:
That's what I'm doing now and it seems to be doing what I need. However, going forward, how will someone who will need to get to this be able to access the data?
Remember - tar is Tape ARchive - it was built to deal with archives spanning multiple volumes. When you start untar'ing at file #1, tar will simply ask for the next "volume" when it is ready.
-- 
Joe Tseng
"I can be Googled. Therefore I am."
Per Jessen wrote:
Joe Tseng wrote:
That's what I'm doing now and it seems to be doing what I need. However, going forward, how will someone who will need to get to this be able to access the data?
Remember - tar is Tape ARchive - it was built to deal with archives spanning multiple volumes. When you start untar'ing at file #1, tar will simply ask for the next "volume" when it is ready.
All he needs now, are some 9 track tape drives. ;-)
On Monday 16 May 2005 3:17 pm, James Knott wrote:
All he needs now, are some 9 track tape drives. ;-)

No, we need to have random access tape drives (like DECTapes).

-- 
Jerry Feldman
Boston Linux and Unix user group http://www.blu.org PGP key id:C5061EA9 PGP Key fingerprint:053C 73EC 3AC1 5C44 3E14 9245 FB00 3ED5 C506 1EA9
On Monday 16 May 2005 21:36, Joe Tseng wrote:
That's what I'm doing now and it seems to be doing what I need. However, going forward, how will someone who will need to get to this be able to access the data? I saw that the files generated were labeled xaa, xab, xac, etc. Are these just tar files? Can they just be copied to a directory and untarred with "tar xf xa*"?
You have to paste them together first. They're just pieces of a split file (it could be anything; the split command is not specific to archives):

cat xa* > archive.tar.gz
On Monday 16 May 2005 1:20 pm, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?

The 2GB limit has been fixed. Another thing: you can use "tar czf" for .gz and "tar cjf" for bz2 archives.

tar czf foobar.tar.gz <dir>
tar cjf foobar.tar.bz2 <dir>
You should not have a problem with the 2.6 kernel (SuSE 9.2+).
For the older systems, take a look at the Large Filesystem Support
information.
http://www.suse.de/~aj/linux_lfs.html
--
Jerry Feldman
On Mon, 2005-05-16 at 13:20 -0400, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
tar has a command-line switch, -z, to automatically gzip its output; generally the extension is .tgz, which signifies a gzip'ed tar file. Perhaps you could be a little more selective when creating the tar file with
Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
You might want to read about the tar -L and -M options, for ideas.
Joe: El Lun 16 May 2005 12:20, Joe Tseng escribió:
Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
For backing up and archiving to multiple slices I like the dar backup utility very much. There is even a very nice graphical frontend for your KDE desktop. You don't say which version of SuSE you are using, and I don't know if binary rpm archives are being provided; I have installed both the dar libraries and the kdar graphical frontend from source. Look here:

DAR: http://dar.linux.free.fr/
KDAR: http://kdar.sourceforge.net/

Regards,
-- 
Andreas Philipp
Noema Ltda.
Bogotá, D.C. - Colombia
http://www.noemasol.com
On Mon, May 16, 2005 at 01:20:58PM -0400, Joe Tseng wrote:
I need to archive a directory that happens to be >2GB in size... I already tried to use just plain "tar cf - <dir> | gzip > foobar.tar.gz" but I got an error I suspect due to the filesystem being unable to handle such a large archive. Is there a way for me to create a series of sequential archival files of more manageable size (say ~700MB)?
have a look at rsnapshot http://www.rsnapshot.org/
--
Joe Tseng
"I can be Googled. Therefore I am."
-- 
David Bear
phone: 480-965-8257
fax: 480-965-9189
College of Public Programs/ASU
Wilson Hall 232
Tempe, AZ 85287-0803
"Beware the IP portfolio, everyone will be suspect of trespassing"
participants (9)
-
Andreas Philipp
-
David Bear
-
Greg Freemyer
-
James Knott
-
Jerry Feldman
-
Joe Tseng
-
Ken Schneider
-
Per Jessen
-
Silviu Marin-Caea