Mates,

I manage my own server with 10 clients. The 'not so intelligent' folks I work with have saved file names that cannot be backed up with growisofs, etc., because the file names are too long and violate the Joliet and Rock Ridge conventions. This usually happens when they save web pages and the HTML title gets used as the filename. I need a way to search the server and identify the files that are too long so I can change them without having to search manually.

Does anybody know of a tool that will do this?

I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.

--
David C. Rankin, J.D., P.E.
RANKIN LAW FIRM, PLLC
510 Ochiltree Street
Nacogdoches, Texas 75961
(936) 715-9333
(936) 715-9339 fax
www.rankinlawfirm.com
On Mon, 2005-11-28 at 14:05 -0600, david rankin wrote:
Mates,
I manage my own server with 10 clients. The 'not so intelligent' folks I work with have saved file names that cannot be backed up with growisofs, etc., because the file names are too long and violate the Joliet and Rock Ridge conventions. This usually happens when they save web pages and the HTML title gets used as the filename. I need a way to search the server and identify the files that are too long so I can change them without having to search manually.
Does anybody know of a tool that will do this?
I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.
It's basically called an email informing them that any files saved in that manner cannot be backed up due to the limitations of the backup software and will not be available for restore should they get deleted or corrupted. This will force them to start using names that are more acceptable. Teach them the correct way to name files rather than trying to babysit the way they do things now by changing the file names for them. They are, after all, grownups, are they not?

--
Ken Schneider
UNIX since 1989, linux since 1994, SuSE since 1998
----- Original Message ----- From: "Ken Schneider" <suse-list@bout-tyme.net>
the HTML title gets used as the filename. I need a way to search the server and identify the files that are too long so I can change them without having to search manually.
Does anybody know of a tool that will do this?
I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.
It's basically called an email informing them that any files saved in that manner cannot be backed up due to the limitations of the backup software and will not be available for restore should they get deleted or corrupted. This will force them to start using names that are more acceptable. Teach them the correct way to name files rather than trying to babysit the way they do things now by changing the file names for them. They are, after all, grownups, are they not?
Amen Ken,

Already done. The only problem is how do I easily find the recent additions that are causing the problems. The kicker, for now, is that the screwed-up file names don't just bomb the backup on those files, they kill the whole backup. So now I have to find the offending files and rename or delete them. If it happens again, the rm -f option will be employed.

I was thinking about a BASH script that would get a listing of the filenames, parse them for length, etc., and then output the list of long files to a text file. I was hoping (praying) that somebody may have run across this issue before and may have an old script lying around. I'll keep pecking around...

--
David C. Rankin, J.D., P.E.
RANKIN LAW FIRM, PLLC
510 Ochiltree Street
Nacogdoches, Texas 75961
(936) 715-9333
(936) 715-9339 fax
www.rankinlawfirm.com
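[A minimal sketch of the kind of script described above; the /srv/clients root, the 64-character cutoff (the Joliet per-name limit), and the output file name are assumptions to adjust:

#!/bin/bash
# long-names.sh - list paths whose final component exceeds a length limit
ROOT="${1:-/srv/clients}"   # tree to scan (hypothetical default)
LIMIT="${2:-64}"            # Joliet allows 64 characters per name
OUT="long-names.txt"

find "$ROOT" | while IFS= read -r path; do
    name=${path##*/}                        # basename of the current path
    if [ "${#name}" -gt "$LIMIT" ]; then
        printf '%d\t%s\n' "${#name}" "$path"
    fi
done > "$OUT"
echo "names longer than $LIMIT characters written to $OUT"

Review the resulting text file, then rename or delete the entries by hand.]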
On Mon, 28 Nov 2005, david rankin wrote:
----- Original Message ----- From: "Ken Schneider" <suse-list@bout-tyme.net>
the HTML title gets used as the filename. I need a way to search the server and identify the files that are too long so I can change them without having to search manually.
Does anybody know of a tool that will do this?
I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.
It's basically called an email informing them that any files saved in that manner cannot be backed up due to the limitations of the backup software and will not be available for restore should they get deleted or corrupted. This will force them to start using names that are more acceptable. Teach them the correct way to name files rather than trying to babysit the way they do things now by changing the file names for them. They are, after all, grownups, are they not?
Amen Ken,
Already done. The only problem is how do I easily find the recent additions that are causing the problems. The kicker, for now, is that the screwed-up file names don't just bomb the backup on those files, they kill the whole backup. So now I have to find the offending files and rename or delete them. If it happens again, the rm -f option will be employed.
I was thinking about a BASH script that would get a listing of the filenames, parse them for length, etc., and then output the list of long files to a text file. I was hoping (praying) that somebody may have run across this issue before and may have an old script lying around. I'll keep pecking around...
This should find files and directories with a name component of at least 50 characters; change the regex and other details to suit:

find ~bozo | egrep '/[^/]{50,}'

You could also concoct a regex looking for unsuitable characters (note that they're not illegal to Linux).
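[A variant on the same idea that also shows how long each offending name is; it assumes GNU find and awk, and the 64-character cutoff (the Joliet limit) is only a guess at what the backup is choking on:

find ~bozo | awk -F/ 'length($NF) > 64 { printf "%d\t%s\n", length($NF), $0 }'

Here $NF is the last path component, so only the filename itself is measured, not the whole path.]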
On Mon, 28 Nov 2005, david rankin wrote:
Mates,
I manage my own server with 10 clients. The 'not so intelligent' folks I work with have saved file names that cannot be backed up with growisofs, etc., because the file names are too long and violate the Joliet and Rock Ridge conventions. This usually happens when they save web pages and the HTML title gets used as the filename. I need a way to search the server and identify the files that are too long so I can change them without having to search manually.
Does anybody know of a tool that will do this?
I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.
growisofs uses mkisofs; you could use mkisofs directly and not need to insert a DVD. You could run mkisofs on each user's ~ and maybe find several offenders in one pass. If these files occur in web browsers' cache directories, you could exclude those directories: nobody's going to want those restored.

A little more painfully, you could use tar or cpio to create an archive and burn that to DVD. You can actually burn a tarball to CD or DVD, just as you would an ISO image. You can't mount it; you read the device node directly to retrieve its contents.

I personally would look at making a tarball for each user, and an ISO image containing those. Retrieving individual files is a little more complicated than just popping the disk into the drive and reading it, but it has advantages, including overcoming your filename problem and (maybe) employing compression.

You could also contemplate a backup server, where you copy users' data regularly with rsync. Check the rsync home page for Handy Hints.
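[A minimal sketch of that tarball-plus-ISO approach; the user names, the /tmp/backup staging area, and /dev/dvd as the burner are all assumptions, so adjust to your setup:

# one compressed tarball per user; the long names live inside the tarballs,
# so the Joliet/Rock Ridge limits no longer apply to them
mkdir -p /tmp/backup
for u in alice bob carol; do          # hypothetical user names
    tar czf "/tmp/backup/$u.tar.gz" -C /home "$u"
done

# wrap the tarballs in an ISO image and burn it
mkisofs -R -J -V "backup-$(date +%Y%m%d)" -o /tmp/backup.iso /tmp/backup
growisofs -dvd-compat -Z /dev/dvd=/tmp/backup.iso

# or keep a rolling copy on a backup host with rsync (hypothetical host name)
# rsync -a /home/ backuphost:/srv/backups/home/

Restoring a single file then means mounting the DVD and extracting it from the right tarball with tar.]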
participants (3):
- david rankin
- John Summerfield
- Ken Schneider