On Mon, 28 Nov 2005, david rankin wrote:
----- Original Message ----- From: "Ken Schneider" <suse-list@bout-tyme.net>
the HTML title gets used as the filename. I need a way to search the server and identify the files whose names are too long so I can change them without having to search manually.
Does anybody know of a tool that will do this?
I have used "growisofs -dry-run ..." to find them one by one, but this is quite cumbersome.
It's basically called an email informing them that any files saved in that manner cannot be backed up, due to the limitations of the backup software, and will not be available for restore should they get deleted or corrupted. This will force them to start using names that are more acceptable. Teach them the correct way to name files rather than trying to babysit the way they do things now by changing the file names for them. They are, after all, grownups, are they not?
Amen Ken,
Already done. The only problem is how to easily find the recent additions that are causing the problems. The kicker, for now, is that the screwed-up file names don't just bomb the backup on those files; they kill the whole backup. So now I have to find the offending files and rename or delete them. If it happens again, the rm -f option will be employed.
I was thinking about a BASH script that would get a listing of the filenames, parse them for length, etc., and then output the list of long files to a text file. I was hoping (praying) that somebody may have run across this issue before and may have an old script lying around. I'll keep pecking around...
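Something along these lines might do as a starting point. This is only a rough sketch, assuming GNU find (for -printf); SEARCH_DIR, MAXLEN, and the output file name are placeholders to adjust for your server:

```shell
#!/bin/bash
# Sketch: list files/directories whose basename is longer than MAXLEN
# characters, writing "length<TAB>path" lines to long-files.txt.
SEARCH_DIR="${1:-.}"   # directory to scan (assumption: first argument)
MAXLEN=50              # assumption: tune to your backup software's limit

# %f = basename, %p = full path (GNU find)
find "$SEARCH_DIR" -printf '%f\t%p\n' |
while IFS=$'\t' read -r name path; do
    # ${#name} is the character length of the basename
    if [ "${#name}" -gt "$MAXLEN" ]; then
        printf '%d\t%s\n' "${#name}" "$path"
    fi
done > long-files.txt
```

Sorting long-files.txt numerically (sort -rn) would put the worst offenders first.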
This should find files and directories with a name component of at least 50 characters. Change the regex and other details to suit:

    find ~bozo | egrep '/[^/]{50,}'

You could also concoct a regex looking for unsuitable characters (note that they're not illegal to Linux).
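For the unsuitable-characters angle, find's own -name glob can do it without a pipe. A sketch, where the "safe" character class is an assumption to widen or narrow for whatever your backup software tolerates:

```shell
# Sketch: print any file or directory whose name contains a character
# outside a conservative safe set (letters, digits, dot, underscore,
# hyphen). The safe set is an assumption; tune it to your backup tool.
find ~bozo -name '*[!A-Za-z0-9._-]*'
```

The [!...] negated class is standard fnmatch syntax, so this works with plain POSIX find as well.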