On Mon, Apr 24, 2000 at 20:48 +0200, Peter Münster wrote:
> Excuse me please! It seems that my little patch was not such a
> good idea! So, thanks to Pavel Kankovsky and to James Antill!
> Peter
But even if "find ... -exec rm" or "find ... | xargs rm" is not correct, "for FILE in `find ...`" isn't either. There *must* be some way to make it work for larger file sets. All I can come up with is something of the form

    ls_or_find | while read FILE; do
        operate_on "$FILE"
    done

but it is still a lengthy process in terms of time. What's The Right Way(TM) to go? I feel this kind of problem must have come up often enough that there are proven solutions.

I guess the above mentioned "operate_on" procedure has to be a little more complex, to ensure that $FILE is still inside the tree the ls/find operation initially started on (eliminating tricks like "some/../path" etc.), and the find operation has to avoid following links.

Is there a "clever" way to first collect the file names and then process them all at once (maybe with a sanity check before and after collection, dropping the results on a mismatch)? I can only think of a list file, which opens up another can of worms :(

virtually yours
82D1 9B9C 01DC 4FB4 D7B4  61BE 3F49 4F77 72DE DA76
Gerhard Sittig
true | mail -s "get gpg key" Gerhard.Sittig@gmx.net
-- 
If you don't understand or are scared by any of the above
ask your parents or an adult to help you.
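[Editor's sketch, for the record: one proven way to harden the pipe above is GNU find's -print0 together with xargs -0 or bash's read -d '', so file names containing whitespace (or even newlines) survive intact. This is only a sketch assuming GNU find/xargs and bash; "operate_on" is a hypothetical stand-in for the real per-file work.]

```shell
#!/bin/bash
# Sketch: process the output of find safely, even for file names
# that contain spaces. Assumes GNU find and GNU xargs (-print0/-0);
# "operate_on" is a hypothetical stand-in for the real per-file job.
set -eu

tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT
touch "$tmp/plain" "$tmp/with space" "$tmp/another one"

operate_on() {
    # Quote "$1" everywhere so the name survives word splitting.
    printf 'seen: %s\n' "$1"
}

# Variant 1: NUL-separated names fed to a while-read loop (bash's
# read -d '' splits on NUL; IFS= and -r keep the name untouched).
out=$(find "$tmp" -type f -print0 |
      while IFS= read -r -d '' FILE; do operate_on "$FILE"; done)
printf '%s\n' "$out"

# Variant 2: batch the names into as few command invocations as
# possible -- much faster than one process per file.
find "$tmp" -type f -print0 | xargs -0 rm -f
```

[Variant 1 keeps the per-file shell-function flexibility of the while-read loop; variant 2 trades that for speed by batching arguments, which addresses the "lengthy process" complaint when the operation is a simple external command like rm.]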