Only deleting files over a certain size?
Hi there guys! Is there a way to only delete files in a directory over a certain size? I was hoping to find some way to do it with rm, but I haven't had much luck finding an option to do so yet. Thanks! Shaun Qualheim
Shaun Q wrote:
Is there a way to only delete files in a directory over a certain size? I was hoping to find some way to do it with rm, but I haven't had much luck finding an option to do so yet.
Run e.g. "find <dir> -size +1M -ls" to check what's being deleted.

Then: find <dir> -size +1M | xargs rm

/Per Jessen, Zürich (2.31 °C)
--
http://www.spamchek.com/ - managed anti-spam and anti-virus solution. Let us analyse your spam- and virus-threat - up to 2 months for free.
Hi, On Thursday 19 January 2006 20:19, Per Jessen wrote:
Run e.g. "find <dir> -size +1M -ls" to check what's being deleted.
Then: find <dir> -size +1M | xargs rm
For newer versions of find it's even easier:

find <dir> -size +1M -delete

Christopher
--
Christopher Hofmann | Tel. 0911/74053 -104 | SuSE R&D - Internal Tools
Current weather in Nuernberg: 0.5°C, Rain
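To make the check-then-delete workflow concrete, here is a small sketch in a throwaway directory (the file names are invented for the demo; -delete needs a reasonably recent GNU find):

```shell
# make a scratch directory with one large and one small file
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big.bin" bs=1M count=2 2>/dev/null    # 2 MiB
dd if=/dev/zero of="$tmp/small.bin" bs=1K count=4 2>/dev/null  # 4 KiB

# dry run first: list what would be removed
find "$tmp" -type f -size +1M -ls

# then actually remove files larger than 1 MiB
find "$tmp" -type f -size +1M -delete
```

After this, only small.bin is left; big.bin matched +1M and was deleted.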
On Thursday, 19 January 2006 20:10, Shaun Q wrote:
Hi there guys!
Is there a way to only delete files in a directory over a certain size? I was hoping to find some way to do it with rm, but I haven't had much luck finding an option to do so yet.
find /path/to/directory -size +1k -exec rm {} \;

finds all files bigger than one KiB in /path/to/directory and executes the command on them, which in the example is rm.

Other size indicators: b is blocks of 512 bytes, c is bytes (characters), w is words (two bytes).

bye, MH
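A quick sketch of the size suffixes in action (note that find rounds sizes up to whole units, so -size +1k means "more than one whole kibibyte", which is why a 600-byte file does not match):

```shell
tmp=$(mktemp -d)
head -c 600  /dev/zero > "$tmp/f600"    # 600 bytes
head -c 3000 /dev/zero > "$tmp/f3000"   # 3000 bytes

# +1k: bigger than 1 KiB -> matches only f3000
# (600 bytes rounds up to exactly one 1k unit, so it is not "+1")
find "$tmp" -type f -size +1k

# +512c: bigger than 512 bytes, counted exactly -> matches both files
find "$tmp" -type f -size +512c
```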
On Thu, 19 Jan 2006, Mathias Homann wrote:
On Thursday, 19 January 2006 20:10, Shaun Q wrote:
Hi there guys!
Is there a way to only delete files in a directory over a certain size? I was hoping to find some way to do it with rm, but I haven't had much luck finding an option to do so yet.
find /path/to/directory -size +1k -exec rm {} \;
finds all files bigger than one kbyte in /path/to/directory and executes the command on them, which in the example is rm.
other size indicators: b is blocks of 512 bytes, c is bytes (characters), w is words (two bytes).
Please let this be taken as constructive criticism.
Of the three solutions posted, this one is the safest.
One of the other two (Per Jessen's) did not handle filenames that
contain spaces. Instead of using:
find <dir> -size +1M | xargs rm
use:
find <dir> -size +1M -print0 | xargs -0 rm
And the other solution which used a script also doesn't properly deal
with spaces, and is far too long besides. ;-)
The only real difference between using -exec (my favored method) and
xargs is that with xargs you can have multiple files deleted per 'rm'
invocation - if you are deleting /lots/ of files this can be an issue.
find is almost unbelievably powerful and is unfortunately one of the
most underutilized commands around.
--
Carpe diem - Seize the day.
Carp in denim - There's a fish in my pants!
Jon Nelson
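To see why the NUL-terminated variant matters, a small sketch with a deliberately awkward filename (the name is invented for the demo):

```shell
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/a noisy name.log" bs=1M count=2 2>/dev/null

# plain 'find | xargs rm' would split the name at the spaces and fail;
# -print0 / -0 pass NUL-terminated names through intact
find "$tmp" -type f -size +1M -print0 | xargs -0 rm
```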
Jon Nelson wrote:
Please let this be taken as constructive criticism. Of the three solutions posted, this one is the safest. One of the other two (Per Jessen's) did not handle filenames that contain spaces. Instead of using:
Very true, thanks. I guess I've made a habit of not using spaces in filenames :-)
The only real difference between using -exec (my favored method) and xargs is that with xargs you can have multiple files deleted per 'rm' invocation - if you are deleting /lots/ of files this can be an issue.
Yes, you could be running out of space on the command line - 32K is the bash default I think. I started out using -exec some years back, but quickly switched to xargs or piping into 'while read f; do ... ; done'.

/Per Jessen, Zürich (0.93 °C)
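As an aside not raised in the thread: newer find implementations can also batch by themselves with '-exec ... {} +', which packs as many names as fit into each rm invocation, much like xargs. A sketch:

```shell
tmp=$(mktemp -d)
for i in 1 2 3; do
    dd if=/dev/zero of="$tmp/f$i" bs=1M count=2 2>/dev/null
done

# '{} +' appends many names to a single rm command line,
# staying within the system's argument-length limit
find "$tmp" -type f -size +1M -exec rm {} +
```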
On 2006-01-20 09:21:21 +0100, Per Jessen wrote:
Yes, you could be running out of space on the command-line - 32K is the bash default I think. I started out using -exec some years back, but quickly switched to xargs or piping into 'while read f; do .... ; done'.
No, you can't - xargs takes care that the command line won't be too long.

darix
Marcus Rueckert wrote:
On 2006-01-20 09:21:21 +0100, Per Jessen wrote:
Yes, you could be running out of space on the command-line - 32K is the bash default I think. I started out using -exec some years back, but quickly switched to xargs or piping into 'while read f; do .... ; done'.
No, you can't - xargs takes care that the command line won't be too long.
Not really. Many programs (including grep) have a command-line length limitation which is lower than usual. xargs can't know that and will regularly make grep barf if you use it to the extreme.

Regards, Carl-Daniel
--
http://www.hailfinger.org/
On 2006-01-20 12:40:35 +0100, Carl-Daniel Hailfinger wrote:
Not really. Many programs (including grep) have a commandline length limitation which is lower than usual. xargs can't know that and will regularly make grep barf if you use it to the extreme.
man xargs:

[[[
--max-chars=max-chars, -s max-chars
    Use at most max-chars characters per command line, including the command and initial-arguments and the terminating nulls at the ends of the argument strings. The default is 131072 characters, not including the size of the environment variables (which are provided for separately so that it doesn't matter if your environment variables take up more than 131072 bytes). The operating system places limits on the values that you can usefully specify, and if you exceed these a warning message is printed and the value actually used is set to the appropriate upper or lower limit.
]]]

darix
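The splitting is easy to observe by forcing an artificially small limit with -s (a sketch; 8 characters is just enough for 'echo' plus one single-letter argument, so xargs must run echo once per input word):

```shell
# with -s 8, only one single-letter argument fits per 'echo' invocation,
# so xargs runs echo four times instead of once
printf 'a\nb\nc\nd\n' | xargs -s 8 echo
```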
Hi, On Thu, 19 Jan 2006, Shaun Q wrote:
Is there a way to only delete files in a directory over a certain size? I was hoping to find some way to do it with rm, but I haven't had much luck finding an option to do so yet.
You have asked at the wrong place, but:

find -type f -printf "%s %p\n" | \
( while read a b
  do
    if [ $a -gt 1000000 ]; then
      echo $a $b
      rm "$b"
    fi
  done )

Cheers -e
--
Eberhard Moenkeberg (emoenke@gwdg.de, em@kki.org)
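For reference, a read loop like the one above can be made safe for names containing spaces by clearing IFS and using read -r (a sketch; it still cannot cope with newlines embedded in filenames - for that you would need -print0):

```shell
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big file.bin" bs=1M count=2 2>/dev/null
: > "$tmp/small.txt"

# IFS= keeps leading/trailing blanks in the name,
# -r keeps backslashes literal
find "$tmp" -type f -size +1000000c | while IFS= read -r f; do
    rm "$f"
done
```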
participants (8)
- Carl-Daniel Hailfinger
- Christopher Hofmann
- Eberhard Moenkeberg
- Jon Nelson
- Marcus Rueckert
- Mathias Homann
- Per Jessen
- Shaun Q