On 3/31/2012 9:51 PM, John Andersen wrote:
On 3/31/2012 5:03 PM, Brian K. White wrote:
On 3/31/2012 2:06 PM, Andreas wrote:
Hi,
I'd like to let cron clean up a directory that holds hourly backups. So there are 24 new files every day.
I'd need some scripting that sorts the content of this directory and then removes everything but the 5 newest files.
Is there a howto or even sample code that does this?
find /backups -mmin +300 -delete
(find everything in or under /backups whose modify time is older than 300 minutes and delete it)
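As a concrete illustration (a sketch only; the /backups path comes from the line above, while the -type f restriction and the minute-5 schedule are my own assumptions), the same command could be run once an hour from a crontab entry like:

    # prune anything under /backups older than 300 minutes, once an hour
    5 * * * *  find /backups -type f -mmin +300 -delete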
Ouch. Run that tomorrow and Poof! All gone.
Wrong. He's creating new files every hour, so there will always be 5 new files. But you are right that it does depend on the cron job that creates a new file every hour continuing to work.

That's why this shouldn't be its own separate cron job; it should be a command added to the end of the backup job that creates the files, made conditional on the success of the create-file part. If the backup script isn't run, then neither is the find/delete part of it. If the backup script ran but the create-file part failed, then don't run the find/delete part.

The ls -t based approach would be safer in that it would keep the 5 newest files regardless of how old they happen to be.

--
bkw
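For comparison, a count-based cleanup along the lines of that ls -t suggestion might look like this (a rough sketch, not anyone's actual script; backup_one is a hypothetical placeholder for whatever creates the hourly file, and it assumes the backup filenames contain no spaces or newlines):

    #!/bin/sh
    # create the new hourly backup first; skip the cleanup entirely if it fails
    backup_one /backups || exit 1
    cd /backups || exit 1
    # list files newest-first, skip the 5 newest, delete the rest
    ls -1t | tail -n +6 | xargs -r rm -f --

Chaining the prune after the create step with || exit is what makes it conditional on the backup succeeding, which addresses the "run it tomorrow and Poof" concern above.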