Howto delete files automatically based on a period
Hi all,

I have made a script for backup purposes, so it backs up my folder daily and creates files like this:

-rw-r--r--  1 root root 97751911 Apr 25 17:27 stock-25-Apr-2005_17-25-33.tgz
-rw-r--r--  1 root root 97754710 Apr 26 17:11 stock-26-Apr-2005_17-00-00.tgz

Then after a certain period, like weekly or monthly, I want to delete the oldest file. Now, how can I make a script to do that job? I know crontab can run it, but how can I remove just the oldest file automatically, not the whole set of files or the directory?

regards,

--
Arie Reynaldi Zanahar
reymanx at gmail.com
http://www.reynaldi.or.id
Arie, here is a script I made which I think takes your process to a much deeper level. This script is tried and true; I've been running it on several servers now, one of them for more than four years.
# cat ./bin/salvagesaver
#! /bin/sh
# This job is meant to be run from cron on a frequency of x minutes.
#
# This is a Salvage Facility similar to, but not quite the same as, the one
# found in Novell NetWare.
# Every x minutes it copies all files changed in the last x minutes to a
# file in the salvage directory with the same name and a suffix of
# _SaLvAgECCYYMMDDHHMMSS.
# It then deletes any salvage files older than y number of days.
#
# We figure empty spinning disk is a waste of resources.  Don't run this on
# a system tight for space where frequent changes to massive files are
# being made.
#
# Not exactly like Novell, in that if you simply walk up and delete a ton
# of files they will be lost.  But that's what backup tapes are for.  If
# you are making incremental changes to a file, this will snapshot it every
# x minutes, so that when you finally screw up (and you WILL screw up) you
# need only look at the salvage files and copy the most current one.
#
# By the way, if this thing screws up and deletes your entire directory
# structure, melts your processor, and toasts your chips, don't come crying
# to us.  It must be something _YOU_ did.  We had nothing to do with it.
#
# Set some variables
# PROTECTPATH is the path for which you need salvage protection.
# IT IS NOT RECOMMENDED THAT YOU START FROM ROOT !!!
# IT IS NOT RECOMMENDED THAT IT INCLUDE ANY SYSTEM AREAS (LINUX OS DIRS ETC).
PROTECTPATH=/data/
PROTECTSHORTPATH=/data/qbtimer/
PROTECTSHORTPATH2=/data/Mail/
#
# SALVAGE_ROOT is the path under which the salvaged files are created.
# All directory paths are preserved starting at the level of
# "SALVAGE_ROOT".  Be sure NOT to include a trailing slash character; the
# "%h" variable supplies it in the find statement.
SALVAGE_ROOT=/SalvageSaver
#
#
# CYCLEMINS is the interval between executions via cron and how far back
# we should look for changed files.  It is done this way to prevent
# replicating the entire file system if the "touch file" were missing.
CYCLEMINS=5
# RETAINDAYS is the number of days a salvage file is retained.  Usually
# just long enough to get it onto a backup tape is sufficient.
RETAINDAYS=8
RETAINMINS=720
#
#
# Here we find all non-salvage files modified in the last CYCLEMINS
# minutes and write a file of mkdir/cp commands.
find $PROTECTPATH ! -regex '.*SaLvAgE.*' -type f -mmin -$CYCLEMINS \
  -fprintf ~/SaLvAgE "mkdir\t-p\t\"$SALVAGE_ROOT%h\"\ncp\t\"%p\"\t\"$SALVAGE_ROOT%h/%f_%TY%Tm%Td%TH%TM%TS.SaLvAgE\"\n"
# Make the cp command file executable
chmod 700 ~/SaLvAgE
# Run the cp commands
~/SaLvAgE
# Now delete any salvage files older than RETAINDAYS days
#find $PROTECTPATH -regex '.*SaLvAgE.*' -type f -mtime +$RETAINDAYS -exec rm {} \;
find $SALVAGE_ROOT -regex '.*SaLvAgE.*' -type f -mtime +$RETAINDAYS -exec rm {} \;
#find $PROTECTSHORTPATH -regex '.*SaLvAgE.*' -type f -mmin +$RETAINMINS -exec rm {} \;
#find $PROTECTSHORTPATH2 -regex '.*SaLvAgE.*' -type f -mmin +$RETAINMINS -exec rm {} \;
#done
You will have to re-edit the script to set up the correct paths for your system.
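Since the script is meant to be driven by cron, an entry like the following would go in root's crontab; the install path /root/bin/salvagesaver is an assumption, and the 5-minute step just matches CYCLEMINS=5 above:

# Run salvagesaver every 5 minutes; the script path is an assumption
*/5 * * * * /root/bin/salvagesaver >/dev/null 2>&1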
If you're happy with your existing script for creating the tar files, just take the section of this script that removes old files and append it to your own.
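For instance, bolted onto a daily tar job that produces the stock-*.tgz files from the original post, it would look something like the sketch below; BACKUP_DIR, the source directory, and the 31-day retention are all assumptions:

#! /bin/sh
# Daily backup plus the pruning idea from salvagesaver's tail end.
BACKUP_DIR=/backups            # assumption: wherever the .tgz files live
tar czf "$BACKUP_DIR/stock-`date +%d-%b-%Y_%H-%M-%S`.tgz" /path/to/stock
# Prune archives older than 31 days (the retention period is an assumption)
find "$BACKUP_DIR" -name 'stock-*.tgz' -type f -mtime +31 -exec rm {} \;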
The "regex '.*SaLvAgE.*" sections in the find statements are leftovers
from a previous method I used where I simply created copies of the
salvaged files to the EXACT same path as the original but also made it
hidden with an extension of .SaLvAgE
This method is no longer used but I didn't want to lose the script
portion for search exclusions and found that it didn't make a
difference on my stopwatch for the find command to process.
Good luck.
--
Thanks,
Dan
Registered Linux User #373395
Arie Reynaldi Z wrote:
> Then after a certain period, like weekly or monthly, I want to delete
> the oldest file. Now, how can I make a script to do that job?
In a script, simply have a list of copy instructions, such as:

cp ver9 ver10
cp ver8 ver9
...
cp ver1 ver2
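That generation-rotation idea can also be written as a loop instead of nine literal cp lines; a minimal sh sketch, where ver1..ver10 are just the hypothetical file names from the example above:

#! /bin/sh
# Generation rotation: ver10 (the oldest) is overwritten each cycle,
# leaving ver1 free for the newest backup to be written into afterwards.
i=9
while [ $i -ge 1 ]; do
    next=`expr $i + 1`
    # Only rotate generations that actually exist yet
    [ -f "ver$i" ] && cp "ver$i" "ver$next"
    i=`expr $i - 1`
done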
On Wednesday 27 April 2005 04:27, Arie Reynaldi Z wrote:
> Then after a certain period, like weekly or monthly, I want to delete
> the oldest file. Now, how can I make a script to do that job?
Perhaps something similar to:

find ${BACKUP_DIR:-/tmp/000} -name "*daily*" \
    -mtime +7 -exec /usr/bin/safe-rm {} \;
# Remove the directory when it is empty
find ${BACKUP_DIR:-/tmp/000} -depth -type d -empty -mindepth 1 \
    -exec /usr/bin/safe-rmdir {} \;

--
Richard Bos
Without a home the journey is endless
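Adapted to the stock-*.tgz names from the original post, and with plain rm in place of the safe-rm wrapper Richard uses (safe-rm is not installed everywhere), a minimal sketch; the directory and the 7-day retention are assumptions:

# Delete stock backups older than 7 days; BACKUP_DIR is an assumed location
BACKUP_DIR=/home3/division
find "$BACKUP_DIR" -name "stock-*.tgz" -type f -mtime +7 -exec rm {} \;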
Guys,

This list is the best! I got so much input for my job; I haven't tried it all, but surely I'll do it in a while. Another thing: I just got another Python script from a friend to do the backup-and-erase, and I'd like to share it. :)

import glob
import os

dir = '/home3/division/'
files = glob.glob('%s/*' % dir)
files.sort()
for file in files[:-7]:
    command = 'rm %s' % file
    print command
    os.system(command)

regards, and thanks to you all.. :)

--
Arie Reynaldi Zanahar
reymanx at gmail.com
http://www.reynaldi.or.id
Arie Reynaldi Z wrote:
> import glob
> import os
> dir = '/home3/division/'
> files = glob.glob('%s/*' % dir)
> files.sort()
> for file in files[:-7]:
>     command = 'rm %s' % file
>     print command
>     os.system(command)

Is this a little dangerous? It looks like it will delete everything except the last seven files (in sort order) in the dir - every time you run it!

Surely you want to put a date restriction on it so it won't delete files newer than you intend?

Or maybe delete the oldest files but KEEP at least x of them.

I didn't read any earlier posts on this so maybe I'm barking up the wrong tree :)

Mike
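A sketch of the safer pruning Mike describes (keep at least the KEEP newest files, and of the remainder delete only those past an age cutoff), written in sh to stay consistent with the other examples in the thread; the directory, KEEP=7 and the 7-day cutoff are assumptions, and it also assumes GNU find and filenames without whitespace:

#! /bin/sh
# Keep at least the $KEEP newest files; of the rest, delete only those
# older than 7 days.  BACKUP_DIR, KEEP, and the age limit are assumptions.
BACKUP_DIR=/home3/division
KEEP=7
cd "$BACKUP_DIR" || exit 1
# List newest first, skip the $KEEP newest, test the age of each remainder
ls -t | tail -n +`expr $KEEP + 1` | while read f; do
    find "./$f" -maxdepth 0 -type f -mtime +7 -exec rm {} \;
done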
Mike Dewhirst wrote:
> Is this a little dangerous? It looks like it will delete everything
> except the last seven files (in sort order) in the dir - every time
> you run it!
> Or maybe delete the oldest files but KEEP at least x of them.
This way, I think, I can still manage some of my backup files based on how many times the system has backed up, regardless of when the backups were made. My backup scripts don't always run every day: one runs just on weekdays and another twice a week. But in another case, I will use one of the other suggestions.

regards,

--
Arie Reynaldi Zanahar
reymanx at gmail.com
http://www.reynaldi.or.id
participants (5)
- Arie Reynaldi Z
- Dan Phillips
- James Knott
- Mike Dewhirst
- Richard Bos