Arie, here is a script I made which I think takes your process to a
much more in-depth level. This script is tried and true; I've been
running it on several servers now, one of them for more than four
years.
# cat ./bin/salvagesaver
#! /bin/sh
# This job is meant to be run from cron on a frequency of x minutes.
#
# This is a Salvage Facility similar to, but not quite the same as,
# the one found in Novell NetWare.
# Every x minutes it copies all files changed in the last x minutes
# to a file in the salvage directory with the same name and a suffix
# of _CCYYMMDDHHMMSS.SaLvAgE.
# It then deletes any salvage files older than y number of days.
#
# We figure empty spinning disk is a waste of resources. Don't run
# this on a system tight for space where frequent changes to massive
# files are being made.
#
# Not exactly like Novell, in that if you simply walk up and delete
# a ton of files they will be lost. But that's what backup tapes are
# for. If you are making incremental changes to a file this will
# snapshot it every x minutes, so that when you finally screw up
# (and you WILL screw up) you need only look at the salvage files
# and copy the most current one.
#
# By the way, if this thing screws up and deletes your entire
# directory structure, melts your processor, and toasts your chips,
# don't come crying to us. It must be something _YOU_ did. We had
# nothing to do with it.
#
# Set some variables
# PROTECTPATH is the path for which you need salvage protection.
# IT IS NOT RECOMMENDED THAT YOU START FROM ROOT !!!
# IT IS NOT RECOMMENDED THAT IT INCLUDE ANY SYSTEM AREAS
# (LINUX OS DIRS, ETC).
PROTECTPATH=/data/
PROTECTSHORTPATH=/data/qbtimer/
PROTECTSHORTPATH2=/data/Mail/
#
# SALVAGE_ROOT is the path under which the salvaged files are
# created. All directory paths are preserved starting at the level
# of SALVAGE_ROOT. Be sure NOT to include a trailing slash
# character; the "%h" variable supplies it in the find statement.
SALVAGE_ROOT=/SalvageSaver
#
#
# CYCLEMINS is the interval between executions via cron and how far
# back we should look for changed files. This is done this way to
# prevent replicating the entire file system if the "touch file"
# were missing.
CYCLEMINS=5
# RETAINDAYS is the number of days a salvage file is retained.
# Usually just long enough to get it on a backup tape is sufficient.
RETAINDAYS=8
RETAINMINS=720
#
#
# Here we find all non-salvage files modified in the last CYCLEMINS
# minutes and make a file of cp commands
find "$PROTECTPATH" ! -regex '.*SaLvAgE.*' -type f -mmin -$CYCLEMINS \
  -fprintf ~/SaLvAgE "mkdir\t-p\t\"$SALVAGE_ROOT%h\"\ncp\t\"%p\"\t\"$SALVAGE_ROOT%h/%f_%TY%Tm%Td%TH%TM%TS.SaLvAgE\"\n"
# Make the cp command file executable
chmod 700 ~/SaLvAgE
# Run the cp commands
~/SaLvAgE
# now delete any salvage files older than retain days
#find $PROTECTPATH -regex '.*SaLvAgE.*' -type f -mtime +$RETAINDAYS -exec rm {} \;
find "$SALVAGE_ROOT" -regex '.*SaLvAgE.*' -type f -mtime +$RETAINDAYS -exec rm {} \;
#find $PROTECTSHORTPATH -regex '.*SaLvAgE.*' -type f -mmin +$RETAINMINS -exec rm {} \;
#find $PROTECTSHORTPATH2 -regex '.*SaLvAgE.*' -type f -mmin +$RETAINMINS -exec rm {} \;
#done
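For reference, here is what the cron side might look like with
CYCLEMINS=5. The install path ~/bin/salvagesaver is an assumption
taken from the `cat` line above; adjust to wherever you put the
script.

```shell
# crontab entry (Vixie cron step syntax): run the salvage pass
# every 5 minutes, matching CYCLEMINS=5 in the script
*/5 * * * * $HOME/bin/salvagesaver
```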
You will have to re-edit the script to set up the correct paths, as
well as re-join any lines this email's word wrap has broken.
If you're happy with your existing script for creating the tar files,
just modify this script and append the section for removing old files
to your own.
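Since your original question was about removing just the oldest file
(not everything past a cutoff, which is what the -mtime section
does), here is a minimal sketch of that. The stock-*.tgz names are
made up to match your listing, and it assumes the filenames contain
no spaces or newlines:

```shell
#!/bin/sh
# build a scratch directory with three dated "backups" (demo only)
dir=$(mktemp -d)
touch -t 202301011200 "$dir/stock-01-Jan-2023.tgz"   # oldest
touch -t 202302011200 "$dir/stock-01-Feb-2023.tgz"
touch -t 202303011200 "$dir/stock-01-Mar-2023.tgz"   # newest
# ls -1t sorts newest first, so the last line is the oldest file
oldest=$(ls -1t "$dir"/stock-*.tgz | tail -n 1)
rm -f "$oldest"
```

Run from cron weekly or monthly, this trims exactly one file per
invocation.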
The "-regex '.*SaLvAgE.*'" sections in the find statements are
leftovers from a previous method I used, where I simply created
copies of the salvaged files at the EXACT same path as the original,
hidden, with an extension of .SaLvAgE. That method is no longer used,
but I didn't want to lose the search-exclusion portion of the script,
and on my stopwatch it made no measurable difference to the find
command's processing time.
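As a quick illustration of how that exclusion behaves: GNU find's
-regex tests the whole path, so the filter simply skips anything with
SaLvAgE anywhere in its name. The file names below are made up for
the demo:

```shell
#!/bin/sh
dir=$(mktemp -d)
touch "$dir/report.txt"
touch "$dir/report.txt_20050426170000.SaLvAgE"
# ! -regex matches against the full path, so only the
# non-salvage file is listed
find "$dir" ! -regex '.*SaLvAgE.*' -type f
```

Note that -regex is a GNU find extension; the emacs-style default
regex syntax is assumed here.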
Good luck.
On 4/26/05, Arie Reynaldi Z wrote:
Hi all,
I have made a script for backup purposes, so it backs up my folder
daily and creates files like this:
-rw-r--r-- 1 root root 97751911 Apr 25 17:27 stock-25-Apr-2005_17-25-33.tgz
-rw-r--r-- 1 root root 97754710 Apr 26 17:11 stock-26-Apr-2005_17-00-00.tgz
Then after a certain period, like weekly or monthly, I want to delete
the oldest file. Now, how can I make a script to do that job? I know
crontab can run it, but how can I remove just the oldest file
automatically, not all the files or the whole directory?
regards,
--
Arie Reynaldi Zanahar
reymanx at gmail.com
http://www.reynaldi.or.id
--
Check the headers for your unsubscription address
For additional commands send e-mail to suse-linux-e-help@suse.com
Also check the archives at http://lists.suse.com
Please read the FAQs: suse-linux-e-faq@suse.com
--
Thanks, Dan
Registered Linux User #373395