On Tuesday 09 July 2002 20:20, henjay wrote:
Hello all,
I'm looking for a script that will do the following.
1: Connect to a FTP site.
2: Connect automatically either using cron or running the script manually (username and password in the script or in a separate file).
3: Download an "X" number of files. (The file names are already known.)
3a: Download an "X" number of files, but this time the file names are not known.
4: Back up the already existing files by copying them to a backup directory. The name of the directory is the name of the file plus the date.
5: Copy the new files from the FTP site to the specified directory.
And last but not least: is there a good site where I can go to learn shell scripting, and also a good book for someone who has never programmed before?
Sorry, but I think the other suggestions are overkill, although I do use wget
a lot. I do the exact same thing at work to update software on various Solaris
and Linux machines. You can use the .netrc file to log in automatically and
execute whatever commands you want. This is all in bash, and it is reliable:
it has been running through cron for about 2 years with no problems.
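For reference, the cron side is just an ordinary crontab entry; the script path and schedule here are made-up examples, not from my setup:

```
# Fetch new files every night at 02:30
30 2 * * * /usr/local/bin/get_vb_class.sh
```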
Below is the complete version. The key is that you define a macro that is
started when the connection is made (macdef init). This version only gets
a single file, but I have other versions that copy multiple files. The
problem with 3a is that you need some way to figure out what the file names
are, such as by using wildcards. The rest of your requirements can be added
at the appropriate locations in the script.
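As a sketch of the .netrc mechanism (the hostname, credentials, and remote path are placeholders; the filename is taken from the archive name in the script), an entry with an init macro could look like this. The macro runs automatically after auto-login, and the blank line is required to end the macro definition. Note that ftp will refuse to use a .netrc containing a password unless the file is readable only by you (chmod 600).

```
machine ftp.example.com
login myuser
password mypass
macdef init
binary
cd /pub/vb_class
get vb_appserver.tar.gz
bye

```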
Regards,
jimmo
#!/bin/bash
# This script grabs the compiled JAVA programs and pulls
# them to this machine where they are extracted.
# Directory where the *class files are (local)
VB_CLASS=/usr/local/appserver/vb_class
# Directory where the *class files are copied to.
VB_SERVER=/usr/local/iob
# Archive directory
VB_ARCHDIR="$VB_SERVER/xfer"
# OLD
VB_ARCHNAME="$VB_ARCHDIR/vb_appserver.tar.gz"
VB_ARCHFILES="de"
# Which machine holds the archive
ARCHSERV=
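The backup step from requirement 4 could be sketched roughly like this; it is only a sketch, and the function name and the example directories are mine, not part of the original script:

```shell
#!/bin/bash
# Sketch of requirement 4: before fetching new files, copy each existing
# file into a backup directory named <filename>.<date>.

backup_existing() {
    # $1 = directory holding the current files
    # $2 = root directory for the dated backups
    local target=$1 backup_root=$2
    local stamp f name
    stamp=$(date +%Y%m%d)
    for f in "$target"/*; do
        [ -f "$f" ] || continue               # nothing to back up
        name=$(basename "$f")
        mkdir -p "$backup_root/$name.$stamp"  # directory named file + date
        cp -p "$f" "$backup_root/$name.$stamp/"
    done
}

# Example (directories are illustrative):
# backup_existing /usr/local/iob /usr/local/iob/xfer
```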