I have a ton of scripts that use the following function:
# Copies files over using FTP.
# Configuration is set at the beginning of the script.
# @param $1 = FTP host
# $2 = FTP user
# $3 = FTP user password
# $4 = source file name
# $5 = destination directory
# $6 = local directory
doftp() {
    log_message_file "INFO" "Starting FTP"
    ftp_hst=$1
    ftp_usr=$2
    ftp_pwd=$3
    sourcefile=$4
    destdir=$5
    locdir=$6
    ftp -nv "$ftp_hst" << EOF 2> ftp.err.$$
quote USER $ftp_usr
quote PASS $ftp_pwd
cd $destdir
lcd $locdir
bin
put $sourcefile
bye
EOF
    if [ "$(wc ftp.err.$$ | cut -d" " -f8)" != 0 ] ; then
        log_message_file "ERROR" "Problem uploading files: $(cat ftp.err.$$)"
    else
        log_message_file "INFO" "FTP finished"
    fi
    rm ftp.err.$$
}
It works and does its job, unless the ftp fails. Fortunately for me, the scripts are quite precise and the FTP almost never fails. But this is one of those rare moments when one has the chance (time) to go back and review code marked in the TODO list. The only problem is that I'm not too sure how to improve it... I'd welcome your recommendations on what to change there.
One obvious problem is the error parsing of the ftp output, which is totally lame. But I'll also take thoughts on other parts of the function :)
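One direction I've been toying with (a sketch only, assuming the full transcript is captured to a file with something like `ftp -nv "$ftp_hst" > ftp.log.$$ 2>&1 << EOF`): classic ftp prints numeric reply codes in verbose mode, so the log can be scanned for 4xx/5xx (transient/permanent failure) replies instead of word-counting stderr. `scan_ftp_log` is a hypothetical helper name, not something from my scripts:

```shell
# Hypothetical helper: succeed (exit 0) when any FTP failure reply code
# (4xx transient, 5xx permanent) starts a line of the transcript.
scan_ftp_log() {
    grep -q '^[45][0-9][0-9][ -]' "$1"
}

# Simulated transcript so the example runs without a live FTP server
log="ftp.log.$$"
printf '230 Login successful.\n550 Failed to change directory.\n' > "$log"

if scan_ftp_log "$log"; then
    echo "FTP reported an error"
fi
rm -f "$log"
```

That would also let the log message quote the exact failing reply line rather than the whole stderr dump.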
Probably worth mentioning that this runs on an AIX server. Oh, and no, I cannot use SFTP :(
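Since the restriction is on the protocol rather than the client, one thing I've wondered about (assuming curl is even installed on that AIX box, which it often isn't by default): curl speaks plain FTP and, unlike the classic ftp client, exits nonzero when the transfer fails, so no stderr parsing would be needed. `ftp_put_curl` is a made-up wrapper name for illustration:

```shell
# Hypothetical wrapper around curl's plain-FTP upload support.
ftp_put_curl() {
    # $1 host, $2 user, $3 password, $4 local file, $5 remote directory
    curl -sS --max-time 10 -u "$2:$3" -T "$4" "ftp://$1/$5/"
}

# Demonstrate the failure path against an unreachable port:
ftp_put_curl 127.0.0.1:1 user pass /etc/hosts upload 2>/dev/null
rc=$?
if [ "$rc" -ne 0 ]; then
    echo "upload failed, curl exit status $rc"
fi
```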
Thanks for any input!
ps.: log_message_file is just basic logging... it has no effect on the function.