I need to constantly send files automatically over FTP. Originally I just had a script that fed commands to ftp's stdin, but I realized that each time I do this, the connection between server and client is closed and has to be reopened. Reconnecting takes more time than actually sending the file. I've tried to avoid this by making a separate script to open the connection and then running a script in a loop to send the file.
The problem: as soon as the connection script finishes, ftp disconnects. Here is the connection script:
#!/bin/bash
HOST='192.168.1.2'
USER='user'
PASSWD='passwd'
echo "open $HOST
user $USER $PASSWD
ascii" > /tmp/ftp.$$
ftp -ivn < /tmp/ftp.$$ >> ftplog.txt
rm /tmp/ftp.$$
and this is the script to send the file.
#!/bin/bash
echo "put localfile.txt remotefile.txt" > /tmp/ftp.$$
ftp -ivn < /tmp/ftp.$$ >> ftplog.txt
rm /tmp/ftp.$$
The connection script logs in fine, but the session closes again as soon as the script ends (ftp exits when its stdin reaches end-of-file, so each invocation is a fresh session). Is there any way to avoid this?
I should clarify that I am not uploading a list of files, but a single file that is updated by another script and must be re-sent after every update. This one file is sent over and over, as close to real time as possible.
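One way to get a single long-lived session is to run one ftp process for the whole run and feed it through a named pipe, so its stdin never hits end-of-file between uploads. A minimal sketch, reusing the HOST/USER/PASSWD values from the question (the pipe path and the shutdown lines are assumptions):

#!/bin/bash
# Sketch: one long-lived ftp process fed through a named pipe, so the
# control connection is opened once and reused for every upload.
HOST='192.168.1.2'
USER='user'
PASSWD='passwd'
PIPE=/tmp/ftp_pipe.$$

mkfifo "$PIPE"

# Keep fd 3 open on the pipe for the life of this script; as long as
# a writer exists, ftp never sees end-of-file and never exits.
exec 3<> "$PIPE"

# One ftp process for the whole run, reading commands from the pipe.
ftp -ivn < "$PIPE" >> ftplog.txt &

# Log in once.
printf 'open %s\nuser %s %s\nascii\n' "$HOST" "$USER" "$PASSWD" >&3

# Each time the file is regenerated, write one more command, e.g.:
#   printf 'put localfile.txt remotefile.txt\n' >&3
# To shut down cleanly, close the writer so ftp sees EOF:
#   printf 'bye\n' >&3; exec 3>&-; rm "$PIPE"

The sending script then becomes a single printf to fd 3 (or to the pipe) instead of a new ftp invocation, so no reconnection happens per file.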
You can upload all the files in a folder with the following script.
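A minimal sketch of that approach, reusing the HOST/USER/PASSWD values from the question (the local folder path is an assumption); note the -i flag suppresses the per-file prompt that mput would otherwise show:

#!/bin/bash
# Sketch: upload every file in a local folder within a single ftp session.
HOST='192.168.1.2'
USER='user'
PASSWD='passwd'
LOCALDIR=/tmp/outgoing   # assumed path to the folder to upload

CMDS=/tmp/ftp.$$
printf 'open %s\nuser %s %s\nascii\nlcd %s\nmput *\nbye\n' \
    "$HOST" "$USER" "$PASSWD" "$LOCALDIR" > "$CMDS"

ftp -ivn < "$CMDS" >> ftplog.txt
rm "$CMDS"

Note this still opens one connection per run; given the clarification above (a single file re-sent repeatedly), it does not by itself avoid the reconnection cost per update.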