I've written an FTP script that, unfortunately, has to deal with a server behind a firewall. The ISP also cuts off my control connection quite early, no matter what timeout settings I use on either side of the firewall. I've narrowed it down to two choices: 1) spawn a thread at the retrbinary call so that a 'NOOP' is sent every n seconds while the download completes, or 2) spawn a thread at the retrbinary call so that the script compares the local file's size against the remote file's size. Once the sizes match, it attempts to download the next file.
I've included some code below that exemplifies my attempt to have a 'NOOP' sent every n seconds. The problem is that when the script reaches the isAlive while loop and executes ftp.sendcmd('NOOP'), things hang and the loop locks up. The file is still downloading on TCP port 63xxx, but my control connection on port 21 is locked up.
As for checking the file size from a thread during the download: that failed because the local file's size never quite reached the full value; it just sat at the same number at the very end of the download.
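For reference, the size-checking variant looked roughly like this (wait_for_transfer, poll_seconds, and expected_size, which would come from ftp.size() before the transfer starts, are my illustrative names). I suspect the stall happens because the file object buffers its writes, so flushing before each check might be the piece my version was missing:

```python
import os
import time

def wait_for_transfer(f, local_path, expected_size, poll_seconds=5):
    # f is the file object retrbinary is writing to.  Without the flush,
    # the last buffered block sits in memory until f.close(), so
    # os.path.getsize() stalls just short of the full size.
    while True:
        f.flush()
        os.fsync(f.fileno())  # make sure getsize() sees every byte
        if os.path.getsize(local_path) >= expected_size:
            return
        time.sleep(poll_seconds)
```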
The main issue right now is keeping my control connection alive, by any means possible, throughout large downloads. If the control connection is killed for some reason, my script needs a reliable way to know when the file transfer has finished; without a control connection, it has no way of knowing from server responses.
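If I can't keep the control connection alive, the fallback I'm leaning toward is to record the remote size up front (ftplib's FTP.size()) and judge completion from the local file alone. download here is just my sketch of the idea; it swallows the exception retrbinary raises when the control connection dies mid-transfer, since the data connection may still deliver the whole file:

```python
import os

def download(ftp, remote_path, local_path):
    # Ask for the size while the control connection is still alive.
    expected = ftp.size(remote_path)
    f = open(local_path, 'wb')
    try:
        ftp.retrbinary('RETR %s' % remote_path, f.write)
    except Exception:
        # Control connection died; the data connection may still have
        # delivered the whole file, so don't give up yet.
        pass
    f.close()
    # This check needs no control connection at all.
    return os.path.getsize(local_path) == expected
```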
Thanks in advance.
from ftplib import FTP
import threading
import time

def FTP_RETR(ftp_transfer_path, f, start_byte):
    # Runs in its own thread so the main thread can send keepalives.
    ftp.retrbinary('RETR %s' % ftp_transfer_path, f.write, rest='%s' % start_byte)
    f.close()

# FTP_server, FTP_port, FTP_username, FTP_password, and home_dir are
# defined elsewhere in my script.
ftp = FTP()
ftp.connect(FTP_server, FTP_port)
ftp.login(FTP_username, FTP_password)
ftp.cwd(home_dir)

ftp_transfer_path = 'file_to_get.dat'
start_byte = 0
filename = 'C:\\myfile.txt'
f = open(filename, 'wb')

ftp.sendcmd('TYPE I')  # binary mode
transfer_thread = threading.Thread(target=FTP_RETR, args=(ftp_transfer_path, f, start_byte))
transfer_thread.start()

while transfer_thread.isAlive():
    time.sleep(5)
    ftp.sendcmd('NOOP')  # this is where the control connection locks up

print 'Finished'