Python read file into memory for repeated FTP copy

Published 2019-09-09 22:49

Question:

I need to read a local file and copy it to a remote location with FTP. I copy the same file, file.txt, to the remote location repeatedly, hundreds of times, under different names like f1.txt, f2.txt... f1000.txt. Is it necessary to open, read, and close my local file.txt for every single FTP copy, or is there a way to store the contents in a variable, use that every time, and avoid the file open/close calls? file.txt is a small file of 6 KB. Below is the code I am using:

for i in range(1, 101):
    fname = 'file' + str(i) + '.txt'
    fp = open('file.txt', 'rb')
    ftp.storbinary('STOR ' + fname, fp)
    fp.close()

I tried reading the file into a string variable to replace fp, but ftp.storbinary requires its second argument to have a read() method. Please suggest a better way to avoid the file open/close calls, or let me know if there is no performance improvement to be had at all. I am using Python 2.7.10 on Windows 7.
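
For reference, this is roughly what I tried; it fails because a plain string has no read() method (the names mirror my code above):

data = open('file.txt', 'rb').read()

for i in range(1, 101):
    fname = 'file' + str(i) + '.txt'
    ftp.storbinary('STOR ' + fname, data)  # AttributeError: 'str' object has no attribute 'read'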

Answer 1:

Simply open it before the loop, and close it after:

fp = open('file.txt', 'rb')

for i in range(1, 101):
    fname = 'file' + str(i) + '.txt'
    fp.seek(0)
    ftp.storbinary('STOR ' + fname, fp)

fp.close()

Update: Make sure you add fp.seek(0) before the call to ftp.storbinary; otherwise the read call will exhaust the file in the first iteration, as noted by @eryksun.
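
To illustrate with a minimal sketch (no FTP involved): once read() has consumed the file, every further read() returns nothing until the position is rewound.

fp = open('file.txt', 'rb')
print(len(fp.read()))  # the full 6 KB on the first read
print(len(fp.read()))  # 0 -- the position is now at end-of-file
fp.seek(0)             # rewind to the start
print(len(fp.read()))  # the full content again
fp.close()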

Update 2: Depending on the size of the file, it will probably be faster to use BytesIO. This way the file content is kept in memory but is still a file-like object (i.e. it has a read method).

from io import BytesIO

with open('file.txt', 'rb') as f:
    output = BytesIO(f.read())  # copy the file's contents into an in-memory buffer

for i in range(1, 101):
    fname = 'file' + str(i) + '.txt'
    output.seek(0)  # rewind the buffer before each upload
    ftp.storbinary('STOR ' + fname, output)
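
Putting it together, a complete sketch of the upload loop (the host, credentials, and local file name are placeholders to adapt):

from ftplib import FTP
from io import BytesIO

ftp = FTP('ftp.example.com')   # placeholder host
ftp.login('user', 'password')  # placeholder credentials

# Read the local file into an in-memory buffer once.
with open('file.txt', 'rb') as f:
    output = BytesIO(f.read())

for i in range(1, 101):
    fname = 'file' + str(i) + '.txt'
    output.seek(0)  # rewind the buffer before each upload
    ftp.storbinary('STOR ' + fname, output)

output.close()
ftp.quit()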