I need a Python SFTP client to download files from an SFTP server, and I started with Paramiko. Small files (a few KB) download fine, but when I try to download a 600 MB file, the transfer hangs indefinitely after about 20 MB. I can't figure out what the issue is, and increasing the window size did not solve it either. Any help would be much appreciated!
import paramiko

host = config.getsafe(section, "host")
username = config.getsafe(section, "username")
port = config.getsafe(section, "port")
remote_dir = config.getsafe(section, "remote_dir")
download_dir = config.getsafe(section, "download_dir")
archive_dir = config.getsafe(section, "archive_dir") if config.has_option(section, "archive_dir") else None
password = config.getsafe(section, "password") if config.has_option(section, "password") else None
file_pattern = config.getsafe(section, "file_pattern") if config.has_option(section, "file_pattern") else "*"
passphrase = config.getsafe(section, "passphrase") if config.has_option(section, "passphrase") else None
gnupg_home = config.getsafe(section, "gnupg_home") if config.has_option(section, "gnupg_home") else None

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, port=int(port), username=username, password=password)

sftp = ssh.open_sftp()
# Keep a reference to the SSHClient so it is not garbage-collected mid-transfer.
sftp.sshclient = ssh

sftp.get("/SFTP/PL_DEV/test.dat", "C:/import/download/test.dat")
I did two things to solve a similar problem:
1. Increase the window size – you say you tried that too; for me this got transfers from a few tens of MB up to about half a GB, but no further.
2. Effectively disable rekeying – this may have security implications, but it let me pull files larger than 1 GB from a quirky Windows SFTP server (see the sketch below).
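Here is a minimal sketch of what the rekeying workaround can look like. The rekey thresholds live on Paramiko's Packetizer, which is an internal of the Transport rather than a documented setting, so treat this as a hack; the threshold values are arbitrary "never in practice" numbers:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, port=int(port), username=username, password=password)

# Push the rekey thresholds far out so a key renegotiation is never
# triggered mid-transfer. These attributes are Paramiko internals.
transport = ssh.get_transport()
transport.packetizer.REKEY_BYTES = pow(2, 40)    # rekey only after ~1 TB transferred
transport.packetizer.REKEY_PACKETS = pow(2, 40)  # effectively never rekey by packet count

sftp = ssh.open_sftp()
sftp.get("/SFTP/PL_DEV/test.dat", "C:/import/download/test.dat")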
Increasing default_max_packet_size and default_window_size as follows worked for me:
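A minimal sketch of that approach, assuming you build the Transport yourself instead of going through SSHClient; the window and packet sizes below are illustrative values, not tuned recommendations:

import paramiko

# Larger channel window and packet size; 4**12 is ~16 MB.
window_size = pow(4, 12)
max_packet_size = pow(4, 12)

transport = paramiko.Transport((host, int(port)),
                               default_window_size=window_size,
                               default_max_packet_size=max_packet_size)
transport.connect(username=username, password=password)

sftp = paramiko.SFTPClient.from_transport(transport,
                                          window_size=window_size,
                                          max_packet_size=max_packet_size)
sftp.get("/SFTP/PL_DEV/test.dat", "C:/import/download/test.dat")
sftp.close()
transport.close()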