I am writing a little shell script that needs to go through all folders and files on an FTP server (recursively). So far everything works fine using cURL, but it's pretty slow, because cURL starts a new session for every command. So for 500 directories, cURL performs 500 logins.
Does anybody know whether I can stay logged in using cURL (this would be my favourite solution), or how I can use ftp with only one session in a shell script?
I know how to execute a set of ftp commands and retrieve the response, but for the recursive listing, it has to be a little more dynamic...
Thanks for your help!
The command you are looking for is ncftpls -R. It will recursively list all the files in an FTP folder.
If possible, try using an lftp script:
# lftp script "myscript.lftp"
open your-ftp-host
user username password
cd directory_with_subdirs_u_want_to_list
find
exit
The next thing you need is a bash script that runs this lftp script and writes the output to a file:
#!/bin/bash
lftp -f myscript.lftp > myOutputFile
myOutputFile now contains the full dump of directories.
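The dump can then be post-processed with standard tools. A small sketch, assuming lftp's find prints one path per line with directories ending in a trailing slash, and using a hand-made sample file in place of a real dump:

```shell
# Hypothetical sample standing in for a real lftp "find" dump.
cat > myOutputFile <<'EOF'
./
./docs/
./docs/readme.txt
./data/
./data/a.csv
EOF

# Directories end in "/" in the dump; split it accordingly.
dirs=$(grep -c '/$' myOutputFile)
files=$(grep -vc '/$' myOutputFile)
grep '/$' myOutputFile > directories.txt
echo "dirs: $dirs, files: $files"
```

For the sample above this prints `dirs: 3, files: 2`, and directories.txt holds only the folder paths.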
Just to summarize what others have said so far: if you are trying to write a portable shell script that works as a batch job, you need the lftp solution, since some FTP servers do not implement ls -R. Simply replace 123.456.789.100 with the actual IP address of the FTP server in the following examples:
$ lftp -c "open 123.456.789.100 && find -l && exit" > listing.txt
See the man page of lftp, under the find section:
List files in the directory (current directory by default)
recursively. This can help with servers lacking ls -R support. You
can redirect output of this command.
However, if you have a way to figure out whether or not the remote FTP server properly implements ls -lR, then a much better (= faster) solution is:
$ echo ls -lR | ftp 123.456.789.100 > listing.txt
Just for reference: executing the first command (lftp + find) takes 0m55.384s to retrieve the full listing, while the second one (ftp + ls -lR) takes only 0m3.225s.
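Either way, listing.txt can be post-processed afterwards. A sketch, assuming Unix-style ls -lR output where each directory section starts with a "path:" header line (the sample below is hand-made, not from a real server):

```shell
# Hypothetical sample standing in for a real "ls -lR" dump.
cat > listing.txt <<'EOF'
.:
drwxr-xr-x 2 ftp ftp 4096 Jan 1 12:00 docs
-rw-r--r-- 1 ftp ftp  120 Jan 1 12:00 index.txt

./docs:
-rw-r--r-- 1 ftp ftp  300 Jan 1 12:00 readme.txt
EOF

# Directory headers are the lines ending in ":"; strip the colon to get paths.
sed -n 's/:$//p' listing.txt > dirs.txt
cat dirs.txt
```

For the sample above, dirs.txt ends up containing `.` and `./docs`, which is exactly the list of directories you would recurse into.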
You could connect to the ftp server in a way that lets it accept commands from stdin and write responses to stdout. Create two named pipes ("fifos", see man mkfifo) and redirect stdin and stdout of the ftp command to one of them each. Then you can write commands to the stdin-connected fifo and read the responses (line by line with bash's read, for example) from the stdout fifo. Then use the results to decide where you need to send the next listing command (and print it, or whatever else you want to do).
In short: Not something bash scripting is suitable for :) (Until you find a tool that does what you want by itself of course)
If you just want to create a listing of all files and folders, you can use ssh instead (assuming you also have shell access to the host, not just FTP). Something like this, but check the documentation for correct usage:
$ ssh user@host "ls -R /path"