I'm trying to create a list of all files (and their sizes) in a directory, including everything within the sub-directories.
The files are on a remote server, so my script connects through FTP and then runs a recursive function using ftp_chdir to go through each directory.
If there's another way to do this, I'm open to suggestions.
$flist = array();

function recursive_list_dir($conn_id, $dir, $parent = "false") {
    global $flist;
    ftp_chdir($conn_id, $dir) or die("Fudgeballs: ".$parent."/".$dir);
    $list = ftp_rawlist($conn_id, ".");
    // build the full path of the current directory ("false" marks the top-level call)
    if($parent != "false") { $dir = $parent."/".$dir; }
    for($x = 0; $x < count($list); $x++) {
        $list_details = preg_split("/[\s]+/", $list[$x]);
        $file = $list_details[3];
        $size = $list_details[2];
        if(!strstr($file, ".")) { // if there's no dot (.), we assume it's a directory (is there a command similar to "is_dir" for FTP? that would be more foolproof)
            recursive_list_dir($conn_id, $file, $dir);
        } else {
            $flist[] = $dir."@".$file."@".$size;
        }
    }
    ftp_chdir($conn_id, "..");
}

recursive_list_dir($conn_id, ".");
The script worked fine up to a point, but now it's not working. PHP returns an error on ftp_chdir. The only thing that has changed is that we've added more files to the server. The script works if I run it on a sub-directory, but if I run it on "." it fails. So is it failing because there are too many files and sub-directories?
I haven't tested this out, but here's how I did it a while back:
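A rough sketch of one way to do it (not the original code; it assumes ftp_nlist for the listing and ftp_size to tell files from directories, since ftp_size usually returns -1 for a directory, and the list_files helper name is just illustrative):

function list_files($conn_id, $dir, array &$out) {
    $items = ftp_nlist($conn_id, $dir);
    if ($items === false) {
        return;
    }
    foreach ($items as $item) {
        // many servers return "dir/name"; if yours returns bare names, prepend $dir yourself
        if (basename($item) === "." || basename($item) === "..") {
            continue;   // guard against servers that include . and ..
        }
        $size = ftp_size($conn_id, $item);
        if ($size === -1) {
            list_files($conn_id, $item, $out);   // -1 usually means a directory, so recurse
        } else {
            $out[] = $item . "@" . $size;
        }
    }
}

$flist = array();
list_files($conn_id, ".", $flist);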
Usage of globals in PHP is not good practice. See this:
The fact that it was working before you gave it more input suggests to me that the script may be hitting PHP's execution time limit. Try putting a

set_time_limit(300);

at the top, which will allow it to run for five minutes before timing out, and see if that fixes the problem.

A real recursive solution that does not use global variables:
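Here is a minimal sketch of what that could look like (the ftp_list_recursive function name and the shape of the returned array are illustrative, not from the original answer): it returns an array instead of writing into a global, and it recurses on full paths rather than calling ftp_chdir.

function ftp_list_recursive($conn_id, $path) {
    $result = array();
    $lines = ftp_rawlist($conn_id, $path);
    if ($lines === false) {
        return $result;
    }
    foreach ($lines as $line) {
        // e.g. "drwxr-xr-x   2 owner group  4096 Jan 01 12:00 somedir"
        $parts = preg_split('/\s+/', $line);
        if (count($parts) < 9) {
            continue;   // not a listing line this parser understands
        }
        $name = $parts[8];   // splitting on whitespace is why names with spaces break
        $size = $parts[4];
        $full = ($path === ".") ? $name : $path . "/" . $name;
        if ($parts[0][0] === "d") {
            // a directory: recurse and merge its files into the result
            $result = array_merge($result, ftp_list_recursive($conn_id, $full));
        } else {
            $result[] = array("file" => $full, "size" => $size);
        }
    }
    return $result;
}

$files = ftp_list_recursive($conn_id, ".");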
Works for FTP servers that use a common *nix-style listing like:
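drwxr-xr-x    2 owner    group        4096 Jan 01 12:00 somedir
-rw-r--r--    1 owner    group       10240 Jan 01 12:00 somefile.txt

(illustrative example lines, not output from the poster's server)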
Won't work for files with a space in their names.