I have a directory with files, directories, subdirectories, etc. How can I get the list of absolute paths to all files and directories using the Apache Hadoop API?
Answer 1:
Using the HDFS API:
package org.myorg.hdfsdemo;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDemo {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/core-site.xml"));
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Enter the directory name :");
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
        Path path = new Path(br.readLine());
        displayDirectoryContents(fs, path);
    }

    private static void displayDirectoryContents(FileSystem fs, Path rootDir) {
        // Recursively print the absolute path of every file and directory under rootDir
        try {
            FileStatus[] status = fs.listStatus(rootDir);
            for (FileStatus file : status) {
                if (file.isDir()) {
                    System.out.println("This is a directory:" + file.getPath());
                    displayDirectoryContents(fs, file.getPath());
                } else {
                    System.out.println("This is a file:" + file.getPath());
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
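On Hadoop 2.x and later, FileSystem also offers listFiles(path, true), which walks the tree for you through a RemoteIterator; note that it yields only files, so you still need listStatus (as above) if you also want the directories themselves. A minimal sketch, assuming a reachable cluster configuration and the example path /user/demo:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

public class ListFilesRecursively {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // second argument true = recurse into subdirectories; only files are returned
        RemoteIterator<LocatedFileStatus> it = fs.listFiles(new Path("/user/demo"), true);
        while (it.hasNext()) {
            System.out.println(it.next().getPath());
        }
        fs.close();
    }
}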
Answer 2:
Write a recursive function that takes a file and checks whether it is a directory. If it is a directory, list its entries and, in a loop, recurse into each subdirectory; otherwise collect the file and return the list.
Something like the code below, though not exactly the same (here I am returning only .java files, and it uses java.io.File, i.e. the local filesystem):
import java.io.File;
import java.util.ArrayList;
import java.util.List;

private static List<File> recursiveDir(File file) {
    if (!file.isDirectory()) {
        // System.out.println("[" + file.getName() + "] is not a valid directory");
        return null;
    }
    List<File> returnList = new ArrayList<File>();
    File[] files = file.listFiles();
    if (files == null) {
        // listFiles() returns null on an I/O error or missing permissions
        return returnList;
    }
    for (File f : files) {
        if (!f.isDirectory()) {
            if (f.getName().endsWith("java")) {
                returnList.add(f);
            }
        } else {
            returnList.addAll(recursiveDir(f));
        }
    }
    return returnList;
}
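A minimal usage sketch, assuming recursiveDir lives in the same class; the starting directory /tmp/src is just an example:

public static void main(String[] args) {
    // Walk the local directory tree and print the absolute path of every .java file found
    List<File> javaFiles = recursiveDir(new File("/tmp/src"));
    if (javaFiles != null) {
        for (File f : javaFiles) {
            System.out.println(f.getAbsolutePath());
        }
    }
}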
Answer 3:
With HDFS you can use hadoop fs -lsr to list a directory's contents recursively (on newer releases the equivalent command is hadoop fs -ls -R).