Accessing files in HDFS using Java

Posted 2020-02-26 11:18

I am trying to access a file in HDFS using the Java API, but every time I get File Not Found. Here is the code I am using:

Configuration conf = new Configuration();
conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_CORE_SITE);
conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_HDFS_SITE);

try {
    FileSystem fs = FileSystem.get(conf);
    Path hdfsfilePath = new Path(hdfsPath);
    logger.info("Filesystem URI : " + fs.getUri());
    logger.info("Filesystem Home Directory : " + fs.getHomeDirectory());
    logger.info("Filesystem Working Directory : " + fs.getWorkingDirectory());
    logger.info("HDFS File Path : " + hdfsfilePath);
    if (!fs.exists(hdfsfilePath)) {
        logger.error("File does not exists : " + hdfsPath);
    }
} catch (IOException e) {
    logger.error("Error accessing HDFS", e);
}

And here is the command-line output from the code:

[root@koversevms ~]# java -jar /tmp/thetus-incendiary-koverse-extension-fileutils-1.0-SNAPSHOT.jar 
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem URI : file:///
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Home Directory : file:/root
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Working Directory : file:/root
13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: HDFS File Path : /usr/hadoop/sample/sample.txt
13/07/10 02:47:18 ERROR fileutils.HadoopFileChecksumUtils: File does not exists : /usr/hadoop/sample/sample.txt

I am new to Hadoop, so I don't know what is going wrong.

Tags: java hadoop hdfs
1 Answer

够拽才男人 · 2020-02-26 11:43

Here is a code fragment originally posted as an answer to a different question; the intent there was different, but it should fix your problem too. The main point: your FileSystem is resolving to the local filesystem (note the file:/// scheme in your own log output), which means your configuration files were never loaded and fs.defaultFS still has its default value. Configuration.addResource(String) interprets its argument as a classpath resource name rather than a path on disk, so concatenating HADOOP_HOME with a relative path silently loads nothing.
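If you want to keep reading the cluster's own XML files, here is a minimal sketch of that fix (the /etc/hadoop/conf paths are an assumption; point them at wherever your core-site.xml and hdfs-site.xml actually live):

Configuration conf = new Configuration();
// addResource(Path) reads from the local filesystem; the String overload
// searches the classpath instead, which is why your files were never picked up.
conf.addResource(new Path("/etc/hadoop/conf/core-site.xml")); // assumed location
conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml")); // assumed location

FileSystem fs = FileSystem.get(conf);
System.out.println(fs.getUri()); // should now print hdfs://..., not file:///

Alternatively, set fs.defaultFS programmatically and skip the XML files entirely, as the full example below does: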

package org.myorg;

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsTest {

    public static void main(String[] args) {

        try {
            // Run the filesystem calls as the "hbase" user
            // (works on clusters without Kerberos).
            UserGroupInformation ugi =
                UserGroupInformation.createRemoteUser("hbase");

            ugi.doAs(new PrivilegedExceptionAction<Void>() {

                public Void run() throws Exception {

                    Configuration conf = new Configuration();
                    // Point the client at the NameNode explicitly instead of
                    // relying on XML files; replace 1.2.3.4:8020 with your
                    // NameNode's host and RPC port.
                    conf.set("fs.defaultFS", "hdfs://1.2.3.4:8020/user/hbase");
                    // Legacy user setting for older, pre-security Hadoop versions.
                    conf.set("hadoop.job.ugi", "hbase");

                    // With an hdfs:// default URI, this returns a
                    // DistributedFileSystem rather than the local filesystem.
                    FileSystem fs = FileSystem.get(conf);

                    fs.createNewFile(new Path("/user/hbase/test"));

                    // List the directory to confirm the file was created.
                    FileStatus[] status = fs.listStatus(new Path("/user/hbase"));
                    for (FileStatus s : status) {
                        System.out.println(s.getPath());
                    }
                    return null;
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
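Once this runs, you can confirm the result from the shell with hadoop fs -ls /user/hbase. The createRemoteUser/doAs pattern simply executes the filesystem calls as the hbase user; substitute whichever user has write access to your target directory.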