Find port number where HDFS is listening

Posted 2019-03-09 09:39

I want to access HDFS with fully qualified names such as:

hadoop fs -ls hdfs://machine-name:8020/user

I could also simply access HDFS with

hadoop fs -ls /user

However, I am writing test cases that should work on different distributions (HDP, Cloudera, MapR, etc.), which involves accessing HDFS files with fully qualified names.

I understand that hdfs://machine-name:8020 is defined in core-site.xml as fs.default.name. But this seems to differ across distributions; for example, the scheme is maprfs on MapR, and IBM BigInsights doesn't even have core-site.xml in $HADOOP_HOME/conf.

There doesn't seem to be a way for hadoop to tell me what's defined in fs.default.name through its command line options.

How can I get the value defined in fs.default.name reliably from command line?

The test will always run on the namenode, so the machine name is easy. But getting the port number (8020) is a bit difficult. I tried lsof, netstat, etc., but still couldn't find a reliable way.

5 Answers
爱情/是我丢掉的垃圾
#2 · 2019-03-09 09:59

fs.default.name is deprecated.

Use: hdfs getconf -confKey fs.defaultFS
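Since the original question is about extracting the port, here is a minimal sketch of parsing it out of the returned value with plain shell parameter expansion. The sample URI is hard-coded for illustration; on a live cluster you would populate the variable from the command above. Note that if fs.defaultFS names an HA nameservice rather than host:port, there may be no port to extract.

```shell
# On a live cluster: fs_uri=$(hdfs getconf -confKey fs.defaultFS)
fs_uri="hdfs://machine-name:8020"   # sample value for illustration
hostport="${fs_uri#*://}"           # strip the scheme    -> machine-name:8020
port="${hostport##*:}"              # after the last colon -> 8020
echo "$port"
```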

霸刀☆藐视天下
#3 · 2019-03-09 10:03

You can use

hdfs getconf -confKey fs.default.name
【Aperson】
#4 · 2019-03-09 10:04

I came across this answer while looking for the HDFS URI, which is generally a URL pointing to the namenode. hdfs getconf -confKey fs.defaultFS gives me the name of the nameservice, but that doesn't help me build the HDFS URI.

I tried the command below to get a list of the namenodes instead:

 hdfs getconf -namenodes

This gave me a list of all the namenodes, primary first followed by secondary. After that, constructing the HDFS URI was simple:

hdfs://<primarynamenode>/
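The step above can be sketched in shell as follows. The hostnames are hypothetical sample output; on a live cluster you would capture the real list from hdfs getconf -namenodes.

```shell
# On a live cluster: namenodes=$(hdfs getconf -namenodes)
namenodes="nn1.example.com nn2.example.com"  # hypothetical sample output
primary="${namenodes%% *}"                   # first whitespace-separated entry
hdfs_uri="hdfs://${primary}/"
echo "$hdfs_uri"
```

If there is only one namenode, the expansion simply returns the whole string, so the sketch still works.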
一纸荒年 Trace。
#5 · 2019-03-09 10:12

The command below is available in Apache Hadoop 2.7.0 onwards and can be used to get the values of Hadoop configuration properties. fs.default.name is deprecated in Hadoop 2.0; fs.defaultFS is the updated property. I'm not sure whether this will work in the case of maprfs.

hdfs getconf -confKey fs.defaultFS       # (new property)

or

hdfs getconf -confKey fs.default.name    # (old property)

I'm not sure whether any command line utility is available for retrieving configuration property values on MapR or on Hadoop 0.20. In that case, you could try the same thing in Java to retrieve the value of a configuration property:

Configuration conf = new Configuration();
System.out.println(conf.get("fs.default.name"));
爷的心禁止访问
#6 · 2019-03-09 10:14

Yes, hdfs getconf -namenodes will show the list of namenodes.
