Is there an equivalent of the `find` command in `hadoop fs`?

Posted 2019-05-13 18:14

I know that from the terminal one can use the find command to find files, such as:

find . -type d -name "*something*" -maxdepth 4 

But when I am in the Hadoop file system, I have not found a way to do this.

hadoop fs -find ....

throws an error.

How do people traverse files in Hadoop? I'm using Hadoop 2.6.0-cdh5.4.1.

4 Answers
在下西门庆 · 2019-05-13 18:45

If you are using the Cloudera stack, try the find tool:

org.apache.solr.hadoop.HdfsFindTool

Set the command to a bash variable:

COMMAND='hadoop jar /opt/cloudera/parcels/CDH/lib/solr/contrib/mr/search-mr-job.jar org.apache.solr.hadoop.HdfsFindTool'

Usage as follows:

${COMMAND} -find . -name "something" -type d ...
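For example, a rough equivalent of the question's local command would be the following (the HDFS path is just a placeholder, and whether -maxdepth is accepted depends on your CDH version, so check ${COMMAND} -help):

${COMMAND} -find /user/yourname -type d -name "*something*"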
再贱就再见 · 2019-05-13 18:50

If you don't have the Cloudera parcels available, you can use awk.

hdfs dfs -ls -R /some_path | awk -F / '/^d/ && (NF <= 5) && /something/' 

That's almost equivalent to the find . -type d -name "*something*" -maxdepth 4 command from the question.
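If you need this often, you could wrap the pipeline in a small shell function. This is just a sketch (the function name and arguments are my own; the field count assumes the standard -ls -R output where the path is the last '/'-separated part of each line):

hdfs_find_dirs() {
  # Hypothetical helper: list directories under ROOT whose listing line matches PATTERN,
  # limited in depth by MAXFIELDS ('/'-separated field count, as in the awk one-liner above).
  local pattern="$1" maxfields="$2" root="$3"
  hdfs dfs -ls -R "$root" |
    awk -F/ -v max="$maxfields" -v pat="$pattern" '/^d/ && NF <= max && $0 ~ pat'
}

Called as hdfs_find_dirs something 5 /some_path it prints the same lines as the pipeline above.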

我想做一个坏孩纸 · 2019-05-13 18:55

hadoop fs -find was introduced in Apache Hadoop 2.7.0. Most likely you're using an older version, hence you don't have it yet. See HADOOP-8989 for more information.
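For reference, on 2.7.0 or later the built-in command looks roughly like this (the path is just an example; as far as I know the built-in find only supports a small expression set such as -name, -iname and -print, so there is no -type or -maxdepth):

hadoop fs -find /some_path -name "*something*" -print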

In the meantime you can use

hdfs dfs -ls -R <pattern>

e.g.: hdfs dfs -ls -R /demo/order*.*

But that's not as powerful as 'find', of course, and lacks some basics. From what I understand, people have been writing scripts around it to get around this problem.
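As a rough sketch of such a script (my own example, not a standard tool), you can pull just the path column out of the recursive listing and grep it, e.g. to list files matching order*.csv:

hdfs dfs -ls -R /demo | grep -v '^d' | awk '{print $8}' | grep 'order.*\.csv$'

The column number 8 assumes the usual -ls output layout (permissions, replication, owner, group, size, date, time, path).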

贼婆χ · 2019-05-13 18:55

Adding HdfsFindTool as an alias in .bash_profile makes it easy to use every time.

Add the following to your profile:

alias hdfsfind='hadoop jar /opt/cloudera/parcels/CDH/lib/solr/contrib/mr/search-mr-job.jar org.apache.solr.hadoop.HdfsFindTool'
alias hdfs='hadoop fs'

You can now use it as follows (here I'm using the find tool to get the file name and record count for each HDFS source folder):

$> cnt=1; for ff in `hdfsfind -find /dev/abc/*/2018/02/16/*.csv -type f`; do pp=`echo ${ff} | awk -F"/" '{print $7}'`; fn=`basename ${ff}`; fcnt=`hdfs -cat ${ff} | wc -l`; echo "${cnt}=${pp}=${fn}=${fcnt}"; cnt=`expr ${cnt} + 1`; done
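The same loop is easier to read spread over several lines with $( ) command substitution (same paths and the same hard-coded field 7 as in the one-liner above):

cnt=1
for ff in $(hdfsfind -find /dev/abc/*/2018/02/16/*.csv -type f); do
  pp=$(echo "${ff}" | awk -F"/" '{print $7}')   # source folder name
  fn=$(basename "${ff}")                        # file name
  fcnt=$(hdfs -cat "${ff}" | wc -l)             # record count ('hdfs' is the 'hadoop fs' alias above)
  echo "${cnt}=${pp}=${fn}=${fcnt}"
  cnt=$((cnt + 1))
done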

Simple examples to get folder/file details:

$> hdfsfind -find /dev/abc/ -type f -name "*.csv"
$> hdfsfind -find /dev/abc/ -type d -name "toys"
