How do I check an HDFS directory's size?

Posted 2019-03-08 17:55

I know du -sh in common Linux filesystems. But how to do that with HDFS?

9 answers
你好瞎i
Answered 2019-03-08 18:22

Hadoop version 2.3.33 (note that -dus is deprecated in newer releases in favor of -du -s):

hadoop fs -dus /path/to/dir | awk '{print $2/1024**3 " G"}'
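The awk conversion can be sanity-checked locally without a cluster by feeding it sample output in the same shape (the path and byte count below are hypothetical; -dus prints the path first, then the size in bytes):

```shell
# Simulate "hadoop fs -dus <dir>" output ("<path>  <bytes>") and
# convert the byte count to gigabytes, as in the one-liner above.
# ^ is used instead of ** for portability across awk implementations.
printf '/path/to/dir  3221225472\n' |
  awk '{printf "%.1f G\n", $2/1024^3}'
# prints: 3.0 G
```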


祖国的老花朵
Answered 2019-03-08 18:26

To get the size of a directory, use hdfs dfs -du -s -h /$yourDirectoryName. For a quick cluster-level storage report, use hdfs dfsadmin -report.

Bombasti
Answered 2019-03-08 18:28

The command should be hadoop fs -du -s -h /dirPath

  • -du [-s] [-h] ... : Show the amount of space, in bytes, used by the files that match the specified file pattern.

  • -s : Rather than showing the size of each individual file that matches the
    pattern, shows the total (summary) size.

  • -h : Formats the sizes of files in a human-readable fashion rather than as a number of bytes (e.g. MB/GB/TB).

    Note that, even without the -s option, this only shows size summaries one level deep into a directory.

    The output is in the form: size name (full path)
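Since, without -s, du prints one line per immediate child of the directory, the directory total can also be recovered by summing the first column. A minimal local sketch, using hypothetical sizes and paths in place of real du output:

```shell
# Simulate per-child "hadoop fs -du <dir>" output ("<bytes>  <path>")
# and sum the size column to get the directory total in bytes.
printf '512  /dir/a\n1536  /dir/b\n' |
  awk '{sum += $1} END {print sum}'
# prints: 2048
```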
