Viewing the number of blocks for a file in Hadoop

Posted 2019-01-23 04:35

How can I view how many blocks a file has been broken into in a Hadoop file system?

Tags: hadoop hdfs
3 Answers
Fickle 薄情 · 2019-01-23 05:02

The stat command can help here; "%o" prints the file's HDFS block size in bytes:

hadoop fs -stat "%o" /path/to/file
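Since "%o" gives the block size rather than the block count, here is a minimal shell sketch that derives the count from it and "%b" (the file length in bytes), assuming /path/to/file is a placeholder and a shell with arithmetic expansion:

# Placeholder path; %b = file length in bytes, %o = block size in bytes
SIZE=$(hadoop fs -stat "%b" /path/to/file)
BLOCKSIZE=$(hadoop fs -stat "%o" /path/to/file)
# Ceiling division: the number of blocks the file occupies
echo $(( (SIZE + BLOCKSIZE - 1) / BLOCKSIZE ))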
等我变得足够好 · 2019-01-23 05:10

We can use the Hadoop file system check command (fsck) to list the blocks for a specific file.

The general syntax is:

hadoop fsck [path] [options]

To view the blocks for a specific file:

hadoop fsck /path/to/file -files -blocks
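If you also want to see which datanodes hold each block replica, fsck accepts a -locations flag (a documented option in stock Hadoop; the exact output format varies by version):

hadoop fsck /path/to/file -files -blocks -locations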
爱情/是我丢掉的垃圾 · 2019-01-23 05:20

hadoop fsck filetopath

I used the above command in CDH 5 and got the error below:

hadoop-hdfs/bin/hdfs: line 262: exec: : not found

Using the command below instead worked fine:

hdfs fsck filetopath
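For reference, the block-listing flags from the previous answer work the same way through the hdfs entry point; a sketch with a placeholder path:

hdfs fsck /path/to/file -files -blocks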
