How can I view how many blocks a file has been broken into in a Hadoop file system?
This should work.
We can use the Hadoop file system check (fsck) command to see the blocks for a specific file.

To view the blocks for a specific file:

hadoop fsck filetopath

I used the above command on CDH 5 and got the error below:

hadoop-hdfs/bin/hdfs: line 262: exec: : not found

The command below worked instead:

hdfs fsck filetopath
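
If you want a per-block listing rather than just the summary count, fsck also accepts the -files, -blocks, and -locations options. A minimal sketch, assuming a hypothetical HDFS path /user/hadoop/sample.txt:

# Hypothetical path; substitute your own file.
# -files     prints the file being checked
# -blocks    prints the IDs of the blocks that make up the file
# -locations prints the datanodes that hold each block
hdfs fsck /user/hadoop/sample.txt -files -blocks -locations

The summary at the end of the output includes a "Total blocks (validated)" line, which answers the original question of how many blocks the file was split into.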