Getting "Wrong FS: file" error while running a Hive query

Posted 2019-08-12 12:17

Question:

While running a simple SELECT query on Hive, I'm getting this strange error:

java.lang.IllegalArgumentException: Wrong FS: file://usr/lib/hive/lib/CustomUDFint.jar, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:410)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:56)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:379)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:207)

On googling, all the links I found mention that the Hive metadata for the table location is pointing to the wrong place.
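
A note on the path in the error itself: in a URI, whatever follows the first two slashes is parsed as the authority (host), so file://usr/... makes "usr" the host and drops it from the path, whereas file:/// has an empty authority, which is what Hadoop's local filesystem expects. A minimal, self-contained check with java.net.URI shows the difference:

    import java.net.URI;

    public class WrongFsDemo {
        public static void main(String[] args) {
            // Malformed: "usr" is parsed as the URI authority (host), not as part of the path
            URI bad = URI.create("file://usr/lib/hive/lib/CustomUDFint.jar");
            System.out.println(bad.getAuthority()); // prints: usr
            System.out.println(bad.getPath());      // prints: /lib/hive/lib/CustomUDFint.jar

            // Correct: three slashes give an empty authority and the full absolute path
            URI good = URI.create("file:///usr/lib/hive/lib/CustomUDFint.jar");
            System.out.println(good.getAuthority()); // prints: null
            System.out.println(good.getPath());      // prints: /usr/lib/hive/lib/CustomUDFint.jar
        }
    }

Hadoop's FileSystem.checkPath (the first frame in the stack trace) compares a path's scheme and authority against the filesystem's own URI, file:///, so the extra authority is rejected as "Wrong FS".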

Any reason why this happens? And how can I fix it?

Thanks.

Answer 1:

Please make sure HADOOP_HOME is set to a proper value. Which Hadoop release are you using? Try setting the NameNode's location through the Hive shell and see if that helps:

hive -hiveconf fs.default.name=hdfs://localhost:8020

Change the host and port to match your configuration.
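
To make this permanent rather than passing it on every invocation, the same property can go in core-site.xml. A minimal sketch, assuming a NameNode at localhost:8020 (on Hadoop 2.x and later the preferred property name is fs.defaultFS; fs.default.name is kept as a deprecated alias):

    <!-- core-site.xml: host and port are assumptions; substitute your NameNode's address -->
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:8020</value>
    </property>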



Answer 2:

I just hit the same problem. In my pom.xml I needed to add this dependency:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <!-- add a <version> matching your Hadoop release unless a parent POM manages it -->
</dependency>
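
With that dependency on the classpath, you can sanity-check which FileSystem implementation a given path resolves to. A quick sketch, assuming the Hadoop client jars are available; the jar path is just the one from the error above:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FsCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Resolves the path's scheme against the FileSystem
            // implementations available on the classpath
            FileSystem fs = new Path("file:///usr/lib/hive/lib/CustomUDFint.jar")
                    .getFileSystem(conf);
            System.out.println(fs.getUri()); // expected: file:///
        }
    }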


Tags: hadoop hive