I have the following problem. My main method is:
public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("TestHive");
    SparkContext sc = new org.apache.spark.SparkContext(conf);
    HiveContext hiveContext = new org.apache.spark.sql.hive.HiveContext(sc);
}
I build it with mvn package.
Then I submit my code, but I get the following exception. I have no idea what's wrong:
sh spark-submit --class "TestHive" --master local[4] ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar
Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)
Please tell me where I am going wrong.
PS I built my spark with hive and thriftServer.
Spark 1.5.2 built for Hadoop 2.4.0
Build flags: -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
It seems to be a version conflict between the Spark components (spark-core, spark-sql and spark-hive).
To avoid this conflict, all of those components should have the same version. You can do that in your
pom.xml
by setting a property called spark.version, for example:
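A minimal sketch of the relevant pom.xml fragment; the version 1.5.2 matches the Spark build you mentioned, and the `_2.10` artifact suffixes assume the default Scala 2.10 artifacts of that era (adjust the suffix if your build uses a different Scala version):

```xml
<properties>
    <!-- single source of truth for the Spark version -->
    <spark.version>1.5.2</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

With every Spark artifact pinned to the same `${spark.version}`, Maven can no longer pull in a spark-hive built against a different HiveConf than the one on the runtime classpath, which is what typically produces a NoSuchMethodException like the one above.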