I've installed Hadoop, Spark, R, RStudio Server and SparkR, and I'm now trying to install Hive.
Following tutorials on the internet, here's what I did:
$ cd /home/francois-ubuntu/media/
$ mkdir install-hive
$ cd install-hive
$ wget http://mirrors.ircam.fr/pub/apache/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz
$ tar -xzvf apache-hive-2.1.0-bin.tar.gz
$ mkdir /usr/lib/hive
$ mv apache-hive-2.1.0-bin /usr/lib/hive
$ cd
$ rm -rf /home/francois-ubuntu/media/install-hive
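(One caveat: writing this up, I realize the two /usr/lib steps can't have worked without root, since /usr/lib isn't writable by a normal user, so I'm fairly sure I actually ran them with sudo, i.e.:)
$ sudo mkdir /usr/lib/hive
$ sudo mv apache-hive-2.1.0-bin /usr/lib/hive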
$ sudo vim ~/.bashrc
In .bashrc, I wrote the following (I'm also including the lines for Java, Hadoop and Spark, in case they're helpful):
# Set JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
# Set HADOOP_HOME
alias hadoop=/usr/local/hadoop/bin/hadoop
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
# Set SPARK_HOME
export SPARK_HOME=/usr/local/spark
# Set HIVE_HOME
export HIVE_HOME=/usr/lib/hive/apache-hive-2.1.0-bin
PATH=$PATH:$HIVE_HOME/bin
export PATH
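(As a side note, these variables only take effect in a new shell or after reloading the file, which I believe I did, i.e.:)
$ source ~/.bashrc
$ echo $HIVE_HOME    # should print /usr/lib/hive/apache-hive-2.1.0-bin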
Back to the CLI:
$ cd /usr/lib/hive/apache-hive-2.1.0-bin/bin
$ sudo vim hive-config.sh
In hive-config.sh, I add:
export HADOOP_HOME=/usr/local/hadoop
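(As an aside, some tutorials set this in conf/hive-env.sh instead, copied from the hive-env.sh.template that ships with Hive; I believe bin/hive-config.sh picks that file up if it exists. I don't know whether it makes a difference, but for reference it would look something like:)
$ cd /usr/lib/hive/apache-hive-2.1.0-bin/conf
$ cp hive-env.sh.template hive-env.sh
$ echo 'export HADOOP_HOME=/usr/local/hadoop' >> hive-env.sh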
Then I save hive-config.sh with :wq and go back to the CLI:
$ hadoop fs -mkdir /usr/hive/warehouse
$ hadoop fs -chmod g+w /usr/hive/warehouse
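(One thing I'm not sure about: /usr/hive may not have existed on HDFS beforehand, in which case the mkdir would need -p; from memory it was something like this, and I can post the output of the last check if useful:)
$ hadoop fs -mkdir -p /usr/hive/warehouse
$ hadoop fs -chmod g+w /usr/hive/warehouse
$ hadoop fs -ls -d /usr/hive/warehouse    # check the directory and its permissions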
And then, finally:
$ hive
Here is what I get:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 12:13:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
----------------------------------------------------------------
Mon Jul 18 12:13:45 CEST 2016:
Booting Derby (version The Apache Software Foundation - Apache Derby - 10.10.2.0 - (1582446)) instance a816c00e-0155-fd7f-479a-0000040c9aa0
on database directory /usr/lib/hive/apache-hive-2.1.0-bin/bin/metastore_db in READ ONLY mode with class loader sun.misc.Launcher$AppClassLoader@2e5c649.
Loaded from file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/derby-10.10.2.0.jar.
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_91-8u91-b14-0ubuntu4~16.04.1-b14
user.dir=/usr/lib/hive/apache-hive-2.1.0-bin/bin
os.name=Linux
os.arch=amd64
os.version=4.4.0-28-generic
derby.system.home=null
Database Class Loader started - derby.database.classpath=''
And then... nothing, it stops there. According to the tutorials, I should get the Hive prompt (hive>) at this point, but I don't. I tried a few Hive commands and they don't work. I don't get the regular shell prompt back either: there is no prompt at all, I can type things but nothing executes. The only thing I seem to be able to do is stop it with CTRL+C.
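The only clues I can see in the output are the derby.log (Permission denied) line and the fact that Derby boots metastore_db in READ ONLY mode, which makes me suspect my user can't write in the directory I launch hive from (Derby seems to create both derby.log and metastore_db in the current working directory, which here is Hive's bin directory, as user.dir in the log confirms). For reference, these are the checks I can run and post the output of, if that helps:
$ ls -ld /usr/lib/hive/apache-hive-2.1.0-bin/bin
$ ls -ld /usr/lib/hive/apache-hive-2.1.0-bin/bin/metastore_db
$ whoami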
Any idea what's wrong?
Thanks.
Edit 1:
On @Hawknight's advice, I followed the help given here and did the following:
sudo addgroup hive
sudo useradd -g hive hive
sudo adduser hive sudo
sudo mkdir /home/hive
sudo chown -R hive:hive /home/hive
sudo chown -R hive:hive /usr/lib/hive/
visudo
Added this line to the sudoers file:
hive ALL=(ALL) NOPASSWD:ALL
And then, back to the CLI:
sudo su hive
hive
I still get the same problem, though.
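(In case the ownership changes didn't apply the way I think they did, this is how I'd double-check, and I can add the output:)
ls -ld /usr/lib/hive /usr/lib/hive/apache-hive-2.1.0-bin/bin
id hive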
Edit 2:
Following the advice from here, I now get a different error. The error output is very long, and the later errors probably all stem from the first one, so I don't think copying everything would be useful; here is the beginning:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 18:03:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)
... 9 more
Please tell me if you want the rest of the error log.
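One thing I haven't tried yet, and I mention it only as a guess based on other answers I've seen: apparently Hive 2.x needs the metastore schema to be initialized explicitly with schematool before the first run, something like the following (run from a directory my user can write to, since metastore_db is created in the current directory):
cd ~
$HIVE_HOME/bin/schematool -dbType derby -initSchema
Should I be doing that, or is the problem elsewhere?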