I have a problem with configuring and installing HBase/Hadoop/Hive. What I did so far on a VM with Ubuntu 14.04.3 LTS:
- installed the JDK (version jdk1.8.0_60) as described here:
https://askubuntu.com/questions/56104/how-can-i-install-sun-oracles-proprietary-java-jdk-6-7-8-or-jre
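Just to confirm the JDK is picked up, I check the version like this (the path is where I unpacked the JDK):
/usr/lib/jvm/jdk1.8.0_60/bin/java -version
This should report something like: java version "1.8.0_60"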
- Got hadoop-2.6.1 and unpacked the .tar file. After that I did some configuration:
core-site.xml:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_60
hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:///home/hfu/hadoop/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:///home/hfu/hadoop/hdfs/datanode</value>
</property>
</configuration>
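For completeness: before the very first start the namenode has to be formatted once (run from the hadoop-2.6.1 directory):
bin/hdfs namenode -format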
- Got hbase-0.98.0-hadoop2, unpacked it and configured it the following way:
hbase-env.sh
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_60/
hbase-site.xml
<configuration>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.rootdir</name>
<value>file:///home/hfu/hbase-0.98.0-hadoop2/data</value>
</property>
<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>/home/hfu/hbase-0.98.0-hadoop2/zookeeper</value>
</property>
<property>
<name>zookeeper.znode.parent</name>
<value>/hbase-unsecure</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>ubuntu</value>
</property>
<property>
<name>hbase.master</name>
<value>ubuntu:16000</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
</configuration>
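For comparison: the pseudo-distributed setups I have seen usually point hbase.rootdir at HDFS rather than the local filesystem, roughly like this (hdfs://localhost:9000 is the address from core-site.xml above; the /hbase path is just an example):
<property>
<name>hbase.rootdir</name>
<value>hdfs://localhost:9000/hbase</value>
</property>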
- Got apache-hive-1.2.1-bin and unpacked it:
hive-env.sh
export HADOOP_HOME=/home/hfu/hadoop-2.6.1
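For reference, Hive's HBase integration uses the hive-hbase-handler jar from Hive's lib directory; if it needs to be added to the classpath explicitly, hive-env.sh can do that roughly like this (the exact path is just an example from my layout):
export HIVE_AUX_JARS_PATH=/home/hfu/apache-hive-1.2.1-bin/lib/hive-hbase-handler-1.2.1.jar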
start hadoop:
sbin/start-all.sh
start hbase:
bin/start-hbase.sh
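After both start scripts I check with jps that the daemons are up (process ids differ; the list is roughly what I expect on my VM):
jps
# roughly: NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, HMaster, HRegionServer, HQuorumPeer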
In the HBase shell, creating a table is possible, and I can also put some entries into it (see the example below). Before I start Hive I also run this in the console:
export HADOOP_USER_CLASSPATH_FIRST=true
to prevent an exception.
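The HBase shell commands that work, for example (table and column family names are just what I tested with):
create 'testtable', 'cf'
put 'testtable', 'row1', 'cf:a', 'value1'
scan 'testtable'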
In Hive it is possible to create a table and read content from it. But as soon as I want to connect the two together as described in some tutorials, e.g.
http://chase-seibert.github.io/blog/2013/05/10/hive-hbase-quickstart.html
or http://www.n10k.com/blog/hbase-via-hive-pt1/
I get an exception. I also described my problem earlier in another post, but this one is much more detailed: How transfer a Table from HBase to Hive?
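Concretely, the step that fails is creating a Hive table mapped onto the HBase table, along the lines of what both tutorials show (the names here are placeholders, not my exact statement):
CREATE EXTERNAL TABLE hbase_hive_table(key string, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:val")
TBLPROPERTIES ("hbase.table.name" = "testtable");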