What are the steps to connect Spark to HBase?
I have the master addresses for both. Do I just add the HBase address to the Spark classpath?
This post about connecting Spark with HBase should be helpful: http://www.vidyasource.com/blog/Programming/Scala/Java/Data/Hadoop/Analytics/2014/01/25/lighting-a-spark-with-hbase
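For reference, the usual pattern described in posts like that one is to hand Spark an HBase configuration and read the table through the Hadoop InputFormat API. Below is a minimal sketch of that approach in Java; the ZooKeeper quorum and the table name "my_table" are placeholders you would replace with your own values:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkHBaseRead {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("SparkHBaseRead"));

        // HBase settings: either picked up from hbase-site.xml on the classpath,
        // or set explicitly here (quorum and table name are placeholders)
        Configuration hConf = HBaseConfiguration.create();
        hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
        hConf.set(TableInputFormat.INPUT_TABLE, "my_table");

        // Each record comes back as a (row key, Result) pair
        JavaPairRDD<ImmutableBytesWritable, Result> rows = sc.newAPIHadoopRDD(
                hConf, TableInputFormat.class, ImmutableBytesWritable.class, Result.class);

        System.out.println("Rows in my_table: " + rows.count());
        sc.stop();
    }
}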
Do I just add the HBase address to the Spark classpath?
No. You should put the HBase configuration files (such as hbase-site.xml) on the Spark classpath. Otherwise, set the required properties in your code, for example:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
// Layer the HBase settings on top of the existing Hadoop configuration (conf)
Configuration hConf = HBaseConfiguration.create(conf);
hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
hConf.setInt("hbase.zookeeper.property.clientPort", 10000);
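Once hConf is built like this, one way to confirm the settings actually reach HBase before wiring them into a Spark job is to open a connection with the standard HBase client API (HBase 1.0 or later). A minimal sketch, where the class name is only illustrative and the quorum and port are your site-specific values:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class HBaseConnectionCheck {
    public static void main(String[] args) throws IOException {
        // Same settings as in the snippet above; quorum and port are site-specific
        Configuration hConf = HBaseConfiguration.create();
        hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
        hConf.setInt("hbase.zookeeper.property.clientPort", 10000);

        // Connect through ZooKeeper and list tables to verify the configuration works
        try (Connection connection = ConnectionFactory.createConnection(hConf);
             Admin admin = connection.getAdmin()) {
            for (TableName table : admin.listTableNames()) {
                System.out.println("Found table: " + table.getNameAsString());
            }
        }
    }
}
If this lists your tables, the same hConf can be passed to Spark (for example via newAPIHadoopRDD as in the sketch under the first answer).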