How to configure HBase in Spark?

Posted 2020-07-13 10:41

What are the steps to connect Spark to HBase?

I have the master addresses for both. Do I just add the HBase address to the Spark classpath?

1 Answer

爱情/是我丢掉的垃圾 · Answered 2020-07-13 11:20

This post about connecting Spark with HBase should be helpful: http://www.vidyasource.com/blog/Programming/Scala/Java/Data/Hadoop/Analytics/2014/01/25/lighting-a-spark-with-hbase

Do I just add the HBase address to the Spark classpath?

No. You should instead put the HBase configuration files (e.g. hbase-site.xml) on the Spark classpath. Otherwise, you have to set the relevant properties in your code, for example:

    // Build an HBase configuration on top of the existing Hadoop configuration
    Configuration hConf = HBaseConfiguration.create(conf);
    // ZooKeeper quorum and client port used by the HBase cluster
    hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
    hConf.setInt("hbase.zookeeper.property.clientPort", 10000);
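To show where that configuration ends up being used, here is a minimal sketch of reading an HBase table into a Spark RDD via `newAPIHadoopRDD` and `TableInputFormat`, the approach the linked post covers. The hostnames and port are reused from the snippet above, and the table name `my_table` is a placeholder; adjust all of them for your cluster.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class HBaseReadSketch {
    public static void main(String[] args) {
        JavaSparkContext jsc = new JavaSparkContext(
                new SparkConf().setAppName("hbase-read"));

        // Programmatic HBase configuration, as in the snippet above
        Configuration hConf = HBaseConfiguration.create();
        hConf.set("hbase.zookeeper.quorum",
                "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
        hConf.setInt("hbase.zookeeper.property.clientPort", 10000);
        // "my_table" is a placeholder table name
        hConf.set(TableInputFormat.INPUT_TABLE, "my_table");

        // Each record is a (row key, Result) pair scanned from the table
        JavaPairRDD<ImmutableBytesWritable, Result> rows = jsc.newAPIHadoopRDD(
                hConf, TableInputFormat.class,
                ImmutableBytesWritable.class, Result.class);

        System.out.println("row count: " + rows.count());
        jsc.stop();
    }
}
```

If the configuration files are on the classpath instead, `HBaseConfiguration.create()` picks up the quorum and port automatically and the two `set` calls become unnecessary.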