Read HBase table with where clause using Spark

Posted 2019-07-31 01:13

Question:

I am trying to read an HBase table using the Spark Scala API.

Sample Code:

conf.set("hbase.master", "localhost:60000")
conf.set("hbase.zookeeper.quorum", "localhost")
conf.set(TableInputFormat.INPUT_TABLE, tableName)
val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])
println("Number of Records found : " + hBaseRDD.count())

How can I add a where clause if I use newAPIHadoopRDD?

Or do we need to use a Spark HBase connector to achieve this?

I looked at the Spark HBase connector below, but I don't see any example code with a where clause:

https://github.com/nerdammer/spark-hbase-connector

Answer 1:

You can use the SHC connector from Hortonworks to achieve this.

https://github.com/hortonworks-spark/shc

Here is a code example using Spark 2:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

// Catalog mapping the HBase table layout to DataFrame columns.
val catalog =
  s"""{
     |"table":{"namespace":"default", "name":"my_table"},
     |"rowkey":"id",
     |"columns":{
     |"id":{"cf":"rowkey", "col":"id", "type":"string"},
     |"name":{"cf":"info", "col":"name", "type":"string"},
     |"age":{"cf":"info", "col":"age", "type":"string"}
     |}
     |}""".stripMargin

val spark = SparkSession
  .builder()
  .appName("hbase spark")
  .getOrCreate()

val df = spark
  .read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

df.show()

You can then use whatever method you like on your DataFrame. For example:

df.where(df("age") === 20)