How to collect a single-row DataFrame and use its fields as constants

Posted 2019-08-08 23:02

Question:

I'm trying to read a single row from a Hive table and use its fields as constants in the rest of my Spark application.

import org.apache.spark.sql.{DataFrame, Row}
import org.apache.spark.sql.functions.col

object IniConstHive extends Serializable {
  // Most recent row of the configuration table
  val techTbl: DataFrame = spark.table("my_db.my_conf_table").orderBy(col("my_date").desc)
  val firstrow: Row = techTbl.head

  val my_firstfield: Double = firstrow.getAs[java.math.BigDecimal](0).doubleValue
  val my_secondfield: Double = firstrow.getAs[java.math.BigDecimal](1).doubleValue
}

Unfortunately, I get this exception when I call IniConstHive.my_firstfield:

Caused by: java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:347)
        at scala.None$.get(Option.scala:345)
        at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
        at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:676)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:329)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

Is this the right approach? Is there a more efficient way to achieve the result?

Answer 1:

It turned out the problem was not in the IniConstHive object but in the place where I called it. In fact, I was calling IniConstHive.my_firstfield inside a custom UDAF, which runs on the executors. As a result, my application tried to instantiate SparkContexts outside the driver.

I solved it by calling IniConstHive.my_firstfield on the driver and then passing the result as a parameter to the UDAF constructor.
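As a minimal sketch of that pattern, assuming the Spark 2.x UserDefinedAggregateFunction API: the constant is read once on the driver and handed to the UDAF through its constructor, so no executor ever touches IniConstHive. The ScaledSum class and the my_col column are hypothetical names for illustration.

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

// Hypothetical UDAF that sums input values scaled by a constant
// captured in the constructor (serialized with the UDAF instance).
class ScaledSum(factor: Double) extends UserDefinedAggregateFunction {
  override def inputSchema: StructType = StructType(StructField("value", DoubleType) :: Nil)
  override def bufferSchema: StructType = StructType(StructField("sum", DoubleType) :: Nil)
  override def dataType: DataType = DoubleType
  override def deterministic: Boolean = true

  override def initialize(buffer: MutableAggregationBuffer): Unit = buffer(0) = 0.0
  override def update(buffer: MutableAggregationBuffer, input: Row): Unit =
    buffer(0) = buffer.getDouble(0) + input.getDouble(0) * factor
  override def merge(b1: MutableAggregationBuffer, b2: Row): Unit =
    b1(0) = b1.getDouble(0) + b2.getDouble(0)
  override def evaluate(buffer: Row): Double = buffer.getDouble(0)
}

// Driver side: resolve the constant once, then pass it in.
// val factor = IniConstHive.my_firstfield          // runs on the driver only
// val scaledSum = new ScaledSum(factor)
// df.agg(scaledSum(col("my_col"))).show()
```

Because factor is a plain Double field of the UDAF instance, Spark serializes it along with the function when shipping tasks to executors, avoiding any lazy initialization of Spark objects outside the driver.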