Exception while accessing KafkaOffset from RDD

Posted 2019-02-25 12:36

I have a Spark consumer which streams from Kafka. I am trying to manage offsets for exactly-once semantics.

However, while accessing the offset it throws the following exception:

"java.lang.ClassCastException: org.apache.spark.rdd.MapPartitionsRDD cannot be cast to org.apache.spark.streaming.kafka.HasOffsetRanges"

The relevant part of the code is below:

var offsetRanges = Array[OffsetRange]()
dataStream
  .transform { rdd =>
    offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
    rdd
  }
  .foreachRDD(rdd => { })

Here dataStream is a direct stream (DStream[String]) created with the KafkaUtils API, something like:

KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, Set(source_schema+"_"+t)).map(_._2)

Can somebody help me understand what I am doing wrong here? transform is the first method in the chain of operations on the stream, as the official documentation recommends.

Thanks.

1 Answer
一纸荒年 Trace。
Answered 2019-02-25 12:55

Your problem is:

.map(_._2)

This creates a MappedDStream, whose RDDs are MapPartitionsRDDs that do not implement HasOffsetRanges, instead of the DirectKafkaInputDStream returned by KafkaUtils.createDirectStream, whose KafkaRDDs do.
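
As a defensive alternative to the blind asInstanceOf cast, here is a minimal sketch that makes the failure explicit (extractOffsetRanges is a hypothetical helper, not part of the Kafka API):

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.kafka.{HasOffsetRanges, OffsetRange}

// Hypothetical helper: pattern-match instead of a blind asInstanceOf.
// Only an RDD coming straight out of the direct stream is a KafkaRDD
// and mixes in HasOffsetRanges; a mapped RDD is a MapPartitionsRDD and
// fails the cast with the ClassCastException quoted in the question.
def extractOffsetRanges(rdd: RDD[_]): Array[OffsetRange] = rdd match {
  case withRanges: HasOffsetRanges => withRanges.offsetRanges
  case other =>
    sys.error(s"${other.getClass.getName} does not carry Kafka offset ranges")
}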

You need to map after transform:

val kafkaStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, Set(source_schema+"_"+t))

kafkaStream
  .transform { rdd =>
    offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
    rdd
  }
  .map(_._2)
  .foreachRDD { rdd =>
    // process the batch here
  }
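
For the exactly-once goal mentioned in the question, you still have to persist the captured ranges yourself after the batch's output has been written. A minimal sketch of what the foreachRDD step above might look like, assuming saveOffsets is a hypothetical function you would implement against your own offset store (ZooKeeper, a database, etc.):

.foreachRDD { rdd =>
  // Write the batch's output first (idempotently or in a transaction),
  // then record how far this batch read, per topic-partition.
  offsetRanges.foreach { range =>
    println(s"${range.topic} [${range.partition}] offsets ${range.fromOffset} -> ${range.untilOffset}")
    // saveOffsets(range)  // hypothetical persistence call
  }
}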