Apache Spark: NullPointerException on broadcast variable

Posted 2019-06-24 02:02

I have a simple Spark application in which I am trying to broadcast a String[] variable on a YARN cluster. But every time I try to access the broadcast variable's value inside the task, I get null. It would be really helpful if you could suggest what I am doing wrong here. My code is as follows:

import java.io.Serializable;

import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.broadcast.Broadcast;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;
import org.apache.spark.sql.api.java.Row;

public class TestApp implements Serializable {
  private static final Logger logger = Logger.getLogger(TestApp.class);

  // Class-level broadcast handle -- this is the field that comes back null on the executors
  static Broadcast<String[]> mongoConnectionString;

  public static void main(String[] args) {
    String mongoBaseURL = args[0];
    String hdfsBaseURL = args[1]; // assumed: HDFS base path passed as the second argument
    SparkConf sparkConf = new SparkConf().setAppName(Constants.appName);
    JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);

    // Broadcast the program arguments to the cluster
    mongoConnectionString = javaSparkContext.broadcast(args);

    JavaSQLContext javaSQLContext = new JavaSQLContext(javaSparkContext);

    JavaSchemaRDD javaSchemaRDD = javaSQLContext.jsonFile(hdfsBaseURL + Constants.hdfsInputDirectoryPath);

    if (javaSchemaRDD != null) {
      javaSchemaRDD.registerTempTable("LogAction");
      javaSchemaRDD.cache();
      JavaSchemaRDD pageSchemaRDD = javaSQLContext.sql(SqlConstants.getLogActionPage);
      pageSchemaRDD.foreach(new Test());
    }
  }

  private static class Test implements VoidFunction<Row> {
    private static final long serialVersionUID = 1L;

    public void call(Row t) throws Exception {
      // NullPointerException here: the static field was never assigned on the executor
      logger.info("mongoConnectionString " + mongoConnectionString.value());
    }
  }
}

1 Answer
Answered 2019-06-24 02:56

This happens because your broadcast variable is a static field at class level. When the class is loaded on a worker node, that field has never been assigned, so the task only sees null; the value you assign in the main method exists only on the driver. The solution I found was to pass the broadcast variable into the function that needs it, for example through its constructor, as sketched below. The same applies to accumulators.
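A minimal sketch of that change, reusing the Broadcast<String[]>, VoidFunction<Row>, and logger from the question's TestApp class (an illustration against that code, not a standalone program): the broadcast handle is handed to the function through its constructor, stored in an instance field, serialized with the closure, and resolved with value() on the executor.

private static class Test implements VoidFunction<Row> {
  private static final long serialVersionUID = 1L;

  // Instance field: the Broadcast handle is small and Serializable, so it is
  // shipped to the executors together with this function object.
  private final Broadcast<String[]> mongoConnectionString;

  Test(Broadcast<String[]> mongoConnectionString) {
    this.mongoConnectionString = mongoConnectionString;
  }

  public void call(Row t) throws Exception {
    // value() now resolves the broadcast data locally on each executor.
    logger.info("mongoConnectionString " + mongoConnectionString.value());
  }
}

On the driver the call site becomes pageSchemaRDD.foreach(new Test(mongoConnectionString));, and the static field can then be removed entirely.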
