Can I run Spark unit tests within Eclipse?

Question:

Recently we moved from Scalding to Spark. I used Eclipse with the Scala IDE for Eclipse to write code and tests. The tests ran fine with Twitter's JobTest class: any class using JobTest was automatically available to run as a Scala unit test within Eclipse. I've lost that ability now. The Spark test cases run perfectly well under sbt, but the run configuration in Eclipse for these tests lists 'none applicable'.

Is there a way to run Spark unit tests within Eclipse?

Answer 1:

I think this same approach, shown here in Java, would work in Scala as well. Basically, just create a SparkContext with the master set to "local", then build and run the unit tests as normal. Be sure to stop the SparkContext when the tests are finished.

I have this working with Spark 1.0.0 but not with a newer version.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class Test123 {
  static JavaSparkContext sparkCtx;

  @BeforeClass
  public static void sparkSetup() {
    // Create a SparkContext with the "local" master so Spark runs
    // in-process; no cluster is needed to execute the tests.
    SparkConf conf = new SparkConf();
    sparkCtx = new JavaSparkContext("local", "test", conf);
  }

  @AfterClass
  public static void sparkTeardown() {
    // Stop the context after the last test so its resources are released.
    sparkCtx.stop();
  }

  @Test
  public void integrationTest() {
    // Build an RDD from in-memory test data.
    JavaRDD<String> logRawInput = sparkCtx.parallelize(Arrays.asList(
        "data1", "data2", "garbage", "data3"));
    assertEquals(4, logRawInput.count());
  }
}
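
Since the original question is about Scala, here is a minimal ScalaTest sketch of the same idea (assuming ScalaTest and its JUnit integration are on the test classpath; the suite name and test data are illustrative). Annotating the suite with ScalaTest's JUnitRunner makes it launchable through Eclipse's ordinary JUnit run configuration, which can work around the 'none applicable' problem:

import org.apache.spark.{SparkConf, SparkContext}
import org.junit.runner.RunWith
import org.scalatest.junit.JUnitRunner
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// JUnitRunner lets Eclipse's standard JUnit launcher pick up the suite,
// even when no ScalaTest run configuration is offered.
@RunWith(classOf[JUnitRunner])
class SparkLocalSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // "local" master runs Spark in-process; no cluster required.
    sc = new SparkContext(new SparkConf().setMaster("local").setAppName("test"))
  }

  override def afterAll(): Unit = {
    // Always stop the context so later suites can create their own.
    sc.stop()
  }

  test("parallelize builds an RDD from in-memory data") {
    val input = sc.parallelize(Seq("data1", "data2", "garbage", "data3"))
    assert(input.count() == 4)
  }
}

The beforeAll/afterAll hooks mirror the @BeforeClass/@AfterClass pattern in the Java version above.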