What do I need to import to make `SparkConf` resolve?

Asked 2019-09-10 05:25

I am setting up a Java Spark application, following the DataStax documentation on getting started with the Java API. I've added

<dependencies>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.1</version>
    </dependency>
    ...
</dependencies>

and (having previously installed dse.jar into my local Maven repository)

<dependency>
    <groupId>com.datastax</groupId>
    <artifactId>dse</artifactId>
    <version>version number</version>
</dependency>
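
For reference, installing such a jar into the local repository is typically done with the Maven install plugin; the invocation below is a sketch in which the jar path and the version are placeholders:

mvn install:install-file -Dfile=path/to/dse.jar \
    -DgroupId=com.datastax -DartifactId=dse \
    -Dversion=version-number -Dpackaging=jar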

The next step in the guide is:

SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
                .setAppName( "My application");
DseSparkContext sc = new DseSparkContext(conf);

However, the class SparkConf can't be resolved. Should it be, given these dependencies? Am I missing some additional Maven dependency, and if so, which one?

1 Answer
Summer. ? 凉城
Answered 2019-09-10 06:17

The class is org.apache.spark.SparkConf, which lives in the spark-core_<Scala version> artifact.

So your pom.xml might look like this:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax</groupId>
        <artifactId>dse</artifactId>
        <version>*version number*</version>
    </dependency>
</dependencies>

The spark-core JAR also ships with DSE itself, at dse_install/resources/spark/lib/spark_core_2.10-version.jar (tarball installs) or /usr/share/dse/spark/lib/spark_core_2.10-version.jar (package installs).
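
With those dependencies in place, org.apache.spark.SparkConf resolves. A minimal sketch of the plain-Spark part follows; the class name and app name are illustrative, and the DSE-specific helpers from the question would come from dse.jar once it is on the classpath:

import org.apache.spark.SparkConf;  // provided by spark-core_2.10

public class MyApplication {
    public static void main(String[] args) {
        // Build a plain SparkConf; in a DSE application,
        // DseSparkConfHelper.enrichSparkConf(...) from dse.jar would wrap this.
        SparkConf conf = new SparkConf()
                .setAppName("My application");
        // Print the resulting configuration to confirm the class resolves.
        System.out.println(conf.toDebugString());
    }
}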
