I am setting up a Java Spark application and am following the DataStax documentation on getting started with the Java API. I've added
<dependencies>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.1.1</version>
  </dependency>
  ...
</dependencies>
and (for a dse.jar previously installed into my local Maven repository)
<dependency>
  <groupId>com.datastax</groupId>
  <artifactId>dse</artifactId>
  <version>version number</version>
</dependency>
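(For context, installing a local JAR into the Maven repository is normally done with Maven's install-file goal, roughly like this, with the placeholder replaced by the actual version:

mvn install:install-file -Dfile=dse.jar -DgroupId=com.datastax -DartifactId=dse -Dversion=<version number> -Dpackaging=jar

)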
The next step in the guide is:
import org.apache.spark.SparkConf;
import com.datastax.bdp.spark.DseSparkConfHelper;
import com.datastax.bdp.spark.DseSparkContext;

SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
    .setAppName("My application");
DseSparkContext sc = new DseSparkContext(conf);
However, the class SparkConf can't be resolved. Should it be? Am I missing an additional Maven dependency, and if so, which one?
The class is
org.apache.spark.SparkConf
which lives in the spark-core_<Scala version> artifact. So your pom.xml might look like this:
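(A sketch, not gospel: I'm assuming the Scala 2.10 build and Spark 1.1.1 here, since that is the line the 1.1.x connector targets; match both to whatever your DSE release actually bundles.)

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.1</version>
</dependency>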
The spark-core JAR is also located in:
dse_install/resources/spark/lib/spark_core_2.10-version.jar (tarball installs), or
/usr/share/dse/spark/lib/spark_core_2.10-version.jar (package installs).
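If you prefer to compile against that bundled JAR instead of pulling spark-core from Maven Central, it can be registered in your local repository with the same install-file approach shown in the question, adjusting the file path, artifact coordinates, and version to match your DSE release.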