sbt unresolved dependency for spark-cassandra-conn

Published 2020-05-07 02:38

Question:

build.sbt:

val sparkVersion = "2.1.1";

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;

output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I am new to sbt and Spark. Thanks.

Answer 1:

This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" missing the Scala version suffix in the artifact id; see the Maven repo:

http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

There are two ways to fix this:

  1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2" — set the Scala version suffix in the artifact id explicitly.
  2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — use %% with the artifact id; sbt will then automatically append your project's Scala binary version, expanding to the same thing as solution 1.
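
Putting solution 2 into the original build.sbt, a minimal sketch might look like the following (the `scalaVersion` line is an assumption; Spark 2.1.x artifacts here are published for Scala 2.11, so the project must use a 2.11.x Scala version for `%%` to resolve the `_2.11` artifacts):

```scala
// build.sbt -- corrected version; the trailing semicolons from the
// original are unnecessary in sbt files and have been dropped.

// Assumption: the project targets Scala 2.11, matching the _2.11 artifacts.
scalaVersion := "2.11.11"

val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

// %% appends the Scala binary version, so this resolves
// spark-cassandra-connector_2.11 instead of the non-existent
// spark-cassandra-connector artifact.
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"

// This one already hard-codes _2.11 in the artifact id, so plain % is correct.
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
```
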