I'm new to Scala and I'm trying to use Spark to read from a MySQL database. Whatever I do, I run into a driver exception ("No suitable driver", shown below). I also tried connecting without Spark, using Squeryl, ScalikeJDBC, etc., and always hit the same problem. Here's one example I tried:
logger.info("Write part")
val dataframe_mysql = spark.sqlContext
.read.format("jdbc")
.option("url", s"jdbc:mysql://${datamart_server}:3306/vol")
.option("driver", "com.mysql.jdbc.Driver")
.option("dbtable", "company")
.option("user", datamart_user).option("password", datamart_pwd)
.load()
dataframe_mysql.show()
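
Even without Spark, a minimal plain-JDBC attempt fails the same way. Here's a sketch of that kind of check (the SELECT 1 query is just illustrative; the connection values are the same as above):

import java.sql.DriverManager

// Fails with the same "No suitable driver" SQLException on getConnection
val url = s"jdbc:mysql://${datamart_server}:3306/vol"
val conn = DriverManager.getConnection(url, datamart_user, datamart_pwd)
try {
  val rs = conn.createStatement().executeQuery("SELECT 1")
  while (rs.next()) println(rs.getInt(1))
} finally {
  conn.close()
}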
I also tried putting the driver class name in src/main/resources/application.conf:
db.default.driver="com.mysql.jdbc.Driver"
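
That key follows the ScalikeJDBC convention; for context, the rest of the block looks like this (the values are placeholders for my real settings):

db.default.url="jdbc:mysql://<datamart_server>:3306/vol"
db.default.user="<datamart_user>"
db.default.password="<datamart_pwd>"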
But it didn't help. I still get:
java.sql.SQLException: No suitable driver
Here's my build.sbt, to show how I add the dependencies:
name := "commercial-api-datamart-feed"
version := "0.1"
scalaVersion := "2.11.6"
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.0"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3" % Runtime
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.24" % Runtime
Spark isn't mandatory for this job, but I think it's the better option for performance.