Spark: java.sql.SQLException: No suitable driver

Published 2019-08-30 08:51

Question:

In my Spark application, I am trying to connect to a local Postgres database using the following line:

val conn = DriverManager.getConnection("jdbc:postgresql://localhost/postgres", "postgres", "*Qwerty#")

The Postgres server is running on port 5432 (the default); I have also tried including the port in the URL. I have also tried Class.forName("org.postgresql.Driver"), but it throws a ClassNotFoundException, even though I have made sure that the driver is on the classpath.

I am running Spark in local mode.

But I am getting the above exception.

I have included the JDBC driver via sbt, as listed here: https://mvnrepository.com/artifact/org.postgresql/postgresql/42.2.2
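For reference, the Maven coordinates from the link above translate into the following build.sbt line (a config sketch; the version matches the linked page):

```scala
// build.sbt: Postgres JDBC driver, coordinates taken from the Maven link above
libraryDependencies += "org.postgresql" % "postgresql" % "42.2.2"
```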

Answer 1:

The problem was that the executors could not access the driver jar.

Passing the driver jar via the spark.jars configuration property solved it.

From the Spark documentation:

Comma-separated list of jars to include on the driver and executor classpaths. Globs are allowed.
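A minimal spark-submit invocation using that property might look like the following (a config sketch; the jar path, main class, and application jar are assumptions to adapt to your project):

```shell
# Make the Postgres driver jar visible to both the driver and the executors.
# The paths and class name below are placeholders, not from the original post.
spark-submit \
  --conf spark.jars=/path/to/postgresql-42.2.2.jar \
  --class com.example.MyApp \
  target/scala-2.11/myapp.jar
```

The equivalent command-line shorthand is --jars, which populates the same setting.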



Answer 2:

You can also try passing the credentials via a java.util.Properties object. (The original snippet mixed Java and Scala declarations; here it is in consistent Scala.)

import java.sql.DriverManager
import java.util.Properties

val dbProperties = new Properties()
// The "driver" key is ignored by DriverManager itself, but it is honoured
// as the "driver" option by Spark's JDBC data source.
dbProperties.put("driver", "org.postgresql.Driver")
dbProperties.put("user", "postgres")
dbProperties.put("password", "*Qwerty#")

// getConnection(String, Properties) reads the "user" and "password" keys.
val conn = DriverManager.getConnection("jdbc:postgresql://localhost:5432/postgresDB", dbProperties)
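If the goal is to load the table into Spark rather than open a raw JDBC connection, the more idiomatic route is spark.read.jdbc. A minimal sketch, assuming the same credentials; the jar path and the table name "my_table" are placeholders, not from the original post:

```scala
import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("PostgresRead")                                 // hypothetical app name
  .master("local[*]")
  .config("spark.jars", "/path/to/postgresql-42.2.2.jar")  // assumed driver jar path
  .getOrCreate()

val dbProperties = new Properties()
dbProperties.put("user", "postgres")
dbProperties.put("password", "*Qwerty#")
dbProperties.put("driver", "org.postgresql.Driver")  // honoured by Spark's JDBC source

// "my_table" is a placeholder table name.
val df = spark.read.jdbc("jdbc:postgresql://localhost:5432/postgresDB", "my_table", dbProperties)
```

This routes the connection through Spark's JDBC data source, so the driver jar configured via spark.jars is available on the executors as well.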