How to add third-party Java JARs for use in PySpark

Posted 2019-01-22 13:42

I have some third-party database client libraries in Java. I want to access them through

java_gateway.py

E.g., to make the client class (not a JDBC driver!) available to the Python client via the Java gateway:

java_import(gateway.jvm, "org.mydatabase.MyDBClient")

It is not clear where to add the third-party libraries to the JVM classpath. I tried adding them to compute-classpath.sh, but that did not seem to work: I get

 Py4jError: Trying to call a package

Also, when comparing with Hive: the Hive JAR files are NOT loaded via compute-classpath.sh, which makes me suspicious. There seems to be some other mechanism for setting up the JVM-side classpath.

5 Answers
Answer 1 — 地球回转人心会变 · 2019-01-22 14:08

You can add --jars xxx.jar when using spark-submit:

./bin/spark-submit --jars xxx.jar your_spark_script.py

or set the environment variable SPARK_CLASSPATH:

SPARK_CLASSPATH='/path/xxx.jar:/path/xx2.jar' your_spark_script.py

your_spark_script.py is written with the PySpark API.
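
For illustration, here is a minimal sketch of what your_spark_script.py could look like, assuming the hypothetical org.mydatabase.MyDBClient class from the question is packaged inside xxx.jar (the JAR passed via --jars above):

from pyspark import SparkContext
from py4j.java_gateway import java_import

sc = SparkContext(appName="third-party-jar-example")

# Make the class visible on the Py4J gateway's JVM view so it can be
# referenced by its short name.
java_import(sc._jvm, "org.mydatabase.MyDBClient")

# Instantiate the Java class through the gateway (constructor arguments
# are made up for this sketch).
client = sc._jvm.MyDBClient()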

Answer 2 — 我命由我不由天 · 2019-01-22 14:15

One more thing you can do is add the JAR to the jars folder of the PySpark installation, usually /python3.6/site-packages/pyspark/jars.

Be careful if you are using a virtual environment: the JAR needs to go into the PySpark installation inside the virtual environment.

This way you can use the JAR without passing it on the command line or loading it in your code.
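
A quick way to find the right folder is to resolve it from the installed pyspark package itself; this is only a sketch, but it also works inside a virtual environment:

# Locate the jars directory of the active PySpark installation, so the
# third-party JAR can be copied into it (resolves correctly inside a
# virtual environment as well).
import os
import pyspark

jars_dir = os.path.join(os.path.dirname(pyspark.__file__), "jars")
print(jars_dir)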

Answer 3 — 在下西门庆 · 2019-01-22 14:22

You can add the path to the JAR file using the Spark configuration at runtime.

Here is an example:

from pyspark import SparkConf, SparkContext

conf = SparkConf().set("spark.jars", "/path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar")
sc = SparkContext(conf=conf)

Refer to the documentation for more information.
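
If you are using the SparkSession API (Spark 2.x and later), the same property can be set through the builder. A sketch, reusing the same JAR path as above (spark.jars accepts a comma-separated list of paths):

from pyspark.sql import SparkSession

# spark.jars adds the listed JARs to the driver and executor classpaths.
spark = SparkSession.builder \
    .appName("third-party-jar-example") \
    .config("spark.jars", "/path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar") \
    .getOrCreate()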

Answer 4 — 甜甜的少女心 · 2019-01-22 14:26
  1. Extract the downloaded JAR file.
  2. Edit the system environment variables:
    • Add a variable named SPARK_CLASSPATH and set its value to \path\to\the\extracted\jar\file (see the sketch after this list).

E.g.: if you have extracted the JAR file into a folder named sparkts on the C drive, its value should be: C:\sparkts

  3. Restart your cluster.
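
Alternatively, the variable can be set from Python before the SparkContext is created. This is only a sketch, with C:\sparkts taken from the example above; note that SPARK_CLASSPATH is deprecated in newer Spark releases, where the spark.jars setting shown in the other answers is preferred:

import os

# Must be set before the JVM is launched, i.e. before a SparkContext exists.
# SPARK_CLASSPATH is deprecated in recent Spark versions.
os.environ["SPARK_CLASSPATH"] = r"C:\sparkts"

from pyspark import SparkContext
sc = SparkContext(appName="third-party-jar-example")
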
Answer 5 — 甜甜的少女心 · 2019-01-22 14:28

You can add external JARs as arguments to pyspark:

pyspark --jars file1.jar,file2.jar