How to run external jar functions in spark-shell

Posted 2020-02-17 03:08

Question:

I created a jar package from a project with this file tree:

build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala

where Tester is a class with a function (named print()) and main has an object that runs it and prints "Hi!" (taken from the Spark documentation). I built the jar file with sbt successfully and it worked well with spark-submit.
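
For reference, the two source files might look roughly like this (a minimal sketch based on the description above; the actual class and method bodies in the project may differ):

// src/main/scala/Tester.scala -- class with a print() method, as described in the question
class Tester {
  def print(): Unit = println("Hi!")
}

// src/main/scala/main.scala -- object that exercises Tester when run via spark-submit
object main {
  def main(args: Array[String]): Unit = {
    new Tester().print()
  }
}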

Now I want to add it to spark-shell and use the Tester class to create objects. I added the jar file to spark-defaults.conf, but:

scala> val t = new Tester();
<console>:23: error: not found: type Tester
       val t = new Tester();

Answer 1:

You can try providing the jars with an argument, as below:

./spark-shell --jars pathOfJarsCommaSeparated
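
For example, with hypothetical paths (replace them with the jar sbt produced, typically somewhere under target/); multiple jars are separated by commas:

./spark-shell --jars /path/to/tester.jar,/path/to/other-dependency.jar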

Or you can add the following configuration to your spark-defaults.conf (remember to remove the .template suffix from spark-defaults.conf.template). Note that, unlike --jars, extraClassPath entries are joined with the platform's classpath separator (: on Linux/macOS, ; on Windows), not commas:

spark.driver.extraClassPath  pathOfJarsColonSeparated
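
A concrete sketch with hypothetical paths (on Linux/macOS, joined with a colon):

spark.driver.extraClassPath  /path/to/tester.jar:/path/to/other-dependency.jar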


Answer 2:

If you want to add a .jar to the classpath after you've entered spark-shell, use :require. Like:

scala> :require /path/to/file.jar
Added '/path/to/file.jar' to classpath.
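
After that, the class from the question should be usable in the same session (a sketch, assuming Tester is in the default package; if it lives in a package, import it first):

scala> val t = new Tester()
scala> t.print()
Hi!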