Spark: how to run a Spark script file from spark-shell

Posted 2019-01-10 06:00


I am using CDH 5.2 and am able to use spark-shell to run commands.

  1. How can I run a file (file.spark) that contains Spark commands?
  2. Is there any way to run/compile Scala programs in CDH 5.2 without sbt?

Thanks in advance

4 Answers

走好不送 · 2019-01-10 06:24

To load an external file from spark-shell, simply do

:load PATH_TO_FILE

This will evaluate everything in your file.
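For example, a minimal sketch assuming a hypothetical script /tmp/hello.scala that uses the sc already provided by spark-shell:

// /tmp/hello.scala -- hypothetical example script
val data = sc.parallelize(1 to 10)
println("sum = " + data.sum())

Then, at the prompt:

scala> :load /tmp/hello.scala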

I don't have a solution for your sbt question though, sorry :-)

smile是对你的礼貌 · 2019-01-10 06:36

Just to give more perspective to the other answers:

spark-shell is a Scala REPL.

You can type :help to see the list of operations that are possible inside the shell:

scala> :help
All commands can be abbreviated, e.g., :he instead of :help.
:edit <id>|<line>        edit history
:help [command]          print this summary or command-specific help
:history [num]           show the history (optional num is commands to show)
:h? <string>             search the history
:imports [name name ...] show import history, identifying sources of names
:implicits [-v]          show the implicits in scope
:javap <path|class>      disassemble a file or class name
:line <id>|<line>        place line(s) at the end of history
:load <path>             interpret lines in a file
:paste [-raw] [path]     enter paste mode or paste a file
:power                   enable power user mode
:quit                    exit the interpreter
:replay [options]        reset the repl and replay all previous commands
:require <path>          add a jar to the classpath
:reset [options]         reset the repl to its initial state, forgetting all session entries
:save <path>             save replayable session to a file
:sh <command line>       run a shell command (result is implicitly => List[String])
:settings <options>      update compiler options, if possible; see reset
:silent                  disable/enable automatic printing of results
:type [-v] <expr>        display the type of an expression without evaluating it
:kind [-v] <expr>        display the kind of expression's type
:warnings                show the suppressed warnings from the most recent line which had any

The command relevant here is :load, which interprets the lines in a file.
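As the listing shows, :paste also accepts a path. Unlike :load, which interprets a file line by line, :paste compiles the whole file as a single block, which helps when definitions depend on each other. A minimal sketch, assuming a hypothetical /tmp/job.scala:

// /tmp/job.scala -- hypothetical example with a multi-line definition
case class Person(name: String, age: Int)
val people = sc.parallelize(Seq(Person("alice", 30), Person("bob", 25)))
println(people.count())

Then, at the prompt:

scala> :paste /tmp/job.scala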

Viruses. · 2019-01-10 06:44

You can use either sbt or Maven to compile Spark programs. Apache Spark is published to Maven Central, so no extra repository entry is needed; just add the dependency (Spark 1.2 is built against Scala 2.10):

<dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.0</version>
</dependency>
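Once the project compiles, you can run the resulting jar without sbt via spark-submit, which ships with Spark on CDH. A minimal sketch (the class name and jar path are placeholders, not from the original question):

# build the jar with Maven's standard project layout
mvn package

# run locally; com.example.MyJob and the jar name are hypothetical
spark-submit --class com.example.MyJob --master "local[2]" target/myjob-1.0.jar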

In terms of running a file of Spark commands, you can simply do this (single quotes keep the inner double quotes intact):

echo '
   import org.apache.spark.sql._
   val sqlContext = new SQLContext(sc)
   sqlContext.sql("select * from mytable").collect()
' > spark.input

Now run the commands script:

cat spark.input | spark-shell
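Equivalently, spark-shell < spark.input redirects the file to standard input with the same result.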
干净又极端 · 2019-01-10 06:46

On the command line, you can use

spark-shell -i file.scala

to run the code written in file.scala.
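One useful detail of -i, inherited from the underlying Scala REPL and worth verifying on your Spark version: after the file runs, you are dropped into the interactive prompt with its definitions still in scope. A hypothetical session:

$ spark-shell -i file.scala
...
scala> // anything defined in file.scala is still available here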
