Failed to create SparkContext


Question:

I'm testing Spark with Scala code in spark-shell, building a prototype that uses Kafka and Spark.

I ran spark-shell as follows:

spark-shell --jars ~/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar

Then I ran the following code in the shell:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf


// Create context with 2 second batch interval
val sparkConf = new SparkConf().setAppName("DirectKafkaWordCount")
val ssc = new StreamingContext(sparkConf, Seconds(2))

Then I got an error when creating ssc; spark-shell printed the following:

scala> val ssc = new StreamingContext(sparkConf, Seconds(2) )
15/06/05 09:06:08 INFO SparkContext: Running Spark version 1.3.1
15/06/05 09:06:08 INFO SecurityManager: Changing view acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: Changing modify acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
15/06/05 09:06:08 INFO Slf4jLogger: Slf4jLogger started
15/06/05 09:06:08 INFO Remoting: Starting remoting
15/06/05 09:06:08 INFO Utils: Successfully started service 'sparkDriver' on port 51270.
15/06/05 09:06:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:51270]
15/06/05 09:06:08 INFO SparkEnv: Registering MapOutputTracker
15/06/05 09:06:08 INFO SparkEnv: Registering BlockManagerMaster
15/06/05 09:06:08 INFO DiskBlockManager: Created local directory at /tmp/spark-d3349ba2-125b-4dda-83fa-abfa6c692143/blockmgr-c0e59bba-c4df-423f-b147-ac55d9bd5ccf
15/06/05 09:06:08 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/06/05 09:06:08 INFO HttpFileServer: HTTP File server directory is /tmp/spark-842c15d5-7e3f-49c8-a4d0-95bdf5c6b049/httpd-26f5e751-8406-4a97-9ed3-aa79fc46bc6e
15/06/05 09:06:08 INFO HttpServer: Starting HTTP Server
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55697
15/06/05 09:06:08 INFO Utils: Successfully started service 'HTTP file server' on port 55697.
15/06/05 09:06:08 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
        at $line35.$read$$iwC$$iwC.<init>(<console>:52)
        at $line35.$read$$iwC.<init>(<console>:54)
        at $line35.$read.<init>(<console>:56)
        at $line35.$read$.<init>(<console>:60)
        at $line35.$read$.<clinit>(<console>)
        at $line35.$eval$.<init>(<console>:7)
        at $line35.$eval$.<clinit>(<console>)
        at $line35.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@e067ac3: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
        at $line35.$read$$iwC$$iwC.<init>(<console>:52)
        at $line35.$read$$iwC.<init>(<console>:54)
        at $line35.$read.<init>(<console>:56)
        at $line35.$read$.<init>(<console>:60)
        at $line35.$read$.<clinit>(<console>)
        at $line35.$eval$.<init>(<console>:7)
        at $line35.$eval$.<clinit>(<console>)
        at $line35.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/06/05 09:06:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/06/05 09:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4041.
15/06/05 09:06:08 INFO SparkUI: Started SparkUI at http://localhost:4041
15/06/05 09:06:08 INFO SparkContext: Added JAR file:/home/vagrant/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar at http://10.0.2.15:55697/jars/spark-streaming-kafka-assembly_2.10-1.3.1.jar with timestamp 1433495168735
15/06/05 09:06:08 INFO Executor: Starting executor ID <driver> on host localhost
15/06/05 09:06:08 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:51270/user/HeartbeatReceiver
15/06/05 09:06:08 INFO NettyBlockTransferService: Server created on 37393
15/06/05 09:06:08 INFO BlockManagerMaster: Trying to register BlockManager
15/06/05 09:06:08 INFO BlockManagerMasterActor: Registering block manager localhost:37393 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 37393)
15/06/05 09:06:08 INFO BlockManagerMaster: Registered BlockManager
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
$iwC$$iwC.<init>(<console>:9)
$iwC.<init>(<console>:18)
<init>(<console>:20)
.<init>(<console>:24)
.<clinit>(<console>)
.<init>(<console>:7)
.<clinit>(<console>)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:606)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1812)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1808)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1808)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1795)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1795)
        at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1847)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:1754)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $iwC$$iwC$$iwC.<init>(<console>:50)
        at $iwC$$iwC.<init>(<console>:52)
        at $iwC.<init>(<console>:54)
        at <init>(<console>:56)
        at .<init>(<console>:60)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I wonder why StreamingContext raises this error. Can anyone explain the problem?

I also checked port 4040.

This is the list of open ports before running spark-shell:

vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:47078           0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:111             0.0.0.0:*               LISTEN
tcp6       0      0 :::22                   :::*                    LISTEN
tcp6       0      0 :::44461                :::*                    LISTEN
tcp6       0      0 :::111                  :::*                    LISTEN
tcp6       0      0 :::80                   :::*                    LISTEN

And this is the list of open ports after running spark-shell:

vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:47078           0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:111             0.0.0.0:*               LISTEN
tcp6       0      0 :::22                   :::*                    LISTEN
tcp6       0      0 :::55233                :::*                    LISTEN
tcp6       0      0 :::4040                 :::*                    LISTEN
tcp6       0      0 10.0.2.15:41545         :::*                    LISTEN
tcp6       0      0 :::44461                :::*                    LISTEN
tcp6       0      0 :::111                  :::*                    LISTEN
tcp6       0      0 :::56784                :::*                    LISTEN
tcp6       0      0 :::80                   :::*                    LISTEN
tcp6       0      0 :::39602                :::*                    LISTEN

Answer 1:

A default SparkContext named sc is created when you start spark-shell. The constructor you are using tries to create a second SparkContext instance, which is not what you want. Instead, build the StreamingContext from the existing SparkContext using the overloaded constructor:

new StreamingContext(sparkContext: SparkContext, batchDuration: Duration) 

So your code should now look like this:

// Note: sc.getConf returns a copy of the configuration, so setters here do not
// change the running context; master, app name, and UI port must be set before
// the context is started (for example via spark-shell options)
sc.getConf.setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040")
// Use the existing 'sc' to create a StreamingContext with a 2-second batch interval
val ssc = new StreamingContext(sc, Seconds(2))
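For completeness, a minimal sketch of how such a StreamingContext might then be used; the socket source, host, and port are illustrative assumptions, not part of the original answer:

// Hypothetical usage: read a text stream from a socket and print each batch
val lines = ssc.socketTextStream("localhost", 9999)  // assumed host/port
lines.print()
ssc.start()             // begin processing
ssc.awaitTermination()  // block until the streaming job stops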


Answer 2:

You can change the Spark UI port by setting a property in the Spark config:

spark.ui.port=44040
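
For example, assuming you launch the shell from the command line, the property can be passed with the --conf flag:

spark-shell --conf spark.ui.port=44040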


Answer 3:

When you launch spark-shell, one SparkContext, sc, is already running. If you need to create a new SparkContext for streaming, you have to use a port other than 4040 for its UI, because 4040 is already taken by the first context.

So in the end I wrote the code below to create another SparkContext for the streaming process.

import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf


// Create context with 2 second batch interval
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040").set("spark.driver.allowMultipleContexts", "true")
val ssc = new StreamingContext(conf, Seconds(2))
.... 

Thanks to everybody who suggested a solution. ;-)



Answer 4:

I came here looking for this answer: I was trying to connect to Cassandra through spark-shell. Since a SparkContext sc is running by default, I was getting the error:

Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

All I had to do was:

sc.stop
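
A minimal sketch of the full pattern, assuming the default spark-shell session; the master and app name below are illustrative, not from the original answer:

// Stop the SparkContext that spark-shell created by default
sc.stop()

// A fresh context can now be created without the UI port conflict
import org.apache.spark.{SparkConf, SparkContext}
val conf = new SparkConf().setMaster("local[2]").setAppName("CassandraTest")  // assumed values
val newSc = new SparkContext(conf)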

[I know this doesn't answer the question above, but this seems to be the only question on Stack Overflow that comes up in a search, and others might find it useful.]



Answer 5:

Maybe not the same case, but I had a similar warning: "WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041." After I restarted the machine it was fine: I started spark-shell and got the scala> prompt.



Answer 6:

I faced the same issue while starting spark-shell. I resolved it with the following procedure: first I went to the spark/sbin directory, then I started the Spark daemons with this command:

 ./start-all.sh 

or you can use ./start-master.sh and ./start-slave.sh for the same effect. Now if you run spark-shell, pyspark, or any other Spark component, it will automatically create the SparkContext object sc for you. A sketch of those commands is shown below.
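
For reference, a sketch of those commands; $SPARK_HOME and the master URL are placeholders for your own installation:

cd $SPARK_HOME/sbin
./start-all.sh
# or start the daemons individually:
./start-master.sh
./start-slave.sh spark://<master-host>:7077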