I have a Spark Streaming job running on our cluster alongside other jobs (Spark core jobs). I want to use Dynamic Resource Allocation for these jobs, including the Spark Streaming one. According to the JIRA issue below, Dynamic Allocation is not supported for Spark Streaming (in version 1.6.1), but it is fixed in 2.0.0:
JIRA link
According to the PDF attached to that issue, there should be a configuration property called
spark.streaming.dynamicAllocation.enabled=true
but I don't see this configuration in the documentation.
Can anyone please confirm:
- Can I enable dynamic resource allocation for Spark Streaming in version 1.6.1, or is it not possible?
- Is it available in Spark 2.0.0? If yes, which configuration should be set: spark.streaming.dynamicAllocation.enabled=true or spark.dynamicAllocation.enabled=true?
Can I enable Dynamic Resource Allocation for Spark Streaming in
version 1.6.1?
Yes, you can enable dynamic allocation for any Spark application by setting spark.dynamicAllocation.enabled=true.
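For example, for a non-streaming (Spark core) job, a typical spark-submit could look like the sketch below. The class and JAR names are placeholders, and note that standard dynamic allocation also requires the external shuffle service to be enabled on the cluster:

```shell
# Sketch: enabling standard dynamic allocation for a Spark core job.
# com.example.MyApp and my-app.jar are placeholder names.
spark-submit \
  --class com.example.MyApp \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  my-app.jar
```

The min/max executor bounds are optional but useful for keeping the allocator from scaling past what the cluster can spare.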
However, there are a few issues with streaming applications (mentioned in the JIRA):
- Your executors may never be idle since they run something every N seconds
- You should have at least one receiver running always
- The existing heuristics don't take into account the length of the batch queue
So a new property (spark.streaming.dynamicAllocation.enabled) was added in Spark 2.0 for streaming apps alone.
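On Spark 2.0, a streaming job would use the streaming-specific property instead. A sketch (placeholder names again; the min/max executor properties shown come from the streaming ExecutorAllocationManager source rather than the official docs, so treat them as an assumption):

```shell
# Sketch: streaming-specific dynamic allocation on Spark 2.0.
# com.example.MyStreamingApp and my-streaming-app.jar are placeholders.
spark-submit \
  --class com.example.MyStreamingApp \
  --conf spark.streaming.dynamicAllocation.enabled=true \
  --conf spark.streaming.dynamicAllocation.minExecutors=2 \
  --conf spark.streaming.dynamicAllocation.maxExecutors=10 \
  my-streaming-app.jar
```

As far as I can tell from the source, the streaming allocator expects the core spark.dynamicAllocation.enabled to be left off; the two mechanisms are not meant to run together.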
Is it available in Spark 2.0.0? If yes, what configuration should be
set: spark.streaming.dynamicAllocation.enabled or
spark.dynamicAllocation.enabled?
It must be spark.streaming.dynamicAllocation.enabled if the application is a streaming one; otherwise go ahead with spark.dynamicAllocation.enabled.
Edit (as per the comment on 2017-JAN-05):
This is not documented as of today, but the property and its implementation can be found in the Spark source code: ExecutorAllocationManager.scala
(unit tests in ExecutorAllocationManagerSuite.scala).
This class was added in Spark 2.0; the implementation is not present in Spark 1.6 and below.