I know that Spark applications can be executed on YARN using spark-submit --master yarn.
The question is: is it possible to run a Spark application on YARN using the yarn command?
If so, the YARN REST API could be used as an interface for running Spark and MapReduce applications in a uniform way.
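For reference, a minimal submission of the kind I mean looks like this (the class name and jar path are placeholders):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyApp \
      /path/to/my-app.jar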
Thanks for the question. As suggested above, writing your own ApplicationMaster is a good route for submitting an application without invoking spark-submit. That said, the community has built around the spark-submit command for YARN, adding flags that make it easy to pass in the extra jars, configs, etc. that the application needs to execute successfully (see Submitting Applications in the Spark documentation). An alternative you could try: run the Spark job as an action in an Oozie workflow (see the Oozie Spark Extension documentation). Depending on what you wish to achieve, either route looks good. Hope it helps.
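For example, a submission that ships extra jars and tunes a config might look like this (the jar paths, class name, and memory setting are illustrative placeholders, not prescriptive values):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyApp \
      --jars /path/to/dependency-one.jar,/path/to/dependency-two.jar \
      --conf spark.executor.memory=2g \
      /path/to/my-app.jar

Here --jars distributes additional jars to the cluster and --conf sets arbitrary Spark configuration properties at submit time.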
Just like all YARN applications, Spark implements a Client and an ApplicationMaster when deploying on YARN. If you look at the implementation in the Spark repository, you'll get an idea of how to create your own Client/ApplicationMaster: https://github.com/apache/spark/tree/master/yarn/src/main/scala/org/apache/spark/deploy/yarn . But out of the box it does not seem possible.
I have not seen the latest package, but a few months back such a thing was not possible "out of the box" (this is info straight from Cloudera support). I know it's not what you were hoping for, but that's what I know.
I see this question is a year old, but for anyone else who stumbles across it, it looks like this should be possible now. I've been trying to do something similar and have been following the "Starting Spark jobs directly via YARN REST API" tutorial from Hortonworks.
Essentially, what you need to do is upload your jar to HDFS, create a Spark job JSON file per the YARN REST API documentation, and then use a curl command to start the application. An example of that kind of command (a sketch using placeholder hostnames and file names, not the tutorial's exact invocation) is:
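    # Step 1: ask the ResourceManager for a new application ID
    # (resourcemanager-host is a placeholder; 8088 is the RM's default web port)
    curl -s -X POST http://resourcemanager-host:8088/ws/v1/cluster/apps/new-application

    # Step 2: submit the application, pointing at the JSON spec assembled
    # per the YARN REST API documentation (spark-yarn.json is a placeholder name)
    curl -s -X POST -H "Content-Type: application/json" \
         --data @spark-yarn.json \
         http://resourcemanager-host:8088/ws/v1/cluster/apps

The first call returns the application-id that the JSON spec must reference; on success, the second call responds with 202 Accepted.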