There are two settings that control the number of retries (i.e. the maximum number of ApplicationMaster registration attempts with YARN before the entire Spark application is considered failed):
spark.yarn.maxAppAttempts - Spark's own setting. See MAX_APP_ATTEMPTS:
private[spark] val MAX_APP_ATTEMPTS = ConfigBuilder("spark.yarn.maxAppAttempts")
  .doc("Maximum number of AM attempts before failing the app.")
  .intConf
  .createOptional
yarn.resourcemanager.am.max-attempts - YARN's own setting, with a default of 2.
As you can see in YarnRMClient.getMaxRegAttempts, the actual number is the minimum of the YARN and Spark settings, with YARN's acting as the last resort (used when Spark's setting is not defined).
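In other words, the resolution boils down to something like the following sketch (a simplification for illustration, not the exact body of YarnRMClient.getMaxRegAttempts):

import org.apache.spark.SparkConf
import org.apache.hadoop.yarn.conf.YarnConfiguration

// Sketch: resolve the effective number of AM attempts from both settings.
def effectiveMaxAttempts(sparkConf: SparkConf, yarnConf: YarnConfiguration): Int = {
  // Spark's optional setting: spark.yarn.maxAppAttempts
  val sparkMax: Option[Int] =
    sparkConf.getOption("spark.yarn.maxAppAttempts").map(_.toInt)
  // YARN's setting: yarn.resourcemanager.am.max-attempts (defaults to 2)
  val yarnMax: Int = yarnConf.getInt(
    YarnConfiguration.RM_AM_MAX_ATTEMPTS,
    YarnConfiguration.DEFAULT_RM_AM_MAX_ATTEMPTS)
  // YARN's value is the upper bound ("last resort"); Spark can only lower it.
  sparkMax.map(_.min(yarnMax)).getOrElse(yarnMax)
}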
An API/programming language-agnostic solution would be to set the yarn max attempts as a command line argument:
See @code's answer.
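For example, something like the following (an illustration only; your-app.jar is a placeholder, and the value can be any limit you want):

spark-submit --conf spark.yarn.maxAppAttempts=1 --master yarn your-app.jar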
Add the property yarn.resourcemanager.am.max-attempts to your yarn-default.xml file. It specifies the maximum number of application attempts. For more details, look into this link.
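For reference, a property entry in the Hadoop configuration XML would look roughly like this (a sketch; 2 is just YARN's default, so set the value to whatever limit you need):

<property>
  <name>yarn.resourcemanager.am.max-attempts</name>
  <value>2</value>
  <description>The maximum number of application attempts.</description>
</property>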