Spark Kill Running Application

Posted 2019-01-30 02:23

Question:

I have a running Spark application that is occupying all the cores, so my other applications won't be allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH distribution where /bin/spark-class doesn't exist at all, and killing the YARN application doesn't work either.

Can anyone help me with this?

Answer 1:

  • Copy the application ID from the Spark scheduler, for instance application_1428487296152_25597.
  • Connect to the server that launched the job.
  • Run yarn application -kill application_1428487296152_25597 (see the sketch after this list).
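
A minimal shell sketch of these steps, assuming the yarn command-line client is available on the machine you connect to; the application ID shown is the example one from above and must be replaced with your own:

# List running YARN applications to find the offending Spark job
yarn application -list -appStates RUNNING

# Kill it by its application ID
yarn application -kill application_1428487296152_25597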


Answer 2:

https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Application_State_API

PUT http://{rm http address:port}/ws/v1/cluster/apps/{appid}/state

{
  "state":"KILLED"
}
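
For example, this call could be made with curl (a sketch only; the ResourceManager host/port, the user name, and the application ID are placeholders to replace, and a Kerberos-secured cluster would additionally need SPNEGO authentication, e.g. curl --negotiate -u :):

# Ask the ResourceManager to kill the application via its REST API
curl -X PUT \
  -H "Content-Type: application/json" \
  -d '{"state":"KILLED"}' \
  "http://<rm-host>:8088/ws/v1/cluster/apps/application_1428487296152_25597/state?user.name=<your-user>"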