After running my Spark application, I want to monitor its memory and CPU usage to evaluate its performance, but I couldn't find any option for this. Is it possible? How can I monitor the memory and CPU usage of a Spark application?
Answer 1:
There are a few options:
- Ganglia is one.
- If you're running on your own cluster, HDP or Cloudera both provide real-time CPU and memory consumption charts.
- If you want specific JVM metrics, I'd recommend FlameGraph, though it's not real time.
- There's also Grafana. It's extremely powerful, you can track many metrics with it, and it's real time.
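To feed tools like Ganglia or Grafana, you can enable Spark's built-in metrics system by editing `conf/metrics.properties` on the cluster. A minimal sketch using the Graphite sink (Grafana can then read from Graphite) might look like the following; the host and port are placeholders you'd replace with your own endpoint:

```properties
# conf/metrics.properties -- sketch; graphite.example.com:2003 is a placeholder
# Send all metrics to a Graphite/Carbon endpoint every 10 seconds
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds

# Also expose JVM metrics (heap usage, GC, etc.) from each component
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

With this in place, driver and executor memory/GC metrics show up under per-application prefixes in Graphite, and you can build real-time dashboards on top of them in Grafana.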