Value for HADOOP_CONF_DIR from Cluster

Posted 2019-02-17 14:33

I have set up a YARN cluster using Ambari, with 3 VMs as hosts.

Where can I find the value for HADOOP_CONF_DIR?

# Run on a YARN cluster; --deploy-mode can also be `client` for client mode
export HADOOP_CONF_DIR=XXX
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 20G \
  --num-executors 50 \
  /path/to/examples.jar \
  1000
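
As a starting point, if a Hadoop client is already installed on one of the hosts, the config directory it actually uses can be read off the classpath it prints; a minimal sketch, assuming the `hadoop` command is on the PATH:

# The client configuration directory is normally the first classpath entry
hadoop classpath | tr ':' '\n' | head -n 3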

2 Answers
Viruses.
#2 · 2019-02-17 14:51

Install Hadoop as well. In my case I've installed it in /usr/local/hadoop.

Set up the Hadoop environment variables:

export HADOOP_INSTALL=/usr/local/hadoop

Then set the conf directory:

export HADOOP_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
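
As a quick sanity check, that directory should contain the client-side config files Spark reads to find YARN and HDFS; a minimal sketch, assuming the standard Hadoop layout:

# These files must exist for spark-submit to reach the ResourceManager and NameNode
ls "$HADOOP_CONF_DIR"/core-site.xml "$HADOOP_CONF_DIR"/hdfs-site.xml "$HADOOP_CONF_DIR"/yarn-site.xml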
[this account has been banned]
#3 · 2019-02-17 15:12

From /etc/spark/conf/spark-env.sh:

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/etc/hadoop/conf}
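
On an Ambari-managed cluster, the client configs are normally pushed to /etc/hadoop/conf on every host, so this default usually points at the right place already. A quick check, assuming that conventional Ambari layout:

# Confirm the configs are present and name this cluster's ResourceManager
ls /etc/hadoop/conf
grep -A1 yarn.resourcemanager.address /etc/hadoop/conf/yarn-site.xml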