What is the right way to edit spark-env.sh before starting Spark?

Posted 2019-03-21 23:58

I am running Spark on my local Windows machine and am able to start the Spark shell successfully.

I want to edit the spark-env.sh file residing in the conf/ folder. What is the right way to add values to the spark-env.sh file?

E.g. if I want to set the SPARK_EXECUTOR_MEMORY variable, how do I do it? I'm getting confused between the different answers that are available: 1. SPARK_EXECUTOR_MEMORY="2G" 2. export SPARK_EXECUTOR_MEMORY=2G

2 answers
走好不送
Reply #2 · 2019-03-22 00:37

You have to use export to set any configuration in a *.sh file. So in the spark-env.sh file, use the following, for example:

export SPARK_MASTER_IP=192.165.5.1
export SPARK_EXECUTOR_MEMORY=2g
# or: export SPARK_EXECUTOR_MEMORY=2G

There is no need to put double quotes around the values.
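The advice above can be sketched as a minimal spark-env.sh. The addresses and sizes below are just example values, not settings for any particular cluster:

```shell
# Sketch of a minimal conf/spark-env.sh (example values only).
# Each setting must be exported so that child processes (the Spark
# daemons launched by the start scripts) can see it.
export SPARK_MASTER_IP=192.165.5.1    # address the master binds to
export SPARK_EXECUTOR_MEMORY=2g      # per-executor memory; 2g and 2G are equivalent

# Quick sanity check that the variables are set in the environment:
printenv SPARK_EXECUTOR_MEMORY   # prints 2g
```

Because the file is sourced as a regular bash script, you can verify it by running `bash conf/spark-env.sh` (or sourcing it in a shell) before starting Spark.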

Deceive 欺骗
Reply #3 · 2019-03-22 00:39

spark-env.sh is a regular bash script intended for Unix, so on a Windows installation it will never get picked up.

On Windows, you'll need a spark-env.cmd file in the conf directory and must use the following syntax instead:

set SPARK_EXECUTOR_MEMORY=2G

On Unix, the file is called spark-env.sh and you need to prepend each of your properties with export (e.g. export SPARK_EXECUTOR_MEMORY=2G).
