Working With Hadoop: localhost: Error: JAVA_HOME is not set

Posted 2019-01-21 13:50

I'm working with Ubuntu 12.04 LTS.

I'm going through the hadoop quickstart manual to make a pseudo-distributed operation. It seems simple and straightforward (easy!).

However, when I try to run start-all.sh I get:

localhost: Error: JAVA_HOME is not set.

I've read all the other advice on stackoverflow for this issue and have done the following to ensure JAVA_HOME is set:

In /etc/hadoop/conf/hadoop-env.sh I have set

JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME

In /etc/bash.bashrc I have set

JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
PATH=$PATH:$JAVA_HOME/bin
export PATH

which java returns:

/usr/bin/java

java -version works

echo $JAVA_HOME returns:

/usr/lib/jvm/java-6-oracle

I've even tried becoming root and explicitly typing the following in the terminal:

$ JAVA_HOME=/usr/lib/jvm/java-6-oracle
$ export JAVA_HOME
$ start-all.sh

If you could show me how to resolve this error it would be greatly appreciated. I'm thinking that my JAVA_HOME is being overridden somehow. If that is the case, could you explain to me how to make my exports global?

11 Answers
来,给爷笑一个 · 2019-01-21 14:25

This error comes from line 180 of libexec/hadoop-config.sh:

if [[ -z $JAVA_HOME ]]; then
   echo "Error: JAVA_HOME is not set and could not be found." 1>&2
   exit 1
fi

Try echo $JAVA_HOME in that script. If it comes back empty, find your JAVA_HOME with:

readlink -f /usr/bin/javac | sed "s:/bin/javac::"

and replace the line export JAVA_HOME=${JAVA_HOME} in /etc/hadoop/hadoop-env.sh with the JAVA_HOME you got from the command above.
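For example, a minimal sketch (assuming hadoop-env.sh lives at /etc/hadoop/hadoop-env.sh, as in the question) that detects the JDK directory and writes an explicit export into the file:

# resolve the real JDK directory behind /usr/bin/javac
JDK_DIR=$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")
# append an explicit export; alternatively, edit the existing JAVA_HOME line by hand
echo "export JAVA_HOME=$JDK_DIR" | sudo tee -a /etc/hadoop/hadoop-env.sh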

ゆ 、 Hurt° · 2019-01-21 14:29

I also faced a similar problem with Hadoop 1.1. I hadn't noticed that JAVA_HOME was commented out in hadoop/conf/hadoop-env.sh.

It was

#JAVA_HOME=/usr/lib/jvm/java-6-oracle

I had to change it to

JAVA_HOME=/usr/lib/jvm/java-6-oracle
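If you prefer to do that edit from the shell, here is a sketch (it assumes the conf file path and the Oracle Java 6 location used above; adjust both to your setup):

# uncomment the JAVA_HOME line and point it at the JDK
sudo sed -i 's|^# *JAVA_HOME=.*|JAVA_HOME=/usr/lib/jvm/java-6-oracle|' hadoop/conf/hadoop-env.sh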
姐就是有狂的资本 · 2019-01-21 14:32

I am using Hadoop 1.1 and faced the same problem.

I solved it by changing the JAVA_HOME variable in /etc/hadoop/hadoop-env.sh:

export JAVA_HOME=/usr/lib/jvm/<jdk folder>
来,给爷笑一个 · 2019-01-21 14:33

The way to debug this is to put an "echo $JAVA_HOME" in start-all.sh. Are you running your hadoop environment under a different username, or as yourself? If the former, it's very likely that the JAVA_HOME environment variable is not set for that user.

The other potential problem is that you have specified JAVA_HOME incorrectly, and the value that you have provided doesn't point to a JDK/JRE. Note that "which java" and "java -version" will both work, even if JAVA_HOME is set incorrectly.
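For example, a quick check along those lines (a sketch; it assumes JAVA_HOME is exported in your current shell):

# the java found on PATH (this is what "which java" and "java -version" report)
readlink -f "$(which java)"
# the java that JAVA_HOME points to; this is the one Hadoop's scripts actually use
"$JAVA_HOME/bin/java" -version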

神经病院院长 · 2019-01-21 14:35

The way to solve this problem is to export the JAVA_HOME variable inside the conf/hadoop-env.sh file.

It doesn't matter if you already exported that variable in ~/.bashrc, it'll still show the error.

So edit conf/hadoop-env.sh and uncomment the line "export JAVA_HOME" and add a proper filesystem path to it, i.e. the path to your Java JDK.

# The Java implementation to use. Required.
export JAVA_HOME="/path/to/java/JDK/"
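As a quick sanity check (a sketch, assuming you run it from the Hadoop installation directory that contains conf/), source the file in a throwaway shell and print the value it exports:

bash -c 'source conf/hadoop-env.sh && echo "JAVA_HOME is set to: $JAVA_HOME"'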

三岁会撩人 · 2019-01-21 14:37

Regardless of Debian or any other Linux flavor, keep in mind that ~/.bash_profile belongs to a specific user and is not system wide. In a pseudo-distributed environment Hadoop runs on localhost, so a $JAVA_HOME set only in .bash_profile is of no use there.

Just export JAVA_HOME in ~/.bashrc instead and use it system wide.
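A minimal sketch of that, reusing the Oracle Java 6 path from the question (adjust the path to your own JDK):

# append the exports to ~/.bashrc and reload it in the current shell
echo 'export JAVA_HOME=/usr/lib/jvm/java-6-oracle' >> ~/.bashrc
echo 'export PATH=$PATH:$JAVA_HOME/bin' >> ~/.bashrc
source ~/.bashrc
echo $JAVA_HOME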
