Hadoop 2.2.0 64-bit installs but cannot start

Published 2019-01-16 11:36

I am trying to install a Hadoop 2.2.0 cluster on our servers. All of the servers are 64-bit. I have downloaded Hadoop 2.2.0 and set up all the configuration files. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Besides the 64-bit issue, are there any other errors? I have already set up passwordless SSH login between the namenode and the datanodes; what do the other errors mean?

Tags: hadoop
8 Answers
贪生不怕死
Answer 2 · 2019-01-16 12:07

The issue is not with the native library; it is just a warning. Export the Hadoop variables mentioned above and it will work.

Juvenile、少年°
Answer 3 · 2019-01-16 12:09

Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

In addition, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
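
A quick way to verify both pieces (a sketch; "namenode" is the hostname taken from the question, so substitute your own):

source ~/.bashrc                 # pick up the new HADOOP_* variables
chmod 600 ~/.ssh/authorized_keys # sshd ignores the key file if its permissions are too loose
ssh namenode hostname            # should print the remote hostname without a password prompt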
虎瘦雄心在
Answer 4 · 2019-01-16 12:14

I think that the only problem here is the same as in this question, so the solution is also the same:

Stop the JVM from printing the stack-guard warning to stdout/stderr, because that is what breaks the HDFS start script.

Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


(This solution was found on Sumit Chawla's blog.)
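
If you prefer to apply the change from the shell, a minimal sketch, assuming the default line shown above appears verbatim in your file:

cd /home/hchen/hadoop-2.2.0        # install path taken from the question; use your own
cp etc/hadoop/hadoop-env.sh etc/hadoop/hadoop-env.sh.bak   # keep a backup
sed -i 's/-Djava\.net\.preferIPv4Stack=true/-XX:-PrintWarnings -Djava.net.preferIPv4Stack=true/' etc/hadoop/hadoop-env.sh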

别忘想泡老子
Answer 5 · 2019-01-16 12:14

I had a similar problem and could not solve it even after following all of the above suggestions.

I finally understood that the configured hostname had no IP address assigned to it.

My hostname, vagrant, was configured in /etc/hostname, but it had no IP address assigned in /etc/hosts; /etc/hosts contained an entry for localhost only.

Once I added /etc/hosts entries for both localhost and vagrant, all of the above problems were resolved.
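
For example, /etc/hosts should end up with entries like these (192.168.33.10 is a placeholder; use the address actually assigned to your machine, e.g. from ip addr or ifconfig):

127.0.0.1       localhost
192.168.33.10   vagrant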

SAY GOODBYE
Answer 6 · 2019-01-16 12:17

The root cause is that the native library shipped with Hadoop is built for 32-bit. There are two solutions:

1) Set up some environment variables in .bash_profile; refer to https://gist.github.com/ruo91/7154697. Or

2) Rebuild the Hadoop native library; refer to http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
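
To confirm the 32-bit/64-bit mismatch before rebuilding, and to run the rebuild itself, a sketch (the Maven build needs the prerequisites listed on the page above, such as protobuf and cmake):

file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0   # "ELF 32-bit ..." on a 64-bit OS confirms the mismatch

# from inside the hadoop-2.2.0-src source tree:
mvn package -Pdist,native -DskipTests -Dtar
# then copy the rebuilt lib/native contents over $HADOOP_HOME/lib/native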

可以哭但决不认输i
Answer 7 · 2019-01-16 12:26

You can also export the variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

(/usr/local/hadoop is my Hadoop installation folder.)

#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
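
Note that hadoop-env.sh is sourced by the daemon scripts themselves, so there is nothing to source manually; just restart HDFS. If you are unsure of the JDK path to use for JAVA_HOME, one way to recover it (a sketch for Debian/Ubuntu-style layouts):

readlink -f "$(which java)"          # e.g. .../java-1.7.0-openjdk-amd64/jre/bin/java; strip /jre/bin/java to get JAVA_HOME
/usr/local/hadoop/sbin/stop-dfs.sh   # restart HDFS to pick up the new settings
/usr/local/hadoop/sbin/start-dfs.sh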