Hadoop “Unable to load native-hadoop library for your platform” warning

Posted 2020-01-22 11:56

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I'm running Hadoop 2.2.0.

Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html

However, the contents of the /native/ directory in Hadoop 2.x appear to be different, so I am not sure what to do.

I've also added these two environment variables to hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
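To get more detail on why the library fails to load, debug logging can be turned on before running a Hadoop command (a diagnostic sketch; HADOOP_ROOT_LOGGER is Hadoop's standard shell logging variable):

export HADOOP_ROOT_LOGGER=DEBUG,console
hadoop fs -ls /

The debug output shows the java.library.path entries being searched and the exact load failure. On Hadoop 2.2.0 a common culprit is an architecture mismatch (the Apache 2.2.0 tarball shipped 32-bit native libraries), which on a 64-bit CentOS you can check with:

file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0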

Any ideas?

20 answers
小情绪 Triste *
#2 · 2020-01-22 12:10

For installing Hadoop, it is much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes; there is no compiling or messing around with dependencies, and it comes with tools like Hive, Pig, etc.

http://www.cloudera.com/content/support/en/downloads.html

The steps are:

1) Download it
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on the other nodes yourself; it does it all for you)
5) Within the web GUI, go to Home and click Hue, then Hue Web UI. This gives you access to Hive, Pig, Sqoop, etc.

萌系小妹纸
#3 · 2020-01-22 12:11

I had the same issue. It was solved by adding the following lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
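After adding those lines, reload the file so the variables take effect in the current shell, then rerun the script:

source ~/.bashrc
start-dfs.sh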
Explosion°爆炸
#4 · 2020-01-22 12:13

This would also work:

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
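Note that an export typed at the prompt only lasts for that shell session. To make it persistent, append it to ~/.bashrc (the /usr/lib/hadoop path assumes a package-style install; adjust it to wherever your native libraries actually live):

echo 'export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native:$LD_LIBRARY_PATH' >> ~/.bashrc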
劳资没心,怎么记你
#5 · 2020-01-22 12:14

I'm not using CentOS. Here is what I have on Ubuntu 16.04.2 with hadoop-2.7.3 and jdk1.8.0_121, and start-dfs.sh and stop-dfs.sh run successfully without the warning:

# JAVA env
#
export JAVA_HOME=/j01/sys/jdk
export JRE_HOME=/j01/sys/jdk/jre

export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.

# HADOOP env
#
export HADOOP_HOME=/j01/srv/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

Replace /j01/sys/jdk and /j01/srv/hadoop with your installation paths.

I also did the following one-time setup on Ubuntu, which eliminates the need to enter a password multiple times when running start-dfs.sh:

sudo apt install openssh-server openssh-client
ssh-keygen -t rsa
ssh-copy-id user@localhost

Replace user with your username.
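To verify the key-based login, this should now connect without a password prompt:

ssh user@localhost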

Ridiculous、
#6 · 2020-01-22 12:15

Move your compiled native library files to the $HADOOP_HOME/lib folder.
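For example (a sketch, assuming the compiled .so files currently sit in the default $HADOOP_HOME/lib/native directory):

cp $HADOOP_HOME/lib/native/* $HADOOP_HOME/lib/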

Then set your environment variables by editing the .bashrc file:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"

Make sure your compiled native library files are in the $HADOOP_HOME/lib folder, and it should work.

(banned account)
#7 · 2020-01-22 12:17

After continued research, as suggested by Koti, I resolved the issue:

hduser@ubuntu:~$ cd /usr/local/hadoop

hduser@ubuntu:/usr/local/hadoop$ ls

bin  include  libexec      logs        README.txt  share
etc  lib      LICENSE.txt  NOTICE.txt  sbin

hduser@ubuntu:/usr/local/hadoop$ cd lib

hduser@ubuntu:/usr/local/hadoop/lib$ ls
native

hduser@ubuntu:/usr/local/hadoop/lib$ cd native/

hduser@ubuntu:/usr/local/hadoop/lib/native$ ls

libhadoop.a       libhadoop.so        libhadooputils.a  libhdfs.so
libhadooppipes.a  libhadoop.so.1.0.0  libhdfs.a         libhdfs.so.0.0.0

hduser@ubuntu:/usr/local/hadoop/lib/native$ sudo mv * ../
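You can confirm that the libraries now sit directly under /usr/local/hadoop/lib by listing the parent directory:

hduser@ubuntu:/usr/local/hadoop/lib/native$ ls ../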

Cheers
