I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I'm running Hadoop 2.2.0.
Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html
However, the contents of the /native/ directory on Hadoop 2.x appear to be different, so I am not sure what to do.
I've also added these two environment variables in hadoop-env.sh:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
Any ideas?
For installing Hadoop, it is much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes; there is no compiling or messing around with dependencies, and it comes with tools like Hive, Pig, etc.
http://www.cloudera.com/content/support/en/downloads.html
Steps are:
1) Download
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on the other nodes; it does it all for you)
5) Within the web GUI, go to Home, click Hue and then Hue Web UI. This gives you access to Hive, Pig, Sqoop, etc.
I had the same issue. It's solved by adding lines like the following in .bashrc. This is a sketch assuming Hadoop is installed under /usr/local/hadoop; adjust the path to your installation:
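export HADOOP_HOME=/usr/local/hadoop
# Point Hadoop at the directory that contains libhadoop.so
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
# Put that same directory on the JVM's java.library.path
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"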
This also would work, assuming the native libraries live in /usr/local/hadoop/lib/native:
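# Let the dynamic linker find libhadoop.so directly
export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native:$LD_LIBRARY_PATH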
I'm not using CentOS. Here is what I have in Ubuntu 16.04.2, with hadoop-2.7.3 and jdk1.8.0_121; start-dfs.sh and stop-dfs.sh run successfully without the error. The .bashrc entries below are a sketch along these lines (the /j01/... paths are just my layout):
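# Java environment
export JAVA_HOME=/j01/sys/jdk
export JRE_HOME=$JAVA_HOME/jre
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

# Hadoop environment
export HADOOP_HOME=/j01/srv/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin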
Replace /j01/sys/jdk and /j01/srv/hadoop with your installation paths.
I also did the following one-time setup on Ubuntu, which eliminates the need to enter a password multiple times when running start-dfs.sh. This is a sketch of setting up passwordless SSH to localhost:
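sudo apt install openssh-server openssh-client
ssh-keygen -t rsa                                # accept the defaults; empty passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # authorize your own key
chmod 600 ~/.ssh/authorized_keys
ssh user@localhost                               # first login; accept the host key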
Replace user with your username.
Move your compiled native library files to the $HADOOP_HOME/lib folder. Then set your environment variables by editing the .bashrc file; a sketch of the relevant exports, assuming HADOOP_HOME already points to your installation:
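# The native libraries were moved to $HADOOP_HOME/lib, so point both settings there
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"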
Make sure your compiled native library files are in the $HADOOP_HOME/lib folder. It should work.
After continued research, as suggested by Koti, I got the issue resolved.
Cheers