I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I'm running Hadoop 2.2.0.
Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html
However, the contents of the /native/ directory on Hadoop 2.x appear to be different, so I am not sure what to do.
I've also added these two environment variables in hadoop-env.sh:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
Any ideas?
For those on OS X with Hadoop installed via Homebrew, follow these steps, replacing the path and Hadoop version where appropriate, then update hadoop-env.sh with:
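As a sketch of what that hadoop-env.sh addition might look like (the Cellar path and version number below are assumptions; substitute your own install's path):

```shell
# Point the JVM at Homebrew's native library directory
# (path and version are assumptions -- adjust to your install)
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/Cellar/hadoop/2.6.0/libexec/lib/native"
```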
@zhutoulala -- FWIW, your links worked for me with Hadoop 2.4.0, with one exception: I had to tell Maven not to build the javadocs. I also used the patch in the first link for 2.4.0 and it worked fine. Here's the Maven command I had to issue:
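A command along these lines builds the native profile while skipping the javadocs (this is my reconstruction of the standard native build invocation, not necessarily the poster's verbatim command):

```shell
# Build Hadoop with the native profile, skipping tests and javadocs
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
```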
After building this and moving the libraries, don't forget to update hadoop-env.sh :)
Thought this might help someone who ran into the same roadblocks as me.
I assume you're running Hadoop on 64-bit CentOS. The reason you saw that warning is that the native Hadoop library
$HADOOP_HOME/lib/native/libhadoop.so.1.0.0
was actually compiled for 32-bit. Anyway, it's just a warning and won't impact Hadoop's functionality.
Here is the way to go if you do want to eliminate this warning: download the Hadoop source code and recompile
libhadoop.so.1.0.0
on a 64-bit system, then replace the 32-bit one. Steps on how to recompile the source code are included here for Ubuntu:
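A rough sketch of those Ubuntu build steps (the package names and paths here are assumptions; check BUILDING.txt in the Hadoop source tree for the authoritative prerequisites):

```shell
# Install assumed build prerequisites
sudo apt-get install build-essential maven cmake zlib1g-dev libssl-dev protobuf-compiler

# Unpack the Hadoop source and build with the native profile
tar xzf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar

# Replace the 32-bit libraries with the freshly built 64-bit ones
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/
```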
Good luck.
This line right here, from KunBetter's answer, is where the money is:
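The line itself isn't reproduced above; as an illustration only (this is my assumption of its shape, not a verbatim quote), it points the JVM at the native library directory, something like:

```shell
# Make the JVM pick up $HADOOP_HOME/lib/native (illustrative, not a quote)
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
```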
The answer depends... I just installed Hadoop 2.6 from a tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:
And I know it is 64-bit:
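The check itself is done with `file`, which reports the ELF class of the shared object. Shown here against /bin/sh just so the example runs anywhere; substitute your own libhadoop path:

```shell
# On a 64-bit build, 'file' reports "ELF 64-bit LSB shared object, x86-64 ..."
# Substitute your own library, e.g.:
#   file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0
file -L /bin/sh
```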
Unfortunately, I stupidly overlooked the answer right there staring me in the face as I was focused on, "Is this library 32 or 64 bit?":
So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. So I continued and did everything recommended in the other answers to provide the library path using the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):
So, off to here to see what it does:
http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/
Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:
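The logger name follows from the class shown in the warning (org.apache.hadoop.util.NativeCodeLoader), so the line is along these lines:

```properties
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
```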
Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:
And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):
What version of GLIBC do I have? Here's a simple trick to find out:
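A sketch of that trick: ldd ships with glibc, so its version banner matches the installed GLIBC version:

```shell
# Print the installed GLIBC version (ldd's banner reports it)
ldd --version | head -n 1
```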
So, I can't update my OS's GLIBC to 2.14. The only solution is to build the native libraries from source on my OS, or suppress the warning and just ignore it for now. I opted to just suppress the annoying warning for now (but I do plan to build from source in the future) by using the same logging option we used to get the debug message, except now set to ERROR level.
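The suppression line mirrors the debug one, with the level raised to ERROR so the WARN message is filtered out:

```properties
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```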
I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.
I had the same problem with JDK 6. I changed the JDK to JDK 8 and the problem was solved. Try JDK 8!