I downloaded the Hadoop source code from GitHub and compiled it with the native option:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
I then copied the .dylib files to $HADOOP_HOME/lib:
cp -p hadoop-common-project/hadoop-common/target/hadoop-common-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
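As a quick sanity check of the copy step (not from the original question), a small helper can report whether any .dylib files are actually present in the target directory. The path below is the one used in the question; adjust it for your install:

```shell
# Hedged sanity check: does the target lib directory contain any .dylib files?
check_native_libs() {
  dir="$1"
  count=$(ls "$dir"/*.dylib 2>/dev/null | wc -l)
  if [ "$count" -gt 0 ]; then
    echo "found $count native libs in $dir"
  else
    echo "no native libs in $dir"
    return 1
  fi
}

# Path from the question; adjust to your install.
check_native_libs /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib \
  || echo "the cp above may have targeted the wrong directory"
```

If the helper reports no libs, the cp destination and the directory Hadoop actually scans probably disagree.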
LD_LIBRARY_PATH was updated and HDFS was restarted:
echo $LD_LIBRARY_PATH
/usr/local/Cellar/hadoop/2.7.2/libexec/lib:
/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/lib:/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home//jre/lib
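Worth noting (my assumption, not stated in the question): on OS X the JVM does not consult LD_LIBRARY_PATH when loading native libraries; Hadoop's launcher scripts build java.library.path from JAVA_LIBRARY_PATH instead. A hedged sketch, reusing the Homebrew paths from above:

```shell
# Assumption: Homebrew Hadoop 2.7.2 layout from the question.
# Point the JVM at the directory that actually holds the .dylib files.
export JAVA_LIBRARY_PATH=/usr/local/Cellar/hadoop/2.7.2/libexec/lib/native
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
# then re-run:
#   hadoop checknative
```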
(Note: this also means that the answer to Hadoop “Unable to load native-hadoop library for your platform” error on docker-spark? does not work for me..)
But checknative still returns uniformly false:
$stop-dfs.sh && start-dfs.sh && hadoop checknative
16/06/13 16:12:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkbook]
sparkbook: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkbook]
sparkbook: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-namenode-sparkbook.out
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-datanode-sparkbook.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-secondarynamenode-sparkbook.out
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
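One diagnostic step not taken in the question (an assumption on my part): verify that the copied library is actually a Mach-O dylib for your JVM's architecture and that its own dependencies resolve. The path below is built from the question's install and is hypothetical:

```shell
# Hypothetical path from the question; adjust to where you copied the libs.
LIB=/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib/libhadoop.dylib
if [ -e "$LIB" ]; then
  file "$LIB"        # expect a Mach-O 64-bit dynamically linked shared library
  otool -L "$LIB"    # macOS only: list the dylibs it links against
else
  echo "libhadoop.dylib not found at $LIB"
fi
```

A 32-bit/64-bit mismatch between the library and the JVM is one common reason checknative stays false even when the files are in place.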
To get this working on a fresh install of macOS 10.12, I had to do the following:

1. Install build dependencies using Homebrew
2. Check out the Hadoop source code
3. Apply the below patch to the build
4. Build Hadoop from source
5. Specify JAVA_LIBRARY_PATH when running Hadoop

The needed step is to copy the *.dylib files from the git sources build dir into the $HADOOP_HOME/<common dir>/lib dir for your platform. For OS/X installed via brew it is:

We can see the required libs there now:

And now the hadoop checknative command works:

There are some missing steps in @andrewdotn's response above:
1) For step (3), create the patch by saving the posted text to a file, e.g. "patch.txt", and then execute "git apply patch.txt"
2) In addition to copying the files as directed by javadba, certain applications also require that you set:
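The git apply step called out in (1) above can be sketched as a small helper. The paths are hypothetical, and patch.txt stands for a file holding the patch text posted in the answer:

```shell
# Hedged sketch of "git apply patch.txt": run against the checked-out
# Hadoop source tree, after saving the posted patch text to patch.txt.
apply_patch() {
  src_tree="$1"   # path to the git checkout
  patch_file="$2" # patch file, relative to the source tree
  ( cd "$src_tree" &&
    git apply --check "$patch_file" &&  # dry run: does the patch apply cleanly?
    git apply "$patch_file" &&
    echo "patch applied" )
}

# Real workflow (paths are assumptions):
#   apply_patch ./hadoop patch.txt
```

The --check dry run fails fast if the patch does not match the checked-out version, which is worth doing before a long native rebuild.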
As an update to @andrewdotn's answer, here is the patch.txt file to be used with Hadoop 2.8.1: