I use Hadoop Streaming to execute a script file in tcsh:
# First copy the jar files to the Hadoop filesystem,
# so that they are next to inputdir and outputdir.
cp App/* /hadoop/jardir/
# Run Hadoop
hadoop jar /usr/lib/hadoop/contrib/streaming/hadoop-streaming-0.20.2-cdh3u4.jar \
-D mapred.task.timeout=120000000 \
-input "/hadoop/inputdir/" -output "/hadoop/outputdir/" \
-mapper script.sh -reducer script.sh -file script.sh \
-jobconf mapred.map.tasks=1 -jobconf mapred.reduce.tasks=0 >>& log.txt
This script file invokes Java like this:
java -cp /hadoop/jardir/SomeJavaApp.jar:/hadoop/jardir/* some.JavaApplication
Even though I explicitly tell Java the classpath, running it fails with:
Exception in thread "main" java.lang.NoClassDefFoundError: some/JavaApplication
Caused by: java.lang.ClassNotFoundException: some.JavaApplication
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: some.JavaApplication. Program will exit.
I already tried various suggestions, such as setting Hadoop's -libjars
parameter, but that didn't help.
How can I run a Java application via a script when using Hadoop Streaming?
I think the problem is that your jar is not copied to the distributed cache. Try shipping the jar along with the shell script via the -files option. In the example that follows, I assume that you are running your streaming job from the directory where hadooptest.jar and runjava.sh are stored.

runjava.sh:
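A minimal sketch of what the wrapper script might look like (the jar and class names here are assumptions, since the original snippet was not preserved). The key point is that files shipped via -files are placed in each task's working directory, so the jar is referenced with a relative path rather than an absolute HDFS path:

```shell
#!/bin/bash
# hadooptest.jar was shipped with -files, so it sits in the task's
# current working directory; no absolute classpath is needed.
java -cp hadooptest.jar StreamTest
```

The job could then be submitted with something along these lines (same hypothetical names):

hadoop jar /usr/lib/hadoop/contrib/streaming/hadoop-streaming-0.20.2-cdh3u4.jar \
-files hadooptest.jar,runjava.sh \
-input "/hadoop/inputdir/" -output "/hadoop/outputdir/" \
-mapper runjava.sh -reducer NONE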
StreamTest.java:
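A hypothetical stand-in for the mapper class the wrapper invokes (the original listing was lost; the class name and tab-separated output format are assumptions). A streaming mapper simply reads lines from standard input and writes key/value pairs to standard output:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class StreamTest {
    // Turn one input line into a "key\tvalue" record; factored out of
    // main so the transformation can be exercised directly.
    static String transform(String line) {
        return line.trim() + "\t1";
    }

    public static void main(String[] args) throws Exception {
        // Hadoop Streaming feeds input splits to the mapper on stdin,
        // one record per line, and collects stdout as the map output.
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(transform(line));
        }
    }
}
```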
This example works fine on version 0.20-append-r1056497.