I have installed and configured Hadoop 2.5.2 on a 10-node cluster. One node acts as the master node and the other nodes as slave nodes.
I have a problem executing hadoop fs commands. The hadoop fs -ls command works fine with an HDFS URI, but gives the message "ls: `.': No such file or directory" when used without one:
ubuntu@101-master:~$ hadoop fs -ls
15/01/30 17:03:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
ubuntu@101-master:~$
Whereas executing the same command with an HDFS URI succeeds:
ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/
15/01/30 17:14:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 3 items
drwxr-xr-x - ubuntu supergroup 0 2015-01-28 12:07 hdfs://101-master:50000/hvision-data
-rw-r--r-- 2 ubuntu supergroup 15512587 2015-01-28 11:50 hdfs://101-master:50000/testimage.seq
drwxr-xr-x - ubuntu supergroup 0 2015-01-30 17:03 hdfs://101-master:50000/wrodcount-in
ubuntu@101-master:~$
I am getting an exception in my MapReduce program due to this behavior: jarlib is referring to the HDFS file location, whereas I want jarlib to refer to the jar files stored on the local file system of the Hadoop nodes.
There are a couple of things at work here. Based on "jarlib is referring to the HDFS file location", it sounds like you do have an HDFS path set as your fs.default.name, which is indeed the typical setup. So when you type hadoop fs -ls, it is indeed trying to look inside HDFS, except it's looking in your current working directory, which should be something like hdfs://101-master:50000/user/ubuntu. The error message is unfortunately somewhat confusing since it doesn't tell you that . was interpreted to be that full path. If you run hadoop fs -mkdir /user/ubuntu, then hadoop fs -ls should start working.
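As a minimal sketch of that fix (using the ubuntu user from your session; the -p flag just creates any missing parent directories):
ubuntu@101-master:~$ hadoop fs -mkdir -p /user/ubuntu
ubuntu@101-master:~$ hadoop fs -ls
The second command now returns an empty listing instead of the `.' error, and files you -put without a full URI will land under /user/ubuntu.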
This problem is unrelated to your jarlib problem; whenever you want to refer to files explicitly stored in the local filesystem, but where the path goes through Hadoop's Path resolution, you simply need to add file:/// to force Hadoop to refer to the local filesystem. Try passing your jar file paths as file:///path/to/your/jarfile and it should work.
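For example, a quick way to see the scheme in action from the shell (the /tmp path here is only illustrative):
ubuntu@101-master:~$ hadoop fs -ls file:///tmp
This lists the local /tmp directory of the node rather than anything in HDFS, because the explicit file:// scheme overrides the default filesystem configured in fs.default.name.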
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
This warning can be removed by pointing Hadoop at its native libraries in your .bashrc file:
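A commonly used sketch of those settings, assuming HADOOP_HOME points at your Hadoop install:
# add to ~/.bashrc, then run: source ~/.bashrc
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
Note that the warning is harmless either way; Hadoop simply falls back to the builtin-java classes it mentions.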
The user directory in Hadoop is (in HDFS) /user/<your operating system user>.
If you get this error message, it may be because you have not yet created your user directory within HDFS.
Use:
hadoop fs -mkdir -p /user/<your operating system user>
To see what your current operating system user is, use:
whoami
After that,
hadoop fs -ls
it should start working...

The behaviour that you are seeing is expected; let me explain what's going on when you work with hadoop fs commands.
The command's syntax is this:
hadoop fs -ls [path]
By default, when you don't specify [path] for the above command, Hadoop expands the path to /user/[username] in HDFS, where [username] gets replaced with the Linux username of whoever is executing the command. So, when you execute this command:
hadoop fs -ls
the reason you are seeing the error
ls: `.': No such file or directory
is that Hadoop is looking for the path /user/ubuntu, and it seems this path doesn't exist in HDFS. The reason why this command:
hadoop fs -ls hdfs://101-master:50000/
is working is that you have explicitly specified [path], which is the root of HDFS. You can also do the same using this:
hadoop fs -ls /
which automatically gets evaluated to the root of HDFS.
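As a small illustration of the equivalences above (cluster address taken from your session, output omitted):
ubuntu@101-master:~$ hadoop fs -ls /                            # root of HDFS
ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/     # same listing, fully qualified
ubuntu@101-master:~$ hadoop fs -ls /user/$(whoami)              # what a plain 'hadoop fs -ls' expands to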
I hope this clears up the behaviour you are seeing while executing the hadoop fs -ls command. Hence, if you want to specify a local file system path, use the file:/// URL scheme.
url scheme.this has to do with the missing home directory for the user. Once I created the home directory under the hdfs for the logged in user, it worked like a charm..
this method fixed my problem.