Hadoop installation: namenode cannot be started

Posted 2019-05-23 01:36

Question:

Currently I am trying to install hadoop-2.6.0 on my Ubuntu 14.10 (32-bit Utopic). I followed the instructions from here:

http://www.itzgeek.com/how-tos/linux/ubuntu-how-tos/install-apache-hadoop-ubuntu-14-10-centos-7-single-node-cluster.html#axzz3X2DuWaxQ

However, the namenode cannot be started when I try to format it.

This is what I keep receiving when I run hdfs namenode -format or hadoop namenode -format:

15/04/11 16:32:13 FATAL namenode.NameNode: Failed to start namenode
java.lang.IllegalArgumentException: URI has an authority component
    at java.io.File.<init>(File.java:423)
    at org.apache.hadoop.hdfs.server.namenode.NNStorage.getStorageDirectory(NNStorage.java:329)
    at org.apache.hadoop.hdfs.server.namenode.FSEditLog.initJournals(FSEditLog.java:270)
    at org.apache.hadoop.hdfs.server.namenode.FSEditLog.initJournalsForWrite(FSEditLog.java:241)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:935)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1379)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1504)
15/04/11 16:32:13 INFO util.ExitUtil: Exiting with status 1
15/04/11 16:32:14 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ThinkPad-Edge-E540/127.0.1.1
************************************************************/

I am new to Linux and Hadoop. Please help me out with this issue. Also, when I first tried to install Hadoop, I received an error message like this:

    java.net.ConnectException: Call From ThinkPad-Edge-E540/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

Later, I uninstalled Hadoop 2.6.0 and am now trying to follow the instructions in the link above.

Update

I have removed all of the Java (jdk1.7.0) that I installed during my previous attempt. But the error message is still there.

Update

This is what is shown in my /etc/hosts:

127.0.0.1 localhost
127.0.1.1 myname-mycomputer (I have commented this line out, as suggested)

# The following lines are desirable for IPv6 capable hosts
::1       ip6-localhost  ip6-loopback
fe00::0   ip6-localnet
ff00::0   ip6-mcastprefix
ff02::1   ip6-allnodes
ff02::2   ip6-allrouters

Answer 1:

This problem arises when you mistakenly specify the wrong paths for the namenode and datanode directories in hdfs-site.xml, or the wrong tmp dir path in core-site.xml. The paths should be well formed, for example:

<property>
    <name>dfs.namenode.edits.dir</name>
    <value>file:///home/hadoop/hadoop-content/hdfs/namenode</value>
</property>

<property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/hadoop/hadoop-content/hdfs/datanode</value>
</property>

and for the temp dir in core-site.xml it is like this:

<configuration>
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
</property>
<property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/hadoop-content/tmp</value>
</property>
</configuration>

Sometimes we make a mistake in specifying file:///. With only two slashes (file://), the first path segment (home in the examples above) is parsed as a URI authority rather than as part of the path, which is exactly what produces the "URI has an authority component" error in the question.
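To make the slash count concrete, here is a minimal standalone Java sketch (the class name UriAuthorityDemo is made up for illustration; it uses only java.net.URI and java.io.File from the JDK) that reproduces the exact exception from the stack trace:

import java.io.File;
import java.net.URI;

// Hypothetical demo class, not part of Hadoop: shows how java.io.File(URI)
// rejects a file URI that carries an authority component.
public class UriAuthorityDemo {
    public static void main(String[] args) throws Exception {
        // Two slashes: "home" is parsed as the URI authority, not the path.
        URI wrong = new URI("file://home/hadoop/hdfs/namenode");
        System.out.println(wrong.getAuthority()); // prints: home
        System.out.println(wrong.getPath());      // prints: /hadoop/hdfs/namenode

        // Three slashes: empty authority, the whole /home/... string is the path.
        URI right = new URI("file:///home/hadoop/hdfs/namenode");
        System.out.println(right.getAuthority()); // prints: null
        new File(right); // fine

        new File(wrong); // throws java.lang.IllegalArgumentException: URI has an authority component
    }
}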



Answer 2:

In /etc/hosts:

1. Add a line mapping your IP address to your hostname:

your-ip-address    your-host-name

example: 192.168.1.8 master

2. Delete the line with 127.0.1.1 (it causes the loopback problem).

3. In your core-site.xml, change localhost to your IP address or hostname (see the sketch below).
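For example, assuming the hostname master from step 1 (substitute your own), fs.defaultFS in core-site.xml would then look like this:

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
</property>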

Now, restart the cluster.