Hadoop: configuration files

Published: 2019-08-27 16:15

Question:

I am new to Hadoop. I am trying to set up a single-node cluster.

I have noticed that the documentation I've read (even on Apache's configuration page) always refers to the configuration files in the conf/ directory. However, when I download version 2.x.x I only see config files in the etc/hadoop directory.

I have googled the heck out of this. I tried reading the Hadoop documentation, but it refers to the conf directory, as explained above.

So, my question is: do I just configure the files where they are, in the etc/hadoop directory, or do I need to move them to a conf directory (and create it myself)?

Thanks

Answer 1:

In Hadoop 2, the etc/hadoop directory is itself the conf directory; there is no need to create a separate one. A quick way to verify this is to toggle a property such as fs.default.name (deprecated in Hadoop 2 in favor of fs.defaultFS) between file:/// and your hdfs://host:port/ setting and run a quick "hadoop fs -ls" to see which filesystem you end up on.
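As a sketch of that test, a minimal etc/hadoop/core-site.xml for a single-node setup might look like the following (the localhost:9000 address is illustrative; use whatever your NameNode is bound to):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- The default filesystem. Switch the value between file:/// and
       hdfs://localhost:9000 and rerun "hadoop fs -ls" to confirm that
       Hadoop is actually reading the files in etc/hadoop. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

With the hdfs:// value in place, "hadoop fs -ls" lists HDFS contents; with file:///, it lists your local filesystem. If editing the file in etc/hadoop changes the result, that directory is the one in effect.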



Answer 2:

Extract from Hadoop: The Definitive Guide:

"In Hadoop 2.0 and later, MapReduce runs on YARN and there is an additional con- figuration file called yarn-site.xml. All the configuration files should go in the etc/hadoop subdirectory"

So you do not have to make a new conf directory.



Answer 3:

Hadoop version 2.x.x uses Apache YARN, and its configuration lives in the etc/hadoop/ directory. I recommend you follow the settings given on the Apache YARN site here.
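If a script or tutorial still assumes a conf directory, you can point Hadoop at etc/hadoop explicitly via the HADOOP_CONF_DIR environment variable, which the Hadoop launch scripts honor. A minimal sketch (the install path /opt/hadoop-2.7.3 is hypothetical; substitute wherever you unpacked the tarball):

```shell
# Hypothetical install location; adjust to your unpacked 2.x tarball.
export HADOOP_HOME=/opt/hadoop-2.7.3
# In Hadoop 2.x the configuration directory is etc/hadoop, not conf/.
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
echo "$HADOOP_CONF_DIR"
```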