I have installed and configured Hadoop 2.7.0 on Windows.
I can successfully run the "sbin\start-dfs" command: the DataNode and NameNode start, and I can create directories and add files to HDFS, roughly as shown below.
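To confirm HDFS was working, I ran commands along these lines (the paths are only illustrative, not my actual files):

hadoop fs -mkdir /test
hadoop fs -put C:\data\sample.txt /test
hadoop fs -ls /test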
But now when I run "sbin\start-yarn", the "resourcemanager" window shows no error; however, YARN's "nodemanager" fails.
It fails with this error:
15/06/21 17:26:49 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
15/06/21 17:26:49 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
15/06/21 17:26:49 INFO impl.MetricsSystemImpl: NodeManager metrics system started
15/06/21 17:26:49 FATAL nodemanager.NodeManager: Error starting NodeManager
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:473)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:526)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:504)
at org.apache.hadoop.fs.FileSystem.primitiveMkdir(FileSystem.java:1064)
at org.apache.hadoop.fs.DelegateToFileSystem.mkdir(DelegateToFileSystem.java:161)
at org.apache.hadoop.fs.FilterFs.mkdir(FilterFs.java:197)
at org.apache.hadoop.fs.FileContext$4.next(FileContext.java:730)
at org.apache.hadoop.fs.FileContext$4.next(FileContext.java:726)
at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
at org.apache.hadoop.fs.FileContext.mkdir(FileContext.java:726)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.createDir(DirectoryCollection.java:365)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.createNonExistentDirs(DirectoryCollection.java:199)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.serviceInit(LocalDirsHandlerService.java:152)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
at org.apache.hadoop.yarn.server.nodemanager.NodeHealthCheckerService.serviceInit(NodeHealthCheckerService.java:48)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:254)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:463)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:511)
15/06/21 17:26:49 INFO impl.MetricsSystemImpl: Stopping NodeManager metrics system...
15/06/21 17:26:49 INFO impl.MetricsSystemImpl: NodeManager metrics system stopped.
15/06/21 17:26:49 INFO impl.MetricsSystemImpl: NodeManager metrics system shutdown complete.
15/06/21 17:26:49 INFO nodemanager.NodeManager: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NodeManager at idea-PC/27.4.177.205
************************************************************/
I had faced a similar problem with "sbin\start-dfs". I tried different things, and it seems it was solved when I added Hadoop's "bin" and "sbin" directories to the PATH, roughly as sketched below.
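For reference, this is approximately how I set the environment variables in the command prompt (the install path is just an example, not necessarily my actual location):

set HADOOP_HOME=C:\hadoop-2.7.0
set PATH=%PATH%;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin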
Can you please suggest a solution for the YARN problem?