Cannot connect locally to HDFS Kerberized cluster using IntelliJ

Published 2019-09-25 19:11

I'm trying to connect to HDFS locally through IntelliJ installed on my laptop. The cluster I'm trying to connect to is Kerberized and has an edge node. I generated a keytab on the edge node and configured it in the code below. I'm now able to log in to the edge node. However, when I try to access the HDFS data that sits on the namenode, it throws an error. Below is the Scala code that tries to connect to HDFS:

import org.apache.spark.sql.SparkSession
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.hadoop.security.token.{Token, TokenIdentifier}
import java.security.{AccessController, PrivilegedAction, PrivilegedExceptionAction}
import java.io.PrintWriter

object DataframeEx {
  def main(args: Array[String]) {
    // $example on:init_session$
    val spark = SparkSession
      .builder()
      .master(master="local")
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()

    runHdfsConnect(spark)

    spark.stop()
  }

   def runHdfsConnect(spark: SparkSession): Unit = {

    System.setProperty("HADOOP_USER_NAME", "m12345")
    val path = new Path("/data/interim/modeled/abcdef")
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    conf.set("dfs.namenode.kerberos.principal.pattern","hdfs/_HOST@HUGH.COM")

    UserGroupInformation.setConfiguration(conf)
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI("m12345@HUGH.COM", "C:\\Users\\m12345\\Downloads\\m12345.keytab")

    println(UserGroupInformation.isSecurityEnabled())
     ugi.doAs(new PrivilegedExceptionAction[String] {
       override def run(): String = {
         val fs= FileSystem.get(conf)
         val output = fs.create(path)
         val writer = new PrintWriter(output)
         try {
           writer.write("this is a test")
           writer.write("\n")
         }
         finally {
           writer.close()
           println("Closed!")
         }
          "done"
       }
     })
  }
}
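When a keytab login seems to succeed but HDFS calls still fail, it can help to confirm what UserGroupInformation actually holds before attempting any RPC. A minimal sketch of such a check, assuming the same principal and keytab path as in the code above:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

object KerberosLoginCheck {
  def main(args: Array[String]): Unit = {
    // JDK-level Kerberos tracing; needs to be set before the first Kerberos call in this JVM
    System.setProperty("sun.security.krb5.debug", "true")

    val conf = new Configuration()
    conf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(conf)

    // same principal and keytab as in the question
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "m12345@HUGH.COM", "C:\\Users\\m12345\\Downloads\\m12345.keytab")

    // these should report the Kerberos principal and KERBEROS auth before any HDFS access is tried
    println(s"security enabled: ${UserGroupInformation.isSecurityEnabled}")
    println(s"login user:       ${ugi.getUserName}")
    println(s"auth method:      ${ugi.getAuthenticationMethod}")
    println(s"kerberos creds:   ${ugi.hasKerberosCredentials}")
  }
}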

I'm able to log in to the edge node. However, when I try to write to HDFS (the doAs block), it throws the following error:

WARN Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException as:m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020; 
Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020

If I log in to the edge node and do a kinit, I can access HDFS just fine. So why can't I access HDFS on the namenode when I'm able to log in to the edge node?
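Since kinit works on the edge node, one way to rule out the keytab itself when testing locally is to reuse a ticket obtained with kinit instead of logging in from the keytab. A minimal sketch, assuming a local MIT Kerberos install and a hypothetical ticket cache path:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation
import java.security.PrivilegedExceptionAction

object TicketCacheCheck {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(conf)

    // reuse the ticket that kinit put in the local cache; the cache path below is a
    // placeholder and depends on the local Kerberos installation
    val ugi = UserGroupInformation.getUGIFromTicketCache("/tmp/krb5cc_1000", "m12345@HUGH.COM")

    ugi.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit = {
        val fs = FileSystem.get(conf)
        // a read-only probe is enough to confirm the RPC handshake with the namenode works
        println(fs.exists(new Path("/data/interim/modeled/abcdef")))
      }
    })
  }
}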

Let me know if any more details are needed from my side.

Answer 1:

The conf object was set up incorrectly. Below is what worked for me:

val conf = new Configuration()
conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
conf.set("hadoop.security.authentication", "kerberos")
conf.set("hadoop.rpc.protection", "privacy")   ***---(was missing this parameter)***
conf.set("dfs.namenode.kerberos.principal","hdfs/_HOST@HUGH.COM") ***---(this was initially wrongly set as dfs.namenode.kerberos.principal.pattern)***

