hadoop - Kerberos integration issue with Hadoop HA

Tags: hadoop kerberos

I set up Hadoop HA using CDH 5 and am trying to integrate Kerberos with it. The NameNode on the host where the Kerberos KDC is installed starts successfully, but the second NameNode fails to start with the following error message.

java.io.IOException: Login failure for hdfs/rhel3.had.com@had.com from keytab /etc/hadoop/conf/hdfs.keytab



2015-02-18 16:24:27,391 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use mycluster to access this namenode/service.
2015-02-18 16:24:28,220 FATAL org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.io.IOException: Login failure for hdfs/rhel3.had.com@had.com from keytab /etc/hadoop/conf/hdfs.keytab
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:947)
        at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:242)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.loginAsNameNodeUser(NameNode.java:560)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:579)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:754)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:738)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1427)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1493)
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
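The root cause, `LoginException: Unable to obtain password from user`, typically means the keytab has no key matching the requested principal (wrong principal name, wrong KVNO, or an encryption type the JVM cannot use). A first diagnostic step is to list what the keytab actually contains, e.g. `klist -ket /etc/hadoop/conf/hdfs.keytab`. The helper below is a small sketch (not from the original post) that checks whether a given principal appears in such a listing; it reads the listing on stdin so it can be exercised without a real keytab:

```shell
# Sketch: check that a principal is present in `klist -kt <keytab>` output.
# Usage: klist -ket /etc/hadoop/conf/hdfs.keytab | has_principal 'hdfs/rhel3.had.com@had.com'
has_principal() {
  # $1 = principal to look for; stdin = the klist listing
  grep -qF "$1" -
}
```

If the principal for the second NameNode's hostname is missing from that host's keytab, the login fails exactly as shown above.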



kinit works on the NameNode host:



[root@rhel3 ~]# kinit -kt /etc/hadoop/conf/hdfs.keytab hdfs/rhel3.had.com
[root@rhel3 ~]# klist -a
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hdfs/rhel3.had.com@had.com

Valid starting     Expires            Service principal
02/18/15 19:47:52  02/19/15 19:47:52  krbtgt/had.com@had.com
        renew until 02/18/15 19:47:52
        Addresses: (none)
[root@rhel3 ~]#



hdfs-site.xml:

<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>

<!-- NameNode security config -->
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value> <!-- path to the HDFS keytab -->
</property>
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/rhel3.had.com@had.com</value>
</property>
<property>
  <name>dfs.namenode.kerberos.internal.spnego.principal</name>
  <value>HTTP/rhel3.had.com@had.com</value>
</property>

<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>HTTP/rhel3.had.com@had.com</value>
</property>

<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/hadoop/conf/hdfs.keytab</value> <!-- path to the HTTP keytab -->
</property>
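Note that the principals above are hardcoded to rhel3's hostname. Hadoop replaces the `_HOST` placeholder with each host's own FQDN at login time, which is usually what you want in an HA pair: both NameNodes can then share one hdfs-site.xml, and each logs in with its own host principal (provided each host's keytab contains that principal). A sketch under that assumption:

```xml
<!-- Sketch: _HOST is expanded to the local FQDN on each NameNode,
     so the same config file works on both HA NameNodes. -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/_HOST@had.com</value>
</property>
<property>
  <name>dfs.namenode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_HOST@had.com</value>
</property>
```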

core-site.xml:
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value> <!-- A value of "simple" would disable security. -->
</property>

<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

[root@rhel3 ~]# kinit -kt /etc/hadoop/conf/hdfs.keytab hdfs/rhel3.had.com
[root@rhel3 ~]# klist -ef
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hdfs/rhel3.had.com@had.com

Valid starting     Expires            Service principal
02/19/15 17:26:33  02/20/15 17:26:32  krbtgt/had.com@had.com
        renew until 02/19/15 17:26:33, Flags: FRI
        Etype (skey, tkt): aes256-cts-hmac-sha1-96, aes256-cts-hmac-sha1-96
[root@rhel3 ~]#
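The `klist -ef` output shows the ticket uses `aes256-cts-hmac-sha1-96`. On older Oracle JDKs, AES is capped at 128 bits unless the JCE Unlimited Strength policy files are installed, and that mismatch also surfaces as "Unable to obtain password from user" even though `kinit` works from the shell. One way to check the limit from the JVM Hadoop uses (assuming a JDK with `jrunscript` on the PATH) is:

```shell
# Print the JVM's maximum allowed AES key length:
#   jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
#
# Sketch: interpret the printed limit; 2147483647 means unlimited policy
# is active, 128 means AES-256 keytab entries cannot be decrypted.
aes256_ok() {
  [ "$1" -ge 256 ]
}
```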

Please let me know how to resolve this issue.

Best answer

I ran into this problem; in my case it also reported a "Kerberos: checksum failed" issue.

I fixed it by using the FQDN for all addresses.
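In practice the fix means every hostname Hadoop and Kerberos see must be fully qualified and forward/reverse DNS must agree; on each cluster host one would verify with `hostname -f` (should print e.g. `rhel3.had.com`) and `getent hosts "$(hostname -f)"`. The helper below is a hypothetical sketch for catching short, non-qualified names before they end up in configs or `/etc/hosts`:

```shell
# Sketch: reject shortnames (no domain part) that would break
# host-based Kerberos principals like hdfs/rhel3.had.com@had.com.
is_fqdn() {
  case "$1" in
    *.*) return 0 ;;  # contains a dot: treat as fully qualified
    *)   return 1 ;;  # bare shortname
  esac
}
```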

A similar question about this Kerberos integration issue with Hadoop HA can be found on Stack Overflow: https://stackoverflow.com/questions/28622176/
