hadoop - Securely connecting to Hive from a Mapper with Kerberos

Tags: hadoop mapreduce hive kerberos oozie

My goal is to run MapReduce on a secured (Kerberos) HDP 2.3 cluster and connect to Hive from a job scheduled by the Oozie workflow scheduler.

I can connect to Hive from Beeline, or when I run it as a Java application (yarn jar), using the following connection string:

DriverManager.getConnection("jdbc:hive2://host:10000/;principal=hive/_HOST@REALM", "", "");

But when I run the same code inside a Mapper, it fails:

 ERROR [main] org.apache.thrift.transport.TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190)
        ...
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)

How can I make this work inside a Mapper?

Best Answer

It works with a Hive delegation token (the task containers have no Kerberos TGT, which is why the GSSAPI login above fails there; a delegation token obtained at submission time takes its place):

  1. Oozie

    • Add the properties:

      hive2.server.principal=hive/_HOST@REALM
      hive2.jdbc.url=jdbc:hive2://{host}:10000/default
      
    • Set the action's credentials to hive2 (a minimal workflow sketch follows the mapper example below)

    • Example mapper:

      import java.io.IOException;
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.ResultSet;
      import java.sql.Statement;
      import org.apache.hadoop.io.LongWritable;
      import org.apache.hadoop.io.Text;
      import org.apache.hadoop.mapreduce.Mapper;
      
      public class HiveMapperExample extends Mapper<LongWritable, Text, Text, Text> {
      
          @Override
          protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
              try {
                  Class.forName("org.apache.hive.jdbc.HiveDriver");
                  // auth=delegationToken makes the JDBC driver authenticate with the
                  // HS2 delegation token shipped in the container's credentials,
                  // instead of attempting a Kerberos (GSSAPI) handshake
                  Connection connect = DriverManager.getConnection("jdbc:hive2://{host}:10000/;auth=delegationToken", "", "");
                  Statement state = connect.createStatement();
                  ResultSet resultSet = state.executeQuery("select * from some_table");
                  while (resultSet.next()) {
                      ...
                  }
              } catch (Exception e) {
                  ...
              }
          }
      }
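
    As a sketch of the Oozie side (not part of the original answer), the credential defined with the two properties above is declared in the workflow and attached to the action that runs the job; the names hive2_cred and mr-node here are placeholders:

      <workflow-app name="hive-mr-example" xmlns="uri:oozie:workflow:0.5">
          <credentials>
              <credential name="hive2_cred" type="hive2">
                  <property>
                      <name>hive2.server.principal</name>
                      <value>hive/_HOST@REALM</value>
                  </property>
                  <property>
                      <name>hive2.jdbc.url</name>
                      <value>jdbc:hive2://{host}:10000/default</value>
                  </property>
              </credential>
          </credentials>
          <start to="mr-node"/>
          <action name="mr-node" cred="hive2_cred">
              <map-reduce>
                  <!-- job-tracker, name-node and the usual MR configuration go here -->
                  ...
              </map-reduce>
              <ok to="end"/>
              <error to="fail"/>
          </action>
          <kill name="fail">
              <message>MR action failed</message>
          </kill>
          <end name="end"/>
      </workflow-app>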
      
  2. ToolRunner

    import java.sql.Connection;
    import java.sql.DriverManager;
    
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.hive.thrift.DelegationTokenIdentifier;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.hadoop.security.token.Token;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;
    import org.apache.hive.jdbc.HiveConnection;
    import org.apache.hive.service.auth.HiveAuthFactory;
    
    public class HiveTestApplication extends Configured implements Tool {
    
        public static void main(String[] args) throws Exception {
            System.exit(ToolRunner.run(new HiveTestApplication(), args));
        }
    
        @Override
        public int run(String[] args) throws Exception {
            Configuration conf = new Configuration();
            //set your conf
            Job job = Job.getInstance(conf);
            job.setMapperClass(HiveMapperExample.class);
    
            // fetch the HS2 delegation token while the client still holds a Kerberos TGT,
            // and ship it to the task containers with the job credentials
            addHiveDelegationToken(job.getCredentials(), "jdbc:hive2://{host}:10000/", "hive/_HOST@REALM");
    
            return job.waitForCompletion(true) ? 0 : 1;
        }
    
        public void addHiveDelegationToken(Credentials creds, String url, String principal) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
    
            // open a Kerberos-authenticated connection just to obtain the token
            Connection con = DriverManager.getConnection(url + ";principal=" + principal);
            // get delegation token for the given proxy user
            String tokenStr = ((HiveConnection) con).getDelegationToken(UserGroupInformation.getCurrentUser().getShortUserName(), principal);
            con.close();
    
            Token<DelegationTokenIdentifier> hive2Token = new Token<>();
            hive2Token.decodeFromUrlString(tokenStr);
            // store the token under both aliases so the JDBC driver can find it in the container
            creds.addToken(new Text("hive.server2.delegation.token"), hive2Token);
            creds.addToken(new Text(HiveAuthFactory.HS2_CLIENT_TOKEN), hive2Token);
        }
    }
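
    To summarize the flow: the token is fetched once at submission time (while the client still has a valid TGT), stored in the job's Credentials, shipped by YARN to each task container, and picked up by the JDBC driver via auth=delegationToken, so the Mapper itself never needs a Kerberos ticket.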
    

Regarding hadoop - Securely connecting to Hive from a Mapper with Kerberos, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/33255350/
