java - How to access Hive with the Java API in DSE

Tags: java hive datastax-enterprise

I am working with a 5-node DSE 4.0.3 cluster (2 Cassandra nodes, 2 Solr nodes and 1 Hadoop node) and I am trying to connect to Hive through the Java API. Below is the program I am trying to run:

import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
            System.exit(1);
        }
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        stmt.executeQuery("drop table " + tableName);
        ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");

        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }

        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }

        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
        String filepath = "/tmp/a.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);

        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }

        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}

But I get the following error at this line:

Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");


Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.protocol.TProtocol.getScheme()Ljava/lang/Class;
at org.apache.hadoop.hive.service.ThriftHive$execute_args.write(ThriftHive.java:1076)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:63)
at org.apache.hadoop.hive.service.ThriftHive$Client.send_execute(ThriftHive.java:110)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:102)
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at example.create.HiveTable.main(HiveTable.java:22)

Note: I had started the Thrift server with $ dse hive --service hiveserver when running the program.
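As a side note, the NoSuchMethodError on TProtocol.getScheme() usually indicates that an older libthrift jar is being picked up ahead of the one the Hive JDBC driver was compiled against. A minimal diagnostic sketch (assuming libthrift is on the client classpath; class and output strings are illustrative) to see which jar actually provides TProtocol at runtime:

import org.apache.thrift.protocol.TProtocol;

public class ThriftVersionCheck {
    public static void main(String[] args) {
        // Print the jar that TProtocol was actually loaded from
        System.out.println(TProtocol.class.getProtectionDomain()
                .getCodeSource().getLocation());
        try {
            // getScheme() only exists in newer libthrift releases; if it is
            // missing, an old libthrift jar is shadowing the expected one
            TProtocol.class.getMethod("getScheme");
            System.out.println("getScheme() present");
        } catch (NoSuchMethodException e) {
            System.out.println("getScheme() missing -> old libthrift on the classpath");
        }
    }
}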

I also get an error with

Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");

Exception in thread "main" java.sql.SQLException: Invalid URL: jdbc:hive2://54.243.203.229:10000/default
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:86)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at example.create.HiveTable.main(HiveTable.java:22)

Note: I had started the Thrift server with $ dse hive --service hiveserver2 when running the program.
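For what it's worth, the second stack trace shows the old HiveServer1 driver (org.apache.hadoop.hive.jdbc.HiveDriver) handling the jdbc:hive2:// URL, which it does not recognize. A minimal sketch of a HiveServer2 connection, assuming the hive-jdbc jar that ships org.apache.hive.jdbc.HiveDriver is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveServer2Client {
    public static void main(String[] args) throws Exception {
        // HiveServer2 uses a different driver class and the jdbc:hive2:// URL scheme
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
        System.out.println("Connected: " + !con.isClosed());
        con.close();
    }
}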

The hive-site.xml is:

<configuration>
  <!-- Hive Execution Parameters -->
  <property>
    <name>hive.exec.mode.local.auto</name>
    <value>false</value>
    <description>Let hive determine whether to run in local mode automatically</description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>cfs:///user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <property>
    <name>hive.hwi.war.file</name>
    <value>lib/hive-hwi.war</value>
    <description>This sets the path to the HWI war file, relative to ${HIVE_HOME}</description>
  </property>
  <property>
    <name>hive.metastore.rawstore.impl</name>
    <value>com.datastax.bdp.hadoop.hive.metastore.CassandraHiveMetaStore</value>
    <description>Use the Apache Cassandra Hive RawStore implementation</description>
  </property>
  <property>
    <name>hadoop.bin.path</name>
    <value>${dse.bin}/dse hadoop</value>
  </property>
  <!-- Set this to true to enable auto-creation of Cassandra keyspaces as Hive Databases -->
  <property>
    <name>cassandra.autoCreateHiveSchema</name>
    <value>true</value>
  </property>
</configuration>

Can anyone suggest where I went wrong or what I am missing?

Best Answer

There is a libthrift version conflict in WSO2 DSS 3.2.2+. It ships a libthrift jar in its deployment, and that jar is loaded before anything you put in the Components\lib directory. There is an updated libthrift with the correct interfaces. Steps to fix:

Do a fresh install. The installation path will be referred to as $home in this document.

Download libthrift-0.8.0.wso2v1.jar from http://maven.wso2.org/nexus/content/groups/wso2-public/libthrift/wso2/libthrift/0.8.0.wso2v1/

Set the Windows environment variable CLASSPATH to $home\repository\components\lib (this may not be necessary).

Copy libthrift 0.8 to $home\repository\components\plugins. Delete the libthrift 0.7 jar.

Edit $home\repository\components\features\org.wso2.carbon.logging.mgt.server_4.2.1. For the libthrift line, make it end with version="0.8.0.wso2v2".

Edit $home\repository\components\features\org.wso2.carbon.databridge.commons.thrift.server_4.2.0 as well, changing the libthrift version there too.

Regarding java - How to access Hive with the Java API in DSE, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/24182593/
