java - Cannot connect to Hive via JDBC

Tags: java hadoop jdbc hive impala

I use Gradle to run my program. The sample code is based on https://github.com/onefoursix/Cloudera-Impala-JDBC-Example .

I can run the example from that repo without any failure.

// Apply the java plugin to add support for Java
apply plugin: 'java'
apply plugin: 'application'

mainClassName = "com.my.impala.fetcher.Fetcher"
// In this section you declare where to find the dependencies of your project
repositories {
    // Use 'jcenter' for resolving your dependencies.
    // You can declare any Maven/Ivy/file repository here.
    mavenCentral()
    maven {
        url "https://repository.cloudera.com/artifactory/cloudera-repos/"
    }
}

run {
     if (project.hasProperty("params")) {
         args params
         // args Eval.me(params)
     }
}

test {
    testLogging {
        events "passed", "skipped", "failed", "standardOut", "standardError"
    }
}


// In this section you declare the dependencies for your production and test code
dependencies {
    // The production code uses the SLF4J logging API at compile time
    compile 'org.slf4j:slf4j-api:1.7.7'
    compile 'org.apache.hadoop:hadoop-client:2.6.0-cdh5.4.4.1'
    compile 'org.json:json:20140107'
    compile 'org.apache.hive:hive-jdbc:1.1.0'
}

The sample code is as follows.

Connection con = null;
ResultSet rs = null;
try {
    Class.forName(JDBC_DRIVER_NAME);
    con = DriverManager.getConnection(CONNECTION_HOST);
    Statement stmt = con.createStatement();
    rs = stmt.executeQuery(sqlStatement);
} catch (SQLException e) {
    e.printStackTrace();
    System.exit(-1);
} catch (Exception e) {
    e.printStackTrace();
    System.exit(-1);
} finally {
    try {
        con.close();
    } catch (Exception e) {
        e.printStackTrace();
        System.exit(-1);
    }
}

rs.next();

where rs.next() throws the following exception:

org.apache.thrift.transport.TTransportException: Cannot write to null outputStream
    at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:142)
    at org.apache.thrift.protocol.TBinaryProtocol.writeI32(TBinaryProtocol.java:178)
    at org.apache.thrift.protocol.TBinaryProtocol.writeMessageBegin(TBinaryProtocol.java:106)
    at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.send_FetchResults(TCLIService.java:495)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.FetchResults(TCLIService.java:487)
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:360)
    at com.my.impala.fetcher.Fetcher.main(Fetcher.java:54)
Exception in thread "main" java.sql.SQLException: Error retrieving next row
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:388)
    at com.my.impala.fetcher.Fetcher.main(Fetcher.java:54)
Caused by: org.apache.thrift.transport.TTransportException: Cannot write to null outputStream
    at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:142)
    at org.apache.thrift.protocol.TBinaryProtocol.writeI32(TBinaryProtocol.java:178)
    at org.apache.thrift.protocol.TBinaryProtocol.writeMessageBegin(TBinaryProtocol.java:106)
    at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.send_FetchResults(TCLIService.java:495)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.FetchResults(TCLIService.java:487)
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:360)
    ... 1 more

I'm not sure which part I am missing.

Thanks.

Best Answer

You have already closed the connection, so of course the results can no longer be read. rs.next() should come right after the executeQuery() line, at the end of the try block, like this:

rs = stmt.executeQuery(sqlStatement);
while (rs.next()) {
    // handle the record
}

A ResultSet is a database cursor: it does not hold the whole result set in memory, it is a live handle through which you fetch rows from the database. Once the underlying connection is closed, the cursor can no longer talk to the server, which is exactly the "Cannot write to null outputStream" error above.
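A tidy way to avoid this class of bug entirely is try-with-resources, which closes the ResultSet, Statement, and Connection automatically, in reverse order, only after the row loop has finished. Below is a minimal sketch; the driver class, JDBC URL, and query are placeholder values standing in for the question's JDBC_DRIVER_NAME, CONNECTION_HOST, and sqlStatement constants, and it assumes a reachable HiveServer2 instance, so it is not runnable as-is:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Fetcher {
    // Placeholder values -- substitute your own driver, host, and query.
    private static final String JDBC_DRIVER_NAME = "org.apache.hive.jdbc.HiveDriver";
    private static final String CONNECTION_HOST = "jdbc:hive2://localhost:10000/default";

    public static void main(String[] args) throws Exception {
        Class.forName(JDBC_DRIVER_NAME);
        String sqlStatement = "SELECT * FROM my_table LIMIT 10";

        // try-with-resources closes rs, stmt, and con in reverse order,
        // and only after the loop below has consumed every row.
        try (Connection con = DriverManager.getConnection(CONNECTION_HOST);
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery(sqlStatement)) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // handle the record
            }
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(-1);
        }
    }
}
```

This also removes the NullPointerException risk in the question's finally block, where con.close() is called even when getConnection() failed and con is still null.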

Regarding java - Cannot connect to Hive via JDBC, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32761816/
