I am executing an insert query in Hive through the JDBC API, but the query is not running. Can anyone suggest what is wrong? Also, please let me know how to capture the error code that Hive throws when running the query. Hive version: 0.13.0
When I run the queries generated in the log from the command line, they work fine.
public static void onSuccess() {
    // Write to log on success
    LOGGER.info("Job Succeeded, Updating the " + hiveDB + "." + logTable + " with status SUCCESS");
    String insertOnSuccess = "insert into table " + hiveDB + "." + logTable + " select " + currentJobID + ","
            + "'" + startTime + "'" + "," + "'" + stopTime + "'" + "," + runTime + "," + "'SUCCESS' from "
            + hiveDB + "." + "dual" + " limit 1; ";
    commonDB.InsertToTable(insertOnSuccess);
    JobMailer.PostMail("IB Load Successfully completed", "Load completed");
}
public void InsertToTable(String insertquery) {
    try {
        stm = hiveConn.createStatement();
        stm.executeUpdate(insertquery);
    } catch (SQLException e) {
        LOGGER.error("Running the insert query for :" + insertquery);
    } catch (Exception e) {
        LOGGER.error(e.getMessage());
    } finally {
        if (stm != null) {
            try {
                stm.close();
            } catch (SQLException e) {
                LOGGER.error(e.getMessage());
            }
        }
    }
}
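Note that the `catch (SQLException e)` branch above logs only the query text and drops the exception itself, which is why no Hive error code ever appears in the log. As a minimal sketch of the second part of the question: `java.sql.SQLException` already carries the vendor error code and SQLSTATE via `getErrorCode()` and `getSQLState()`. The `describe` helper below is hypothetical (not from the original code), and the demo constructs a simulated `SQLException` rather than calling a live HiveServer2; whether the Hive 0.13 JDBC driver populates both fields meaningfully is an assumption worth verifying against your driver version.

```java
import java.sql.SQLException;

public class ErrorCodeDemo {
    // Hypothetical helper: format the vendor error code and SQLSTATE that
    // Statement.executeUpdate() surfaces in its SQLException.
    static String describe(SQLException e) {
        return "code=" + e.getErrorCode()
             + " state=" + e.getSQLState()
             + " msg=" + e.getMessage();
    }

    public static void main(String[] args) {
        // Simulated failure; in real code this would be thrown by executeUpdate().
        SQLException e = new SQLException(
                "Error while compiling statement", "42000", 40000);
        System.out.println(describe(e));
        // -> code=40000 state=42000 msg=Error while compiling statement
    }
}
```

In the catch block of `InsertToTable`, logging `describe(e)` alongside the query would preserve the error code instead of discarding it.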
Here is my error log:
16/02/12 12:31:09 ERROR hiveconnector.CommonDBUtilities: Running the insert query for :insert into table installbase.IB_log select 25,'2016-02-12 12:26:43.037','2016-02-12 12:31:09.057',22982400,'SUCCESS' from installbase.dual limit 1;
16/02/12 12:31:09 INFO hiveconnector.JobMailer: Sending Mail with :IB Load Successfully completed
16/02/12 12:31:09 INFO hiveconnector.MainApp: Inserted record to the installbase.data_usage_governance_log Table
16/02/12 12:31:10 ERROR hiveconnector.CommonDBUtilities: Running the insert query for :Insert into table installbase.data_usage_governance_log select Data_Asset_Reference,File_Name,Origin_System,Transfer_System,'2016-02-12 12:26:43.037',Column_Reference,Element_Reference,Rule_Priority,Delete_By_Date,Classification,Geographic_Inclusion,Geographic_Restriction,Group_Inclusion,Group_Restriction,Reserved from installbase.data_usage_governance_master
Best answer
Too bad you are stuck with Hive 0.13, because...
Starting with Hive 0.14.0, HiveServer2 operation logs are available for Beeline clients. https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-HiveServer2Logging
Once log propagation is activated on the server side, you can retrieve these log entries from your Java code, either asynchronously or in one batch at the end of execution, e.g. ...
private static void DumpHiveMessages(java.sql.Statement stmtGeneric) {
    org.apache.hive.jdbc.HiveStatement stmtExtended;
    try {
        stmtExtended = (org.apache.hive.jdbc.HiveStatement) stmtGeneric;
        for (String sLogMessage : stmtExtended.getQueryLog()) {
            JustTraceIt("HIVE SAYS>" + sLogMessage);
        }
        if (stmtExtended.hasMoreLogs()) {
            JustTraceIt("WARNING>(...log stream still open...)");
        }
    } catch (Exception duh) {
        JustTraceIt("WARNING>Error while accessing Hive log stream");
        JustTraceIt("WARNING>" + MakeSenseOfDirtyHadoopException(duh));
    }
}
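The drain-until-closed pattern behind that method can be exercised without a live cluster. The sketch below is a simulation, not Hive code: `FakeHiveStatement` is a hypothetical stand-in that mirrors the `getQueryLog()`/`hasMoreLogs()` shape of `HiveStatement` but is backed by an in-memory queue, so the polling loop itself can be run and tested in isolation.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class LogPollDemo {
    // Hypothetical stand-in for org.apache.hive.jdbc.HiveStatement:
    // same getQueryLog()/hasMoreLogs() shape, backed by a queue
    // instead of a live HiveServer2 session.
    static class FakeHiveStatement {
        private final Queue<String> pending = new ArrayDeque<>(
                List.of("Compiling query", "Executing query", "OK"));

        List<String> getQueryLog() {
            List<String> batch = new ArrayList<>();
            // Drain at most two entries per call, like a partial fetch.
            for (int i = 0; i < 2 && !pending.isEmpty(); i++) {
                batch.add(pending.poll());
            }
            return batch;
        }

        boolean hasMoreLogs() {
            return !pending.isEmpty();
        }
    }

    // The polling loop: keep fetching until the log stream reports empty.
    static List<String> drain(FakeHiveStatement stmt) {
        List<String> all = new ArrayList<>();
        do {
            all.addAll(stmt.getQueryLog());
        } while (stmt.hasMoreLogs());
        return all;
    }

    public static void main(String[] args) {
        drain(new FakeHiveStatement())
                .forEach(line -> System.out.println("HIVE SAYS>" + line));
    }
}
```

Against a real `HiveStatement`, the same loop would typically run on a background thread while `executeUpdate` blocks on the main one, so log lines stream out during execution rather than only at the end.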
These methods are not really documented, but the source code of HiveStatement shows several non-JDBC-standard methods such as getQueryLog and hasMoreLogs -- plus getYarnATSGuid for Hive 2+ and more in Hive 3+.
Here is the link to the "master" branch on GitHub; switch to the version you are actually using (probably the old 1.2, for Spark compatibility).
Regarding "java - How to capture Hive exit status or error code using the JDBC API", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/35356788/