Hadoop Hive ACID query error

Tags: hadoop hive hql

I am working on a Hadoop cluster (Hortonworks 2.4 distribution). I want to perform ACID operations on a Hive table. Here is my declaration:

CREATE TABLE myAcidTable (..)
CLUSTERED BY(myKey) INTO 1 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true','orc.compress'='SNAPPY');

I populate this table from an external Hive table with the same structure:

INSERT INTO myAcidTable
SELECT * FROM MyTmpTable;

This operation works fine:

Loading data to table MyAcidTable
Table myAcidTable stats: [numFiles=1, numRows=4450, totalSize=42001, rawDataSize=0]
OK

I then try to query this table through the Hive shell:

set hive.support.concurrency=true;
set hive.enforce.bucketing=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
set hive.compactor.initiator.on=true;
set hive.compactor.worker.threads=3;

SELECT * FROM myAcidTable
WHERE myKey = 12;

But I get this error (even though the status looks OK):

OK
Failed with exception java.io.IOException:java.lang.RuntimeException: serious problem

When I look at the logs, I find:

org.apache.ambari.view.hive.client.HiveErrorStatusException: H170 Unable to fetch results. java.io.IOException: java.lang.RuntimeException: serious problem

...
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
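For context on that message: an ACID insert writes its rows into a `delta_<txnid>_<txnid>` directory under the table's warehouse path, while a reader that is not operating in ACID mode expects plain bucket files or `base_` directories, which is exactly what `does not start with base_` is complaining about. The `delta_0000000_0000000` name (transaction id 0) also hints that the insert ran without a real transaction manager. You can inspect the layout from the Hive shell; the warehouse path below is the HDP default and is an assumption, so adjust it to your install:

```sql
-- List the table's storage directory from within the Hive shell.
-- /apps/hive/warehouse is the default HDP warehouse location (assumption).
dfs -ls /apps/hive/warehouse/myacidtable;
-- A proper ACID insert produces a transaction delta directory such as
--   /apps/hive/warehouse/myacidtable/delta_0000001_0000001
-- whereas a non-ACID insert writes bucket files directly, e.g. 000000_0.
```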

This is strange, because when I declare the table without the transactional property, the SELECT statement works fine:

CREATE TABLE myAcidTable (..)
CLUSTERED BY(myKey) INTO 1 BUCKETS
STORED AS ORC TBLPROPERTIES ('orc.compress'='SNAPPY');

SELECT * FROM myAcidTable
WHERE myKey = 12;

Result:

OK
12 ...

Do you know where to look? Thanks for your help.

Full error:

org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.lang.RuntimeException: serious problem
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:352)
	at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
	at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
	at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
	at com.sun.proxy.$Proxy22.fetchResults(Unknown Source)
	at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
	at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.lang.RuntimeException: serious problem
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:512)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1737)
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
	... 24 more
Caused by: java.lang.RuntimeException: serious problem
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1115)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:1142)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:367)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:299)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
	... 28 more
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1092)
	... 32 more
Caused by: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
	at org.apache.hadoop.hive.ql.io.AcidUtils.parseBase(AcidUtils.java:154)
	at org.apache.hadoop.hive.ql.io.AcidUtils.parseBaseBucketFilename(AcidUtils.java:182)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:725)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:690)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	... 3 more

Best Answer

This is probably because you used the wrong transaction manager when creating the table or loading data into it.

In my case, I had org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager

instead of org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
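Before writing to the table, you can check which transaction manager your session is actually using; `set` with a property name but no value prints the current setting:

```sql
-- Print the active transaction manager; if this reports DummyTxnManager,
-- ACID writes will produce directories that later reads cannot parse.
set hive.txn.manager;
```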

To get rid of the error, you need to drop the table, set the correct transaction manager, i.e.

hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

and then recreate the table.
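Putting the fix together, a possible session sketch (table and column names follow the question; the session settings mirror the ones in the question, applied before the write rather than only before the read):

```sql
-- Enable ACID support *before* creating or loading the table.
set hive.support.concurrency=true;
set hive.enforce.bucketing=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

-- Drop the table that was written under the wrong transaction manager.
DROP TABLE myAcidTable;

-- Recreate it and reload; the insert now writes deltas that ACID reads can parse.
CREATE TABLE myAcidTable (..)
CLUSTERED BY(myKey) INTO 1 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true','orc.compress'='SNAPPY');

INSERT INTO myAcidTable
SELECT * FROM MyTmpTable;
```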

Regarding this Hadoop Hive ACID query error, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/36178633/
