2016-03-23

Hadoop Hive ACID query error

I am working on a Hadoop cluster running the Hortonworks 2.4 distribution, and I want to perform ACID operations on a Hive table. This is my create statement:

CREATE TABLE myAcidTable (..) 
CLUSTERED BY(myKey) INTO 1 BUCKETS 
STORED AS ORC TBLPROPERTIES ('transactional'='true','orc.compress'='SNAPPY'); 

I populate this table from an external Hive table with the same structure:

INSERT INTO myAcidTable 
SELECT * FROM MyTmpTable; 

This operation works fine:

Loading data to table MyAcidTable
Table myAcidTable stats: [numFiles=1, numRows=4450, totalSize=42001, rawDataSize=0]
OK

Then I try to query the table through the Hive shell:

set hive.support.concurrency=true; 
set hive.enforce.bucketing=true; 
set hive.exec.dynamic.partition.mode=nonstrict; 
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager; 
set hive.compactor.initiator.on=true; 
set hive.compactor.worker.threads=3; 

SELECT * FROM myAcidTable 
WHERE myKey = 12; 

But I get this error (even though the status seems to be OK):

OK
Failed with exception java.io.IOException:java.lang.RuntimeException: serious problem

When I look through the logs, I find this:

org.apache.ambari.view.hive.client.HiveErrorStatusException: H170 Unable to fetch results. java.io.IOException: java.lang.RuntimeException: serious problem

...
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
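For context, every INSERT into a Hive ACID table writes its rows into a delta_&lt;minTxn&gt;_&lt;maxTxn&gt; directory under the table location; base_&lt;txn&gt; directories only appear after a major compaction has run. A listing of the table directory (path hypothetical) would show something like:

```
$ hdfs dfs -ls /apps/hive/warehouse/myacidtable
drwxr-xr-x   ...   /apps/hive/warehouse/myacidtable/delta_0000000_0000000
```

The all-zero transaction range in the name suggests the rows were written without a real transaction ID, and the ORC reader then apparently mistakes that directory for a base file, hence the "does not start with base_" message.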

This is strange, because when I declare my table without the transactional property, the select statement works fine:

CREATE TABLE myAcidTable (..) 
CLUSTERED BY(myKey) INTO 1 BUCKETS 
STORED AS ORC TBLPROPERTIES ('orc.compress'='SNAPPY'); 

SELECT * FROM myAcidTable 
WHERE myKey = 12; 

Result:

OK
12 ...

Do you have any idea where to look? Thank you for your help.

Full error:

org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.lang.RuntimeException: serious problem
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:352)
	at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
	at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
	at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
	at com.sun.proxy.$Proxy22.fetchResults(Unknown Source)
	at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
	at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.lang.RuntimeException: serious problem
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:512)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1737)
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
	... 24 more
Caused by: java.lang.RuntimeException: serious problem
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1115)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:1142)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:367)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:299)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
	... 28 more
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1092)
	... 32 more
Caused by: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
	at org.apache.hadoop.hive.ql.io.AcidUtils.parseBase(AcidUtils.java:154)
	at org.apache.hadoop.hive.ql.io.AcidUtils.parseBaseBucketFilename(AcidUtils.java:182)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:725)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:690)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	... 3 more
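Incidentally, the delta_... directories that trigger this error are normally rewritten into base_... directories by the Hive compactor (enabled by the hive.compactor.* settings above). A major compaction can also be requested by hand; a minimal sketch, assuming the table from the question exists:

```sql
-- Ask the compactor to rewrite all delta directories into a single base_ directory
ALTER TABLE myAcidTable COMPACT 'major';

-- Inspect the compaction queue and the state of the request
SHOW COMPACTIONS;
```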

Did you try setting those values before creating the table, rather than just before the query? –

Yes, and I also changed those values in the configuration file; no change.. – Mouette

Answer


This is probably because you had the wrong transaction manager set at the time you created the table or loaded data into it.

In my case, I had org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager

instead of org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.

To get rid of the error, you need to drop the table and set the correct transaction manager, i.e.

hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

Then recreate the table.
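Putting the answer together, the full fix sequence in the Hive shell might look like the following sketch (the column list "(..)" is a placeholder carried over from the question):

```sql
-- Make sure the ACID-capable transaction manager is active BEFORE any DDL/DML
set hive.support.concurrency=true;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

-- Drop the table that was created/loaded under DummyTxnManager
DROP TABLE myAcidTable;

-- Recreate it with the transactional property, then reload it
CREATE TABLE myAcidTable (..)
CLUSTERED BY(myKey) INTO 1 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true','orc.compress'='SNAPPY');

INSERT INTO myAcidTable
SELECT * FROM MyTmpTable;
```

With DbTxnManager active, the insert is assigned a real transaction ID, so the resulting delta directory is one the ORC split generator can parse.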