2013-11-21

prestodb Hive SQL query error

Dear friends, I was able to configure Presto with Hive.

I can see the results of "SHOW TABLES", and the "books" table appears in them.

"DESCRIBE books" also shows all of the column details.

I do have a "books" table, and I am able to query it through Hive and see results.

For example: hive> select * from books;

But when I try the same query through Presto, I get the error below.

Please guide me.
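For reference, a Presto Hive catalog of this era (0.52) is configured through a file like etc/catalog/hive.properties on each node. The values below are an illustrative sketch of my setup, not the exact file, and the metastore URI is an assumption:

```properties
# etc/catalog/hive.properties -- illustrative sketch, values are assumptions
# connector.name must match the Hadoop client bundled with the plugin
# (hive-hadoop1 or hive-cdh4 in Presto 0.52)
connector.name=hive-cdh4
# Thrift URI of the Hive metastore service
hive.metastore.uri=thrift://localhost:9083
```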

Error

presto:default> select * from books; 

Query 20131121_025845_00004_qqe25, FAILED, 1 node 
Splits: 1 total, 0 done (0.00%) 
0:00 [0 rows, 0B] [0 rows/s, 0B/s] 

Query 20131121_025845_00004_qqe25 failed: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310; 
presto:default> 

Exception on the server


45_00004_qqe25.1 
java.lang.RuntimeException: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310; 
     at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-15.0.jar:na] 
     at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:433) ~[na:na] 
     at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:392) ~[na:na] 
     at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-15.0.jar:na] 
     at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-15.0.jar:na] 
     at com.facebook.presto.execution.SqlStageExecution.startTasks(SqlStageExecution.java:463) [presto-main-0.52.jar:0.52] 
     at com.facebook.presto.execution.SqlStageExecution.access$300(SqlStageExecution.java:80) [presto-main-0.52.jar:0.52] 
     at com.facebook.presto.execution.SqlStageExecution$5.run(SqlStageExecution.java:435) [presto-main-0.52.jar:0.52] 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45] 
     at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45] 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_45] 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_45] 
     at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45] 
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310; 
     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763) ~[na:na] 
     at org.apache.hadoop.ipc.Client.call(Client.java:1229) ~[na:na] 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202) ~[na:na] 
     at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na] 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45] 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45] 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45] 
     at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45] 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164) ~[na:na] 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83) ~[na:na] 
     at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na] 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:441) ~[na:na] 
     at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1526) ~[na:na] 
     at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1509) ~[na:na] 
     at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:406) ~[na:na] 
     at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1462) ~[na:na] 
     at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1502) ~[na:na] 
     at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na] 
     at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na] 
     at com.facebook.presto.hive.FileSystemWrapper$3.listStatus(FileSystemWrapper.java:146) ~[na:na] 
     at org.apache.hadoop.fs.FileSystem$4.<init>(FileSystem.java:1778) ~[na:na] 
     at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1777) ~[na:na] 
     at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1760) ~[na:na] 
     at com.facebook.presto.hive.util.AsyncRecursiveWalker$1.run(AsyncRecursiveWalker.java:58) ~[na:na] 
     at com.facebook.presto.hive.util.SuspendingExecutor$1.run(SuspendingExecutor.java:67) ~[na:na] 
     at com.facebook.presto.hive.util.BoundedExecutor.executeOrMerge(BoundedExecutor.java:82) ~[na:na] 
     at com.facebook.presto.hive.util.BoundedExecutor.access$000(BoundedExecutor.java:41) ~[na:na] 
     at com.facebook.presto.hive.util.BoundedExecutor$1.run(BoundedExecutor.java:53) ~[na:na] 
     ... 3 common frames omitted 
Caused by: java.io.IOException: Broken pipe 
     at sun.nio.ch.FileDispatcherImpl.write0(Native Method) ~[na:1.7.0_45] 
     at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) ~[na:1.7.0_45] 
     at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.7.0_45] 
     at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.7.0_45] 
     at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487) ~[na:1.7.0_45] 
     at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:62) ~[na:na] 
     at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:143) ~[na:na] 
     at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:na] 
     at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:114) ~[na:na] 
     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_45] 
     at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_45] 
     at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[na:1.7.0_45] 
     at org.apache.hadoop.ipc.Client$Connection$3.run(Client.java:897) ~[na:na] 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45] 
     at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45] 
     ... 3 common frames omitted 
2013-11-20T21:58:45.915-0500 DEBUG task-notification-1  com.facebook.presto.execution.TaskStateMachine Task 20131121_025845_00004_qqe25.0.0 is CANCELED 

With this question, I think you have solved your previous one: http://stackoverflow.com/questions/19957500/how-to-use-presto-to-query-hive-data Please post on your previous question how you solved it and mark it as the answer. – eLRuLL


Dain Sundstrom, Nov 8: This is an error from the HDFS client (org.apache.hadoop.ipc.Client:941 in my code), and after a quick review of that code it looks like it means the client could not parse the server response. My guess is that the client we bundle with the presto-hive-cdh4 plugin is not compatible with your version of Hadoop. This code bundles Cloudera Hadoop version 2.0.0-cdh4.3.0. What version are you using? – user2020099

Answer


I got an answer from the Presto Google group: https://groups.google.com/forum/#!topic/presto-users/lVLvMGP1sKE

Dain Sundstrom
Nov 8: That is an error from the HDFS client (org.apache.hadoop.ipc.Client:941 in my code), and after a quick review of that code it looks like it means the client failed to parse the server response. My guess is that the client we bundle with the presto-hive-cdh4 plugin is not compatible with your version of Hadoop. This code bundles Cloudera Hadoop version 2.0.0-cdh4.3.0. What version are you using?
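Based on that answer, the fix boils down to matching the connector's bundled HDFS client to the cluster's actual Hadoop version. A minimal sketch of the check, assuming standard Hadoop and Presto layouts (commands and paths may differ on your install):

```shell
# Sketch, assuming a standard Hadoop install and Presto 0.52 layout.

# 1. Check which Hadoop version the cluster actually runs:
hadoop version    # e.g. "Hadoop 1.2.1" vs "Hadoop 2.0.0-cdh4.3.0"

# 2. In etc/catalog/hive.properties, pick the connector whose bundled
#    client matches that version (Presto 0.52 shipped hive-hadoop1 and
#    hive-cdh4). A mismatch can surface as an IPC failure such as the
#    "Broken pipe" error above:
#      connector.name=hive-hadoop1   # for Apache Hadoop 1.x
#      connector.name=hive-cdh4      # for CDH 4.x
```

In my case the cluster was not running CDH 4.3, so the client bundled with presto-hive-cdh4 could not talk to the NameNode at localhost:54310.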