2014-05-22

Hive query fails on INSERT OVERWRITE

I am running queries on Hive (v0.11) over a JDBC connection. The code is as follows:

// The legacy HiveServer1 driver must be registered before the
// jdbc:hive:// URL can be resolved.
Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
Connection con = DriverManager.getConnection(
       "jdbc:hive://192.168.1.10:10000", "", "");
Statement stmt = con.createStatement();
stmt.execute("some query");

It successfully runs queries like the following:

CREATE TABLE testdb.test(name string,id int); 

SELECT * FROM testdb.test; 

But it fails on any query that contains an INSERT OVERWRITE clause. For example:

INSERT OVERWRITE DIRECTORY '/user/jim/dir' SELECT * FROM space.test; 

INSERT OVERWRITE TABLE testdb.t2 select name,id from testdb.test; 

with the following stack trace:

java.sql.SQLException: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask 
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeImmediate(HivePreparedStatement.java:178) 
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeQuery(HivePreparedStatement.java:141) 
at my.pack.test.HiveTest.main(HiveTest.java:31) 
    Caused by: HiveServerException(message:Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, errorCode:1, SQLState:08S01) 
at org.apache.hadoop.hive.service.ThriftHive$execute_result$execute_resultStandardScheme.read(ThriftHive.java:1494) 
at org.apache.hadoop.hive.service.ThriftHive$execute_result$execute_resultStandardScheme.read(ThriftHive.java:1480) 
at org.apache.hadoop.hive.service.ThriftHive$execute_result.read(ThriftHive.java:1430) 
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78) 
at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116) 
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103) 
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeImmediate(HivePreparedStatement.java:176) 
... 2 more 

The main problem is that these same queries execute successfully from the Hive console.

If I am missing something here, any help would be appreciated. Or is there a better way to achieve this over JDBC?

N.B.: Each query in the blocks above was executed individually, without the trailing semicolon. I only added the semicolons for readability.
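As a side note, since the old Thrift interface expects statements without trailing semicolons, one way to run a batch of statements copied from a script is to strip the semicolon before each `stmt.execute(...)` call. A minimal sketch (the `HiveSql` class name is hypothetical, not part of any Hive API):

```java
public class HiveSql {
    // Remove a trailing semicolon (and surrounding whitespace) so the
    // statement can be passed to Statement.execute() as-is.
    static String stripSemicolon(String sql) {
        String s = sql.trim();
        return s.endsWith(";") ? s.substring(0, s.length() - 1).trim() : s;
    }

    public static void main(String[] args) {
        // Prints: SELECT * FROM testdb.test
        System.out.println(stripSemicolon("SELECT * FROM testdb.test;"));
    }
}
```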

Answer


Hi, I tried your example case and it worked for me, using the following query executed through the JDBC client:

String sql = "INSERT OVERWRITE DIRECTORY '/user/jim/dir' select * from " + tableName; 

stmt.execute(sql); 

Notes:

  1. Make sure /user/jim/dir is writable; if it is not, make it writable with:

    hadoop fs -chmod a+rwx /user/jim/dir

  2. Use stmt.execute(sql), not stmt.executeQuery(sql).
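Note 2 can be sketched as a small dispatch helper (the class is hypothetical, purely for illustration): only statements that produce a ResultSet should go through executeQuery(), while everything else, including INSERT OVERWRITE, must use execute():

```java
public class HiveStatementDispatch {
    // Only SELECT statements return a ResultSet; DDL/DML such as
    // INSERT OVERWRITE must be routed to Statement.execute() instead
    // of executeQuery().
    static boolean producesResultSet(String sql) {
        return sql.trim().regionMatches(true, 0, "select", 0, 6);
    }

    public static void main(String[] args) {
        // Prints: false (would be run via stmt.execute)
        System.out.println(producesResultSet(
            "INSERT OVERWRITE TABLE testdb.t2 select name,id from testdb.test"));
        // Prints: true (would be run via stmt.executeQuery)
        System.out.println(producesResultSet("SELECT * FROM testdb.test"));
    }
}
```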

PS:問題依然存在意味着讓我知道,將分享完整的代碼。


Yeah, it should work. I had already tried everything except the chmod. Looks like I am facing some other configuration-related issue. Thanks anyway. – blackSmith