2016-01-19 108 views

Deleting from a Hive table using Spark

I am using Hive 1.2.1 with Spark 1.6, and the problem is that I cannot perform a simple delete on a Hive table from the Spark shell. Since Hive has supported ACID transactions since 0.14, I expected deletes to be allowed from Spark as well.

16/01/19 12:44:24 INFO hive.metastore: Connected to metastore. 


scala> hiveContext.sql("delete from testdb.test where id=2"); 


16/01/19 12:44:51 INFO parse.ParseDriver: Parsing command: delete from  
testdb.test where id=2 
16/01/19 12:44:52 INFO parse.ParseDriver: Parse Completed 

org.apache.spark.sql.AnalysisException: 
Unsupported language features in query: delete from testdb.test where id=2 
TOK_DELETE_FROM 1, 0,12, 12 
    TOK_TABNAME 1, 4,6, 12 
    testdb 1, 4,4, 12 
    test 1, 6,6, 19 
    ...... 

scala.NotImplementedError: No parse rules for TOK_DELETE_FROM: 
TOK_DELETE_FROM 1, 0,12, 12 
TOK_TABNAME 1, 4,6, 12 
    testdb 1, 4,4, 12 
    ...... 

Answer


You can run Hive via the command line from inside Scala:

import scala.sys.process._ 
val cmd = "hive -e \"delete from testdb.test where id=2\"" // Your command 
val output = cmd.!! // Captures the output 

Execute external command


What you provided is correct, and the command works when run from the command line, but it throws a ParseException when run from within the spark-shell: FAILED: ParseException line 1:3 cannot recognize input near '''' in switch database statement, followed by java.lang.RuntimeException: Nonzero exit value: 64 – sparkDabbler
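The ParseException reported in the comment is consistent with the embedded quotes in the single command string being re-tokenized before they reach hive. A hedged workaround sketch, assuming the hive CLI is on the PATH: pass the command as a Seq[String], so scala.sys.process hands each element to hive as one verbatim argument, with no shell quoting involved.

```scala
import scala.sys.process._

// Build the command as a Seq: each element becomes exactly one argument
// to the hive binary, so the SQL string needs no escaping at all.
val deleteSql = "delete from testdb.test where id=2"
val cmd = Seq("hive", "-e", deleteSql)

// Runs hive and captures its stdout; !! throws on a non-zero exit code.
// Requires the hive CLI to be installed and the target table to be
// ACID-enabled (transactional), as plain strings above are assumptions.
// val output = cmd.!!
```

The Seq form is generally safer than a single string whenever an argument contains spaces or quotes, because `"...".!!` splits the string on whitespace rather than interpreting quote characters.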