import org.apache.spark.sql.SparkSession

object Test {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("Spark SQL Example")
      .master("local")
      .getOrCreate()

    // val peopleDF = spark.read.json("yy/people.json")
    // peopleDF.write.parquet("people.parquet")

    val parquetFileDF = spark.read.parquet("people.parquet")
    parquetFileDF.createOrReplaceTempView("parquetFile")

    val namesDF = spark.sql("SELECT * FROM parquetFile")
    namesDF.show()

    // This statement fails to parse -- see the error output below.
    val namesDF1 = spark.sql("insert into TABLE parquetFile (idx, name, age) values (200, \"hello\", 78)")
  }
}
The code runs and produces the output below. It appears that INSERT INTO cannot take a column list before the values -- spark-sql throws a parse error on the insert:
16/09/12 20:50:22 INFO CodeGenerator: Code generated in 16.608273 ms
+----+---+-------+
| age|idx| name|
+----+---+-------+
|null|100|Michael|
| 30|200| Andy|
| 19|100| Justin|
+----+---+-------+
16/09/12 20:50:22 INFO SparkSqlParser: Parsing command: insert into TABLE parquetFile (idx, name, age) values (200, "hello", 78)
Exception in thread "main" org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'idx' expecting {'(', 'SELECT', 'FROM', 'VALUES', 'TABLE', 'INSERT', 'MAP', 'REDUCE'}(line 1, pos 31)
== SQL ==
insert into TABLE parquetFile (idx, name, age) values (200, "hello", 78)
-------------------------------^^^
at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:197)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:99)
at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
at Test$.main(Test.scala:32)
at Test.main(Test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
16/09/12 20:50:22 INFO SparkContext: Invoking stop() from shutdown hook
16/09/12 20:50:22 INFO SparkUI: Stopped Spark web UI at http://10.100.26.199:4040
16/09/12 20:50:22 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/09/12 20:50:22 INFO MemoryStore: MemoryStore cleared
16/09/12 20:50:22 INFO BlockManager: BlockManager stopped
16/09/12 20:50:22 INFO BlockManagerMaster: BlockManagerMaster stopped
16/09/12 20:50:22 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/09/12 20:50:22 INFO SparkContext: Successfully stopped SparkContext
16/09/12 20:50:22 INFO ShutdownHookManager: Shutdown hook called
16/09/12 20:50:22 INFO ShutdownHookManager: Deleting directory /tmp/spark-7229faa1-ed36-4989-a087-eb453e9f9295
Process finished with exit code 1
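The parse error comes from the Spark 2.0 SQL grammar, which does not accept a column list after the table name in INSERT INTO. A minimal sketch of the workaround, assuming the same people.parquet file and the column order age, idx, name that show() prints above: omit the column list and supply the values in the view's schema order.

```scala
import org.apache.spark.sql.SparkSession

object InsertWorkaround {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("Insert Workaround")
      .master("local")
      .getOrCreate()

    val parquetFileDF = spark.read.parquet("people.parquet")
    parquetFileDF.createOrReplaceTempView("parquetFile")

    // No column list: the values must follow the schema order shown by
    // show(), i.e. (age, idx, name). The string literal moves to the
    // position of the name column, and 78 to the age column.
    spark.sql("insert into TABLE parquetFile values (78, 200, \"hello\")")

    spark.stop()
  }
}
```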
Hi, thank you very much! I tried insert into TABLE parquetFile values (200, \"hello\", 78) and it works fine; as soon as I add the column names to the SQL, it throws this error. – SailingYang
Hi. I get exactly the same error in the same scenario. Did you manage to solve it? –
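If keeping track of positional column order by hand is error-prone, an alternative sketch (assuming the same age/idx/name schema as above) is to skip SQL INSERT entirely and append a small DataFrame with explicitly named columns via the DataFrameWriter:

```scala
import org.apache.spark.sql.SparkSession

object AppendByName {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("Append By Name")
      .master("local")
      .getOrCreate()
    import spark.implicits._

    // Build a one-row DataFrame with named columns matching the
    // existing schema, then append it to the Parquet data on disk.
    val newRow = Seq((78L, 200L, "hello")).toDF("age", "idx", "name")
    newRow.write.mode("append").parquet("people.parquet")

    spark.stop()
  }
}
```

Because the columns are named on the DataFrame itself, there is no column-list syntax for the parser to reject.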