Load a SparkR DataFrame into Hive

I need to load a DataFrame created in SparkR into a Hive table.
# created a DataFrame df_test
df_test <- createDataFrame(sqlContext, data.frame(mon = c(1,2,3,4,5), year = c(2011,2012,2013,2014,2015)))

# initialized the Hive context
sc <- sparkR.init()
hiveContext <- sparkRHive.init(sc)

# used the saveAsTable function to save the DataFrame "df_test" in a Hive table named "table_hive"
saveAsTable(df_test, "table_hive")
16/08/24 23:08:36 ERROR RBackendHandler: saveAsTable on 13 failed
Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
  java.lang.RuntimeException: Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead.
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.execution.SparkStrategies$DDLStrategy$.apply(SparkStrategies.scala:392)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:47)
    at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:45)
    at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:52)
    at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:52)
    at org.apache.spark.sql.execution
The above error is thrown. Please help.
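Based on the error message, my guess is that the problem is that df_test was created through sqlContext, so saveAsTable is only allowed to create a temporary table. Below is what I think should work instead, creating the DataFrame against the Hive context from the start; I have not verified it, and I am assuming the SparkR 1.x API (sparkR.init, sparkRHive.init, and the two-argument createDataFrame):

```r
library(SparkR)

# initialize a Hive context instead of relying on the default sqlContext
sc <- sparkR.init()
hiveContext <- sparkRHive.init(sc)

# create the DataFrame against the Hive context, so that saveAsTable
# is permitted to create a persistent (non-TEMPORARY) table
df_test <- createDataFrame(hiveContext, data.frame(mon = c(1,2,3,4,5),
                                                   year = c(2011,2012,2013,2014,2015)))

# with a Hive-backed DataFrame, saveAsTable should write to the Hive metastore
saveAsTable(df_test, "table_hive")
```

Is recreating the DataFrame with hiveContext the right approach, or is there a way to hand an existing sqlContext DataFrame over to Hive?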