
Spark SQL - registered temp table not found

I run the following command:

spark-shell --packages datastax:spark-cassandra-connector:1.6.0-s_2.10 

Then I stop the context with:

sc.stop 

Then I run this code in the REPL:

val conf = new org.apache.spark.SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1") 
val sc = new org.apache.spark.SparkContext(conf) 
val sqlContext = new org.apache.spark.sql.SQLContext(sc) 
val cc = new org.apache.spark.sql.cassandra.CassandraSQLContext(sc) 

cc.setKeyspace("ksp") 

cc.sql("SELECT * FROM continents").registerTempTable("conts") 

val allContinents = sqlContext.sql("SELECT * FROM conts").collect 

I get:

org.apache.spark.sql.AnalysisException: Table not found: conts; 

The keyspace ksp and the table continents are defined in Cassandra, so I suspect the error is not coming from that side.

(Spark 1.6.0, 1.6.1)

Answer


Because you use different contexts to create the DataFrame and to run the SQL. In Spark 1.6 each SQLContext keeps its own temporary-table catalog, so a table registered via cc is invisible to sqlContext. Register and query through the same context:

val conf = new org.apache.spark.SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1") 
val sc = new org.apache.spark.SparkContext(conf) 
val sqlContext = new org.apache.spark.sql.SQLContext(sc) 
val cc = new org.apache.spark.sql.cassandra.CassandraSQLContext(sc) 

cc.setKeyspace("ksp") 

cc.sql("SELECT * FROM continents").registerTempTable("conts") 

// use cc instead of sqlContext 
val allContinents = cc.sql("SELECT * FROM conts").collect
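
You can confirm that the two catalogs really are separate by listing each context's temporary tables (tableNames has been available on SQLContext since Spark 1.3, and CassandraSQLContext extends SQLContext):

// "conts" shows up only in the catalog of the context it was registered on 
cc.tableNames().foreach(println)          // includes "conts" 
sqlContext.tableNames().foreach(println)  // does not include "conts"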
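
Alternatively, a sketch of the connector's DataFrames read path (assuming the org.apache.spark.sql.cassandra source shipped with connector 1.6): load the table through the plain sqlContext and skip CassandraSQLContext entirely, so everything lives in one catalog:

val df = sqlContext.read 
  .format("org.apache.spark.sql.cassandra") 
  .options(Map("table" -> "continents", "keyspace" -> "ksp")) 
  .load() 

// register and query on the same context 
df.registerTempTable("conts") 
val allContinents = sqlContext.sql("SELECT * FROM conts").collect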