How to create a SQLContext in Spark using Scala

2015-12-21

I am writing a Scala program, built with sbt, that uses Spark's SQLContext. Here is my build.sbt:

name := "sampleScalaProject" 

version := "1.0" 

scalaVersion := "2.11.7" 
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2" 
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2" 
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2" 
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2" 
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2" 
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" 

And this is the test program:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }

}

I get the following error:

Error:(8, 26) overloaded method constructor SQLContext with alternatives: 
    (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and> 
    (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext 
cannot be applied to (org.apache.spark.SparkContext.type) 
    val sqlcontexttest = new SQLContext(sc) 

Could anyone please let me know what the problem is? I am very new to Scala and Spark programming. Thanks in advance.

Answers


You need new SparkContext: as written, sc refers to the SparkContext companion object (note the org.apache.spark.SparkContext.type in the error message), not an instance, so neither constructor overload matches. Instantiating it should solve the problem.
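For example, a minimal sketch of the corrected program (the setMaster value is illustrative and not from the original post; the app name reuses the project name from build.sbt):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]) {
    // Instantiate a SparkContext instead of referencing the companion object.
    val conf = new SparkConf().setAppName("sampleScalaProject").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlcontext = new SQLContext(sc)
  }

}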


Thanks Justin, it was that simple. – Aman

import org.apache.spark.{SparkConf, SparkContext}

// Create the SparkContext from an explicit SparkConf, then wrap it in a SQLContext.
val conf = new SparkConf().setAppName("SparkJoins").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
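Once you have the SQLContext you can load data and run SQL against it. A minimal sketch using the Spark 1.x DataFrame API, assuming a JSON file named people.json (a placeholder, not from the original post):

// Hypothetical usage: read a JSON file into a DataFrame and query it.
val df = sqlContext.read.json("people.json") // "people.json" is a placeholder path
df.registerTempTable("people")
sqlContext.sql("SELECT * FROM people").show()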