
I am getting a Module not found error in Scala. I am trying to make a JDBC connection to Oracle, join two tables, and then print the result.

My Scala file:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext

object sparkJDBC {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
      .setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    // load each Oracle table through the jdbc data source
    val chrttype = sqlContext.load("jdbc",
      Map("url" -> "jdbc:oracle:thin:gductv1/[email protected]//localhost:1521/XE",
        "dbtable" -> "chrt_typ"))
    val clntlvl1 = sqlContext.load("jdbc",
      Map("url" -> "jdbc:oracle:thin:gductv1/[email protected]//localhost:1521/XE",
        "dbtable" -> "clnt_lvl1"))
    // join the two tables on the chart-type key
    val join2 = chrttype.join(clntlvl1, chrttype.col("chrt_typ_key") === clntlvl1("lvl1_key"))
    join2.foreach(println)
    join2.printSchema()
  }
}

My build.sbt file:

name := "sparkJDBC" 
    version := "0.1" 
    scalaVersion := "2.11.7" 

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" 
    libraryDependencies += "org.apache.tika" % "tika-core" % "1.11" 
    libraryDependencies += "org.apache.tika" % "tika-parsers" % "1.11" 
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.1" 
    libraryDependencies += "org.apache.spark" % "spark-sql" % "1.0.0" 

The error output is:

[warn] module not found: org.apache.spark#spark-sql;1.0.0 
[warn] ==== local: tried 
[warn] C:\Users\.ivy2\local\org.apache.spark\spark-sql\1.0.0\ivys\ivy.xml 
[warn] ==== public: tried 
[warn] https://repo1.maven.org/maven2/org/apache/spark/spark-sql/1.0.0/spark-sql-1.0.0.pom 
[info] Resolving jline#jline;2.12.1 ... 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] ::   UNRESOLVED DEPENDENCIES   :: 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] :: org.apache.spark#spark-sql;1.0.0: not found 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 

[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.0.0: not found 

Please help me find out what is causing this.


Question: doesn't the current resolution cover what you asked? –

Answer


To make sure you use a correct dependency, you can check a site like mvnrepository: https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10/1.0.0

libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.0.0" 

That is a great pointer to the dependency, Thomas. I no longer have the 'Module not found' problem, but now I get: [error] Modules were resolved with conflicting cross-version suffixes in {file:/C:/apps/spark-2.1.0/ScalaFiles/}scalafiles: [error] org.json4s:json4s-ast _2.11, _2.10 [error] com.twitter:chill _2.11, _2.10 [error] org.json4s:json4s-jackson _2.11, _2.10 [error] org.json4s:json4s-core _2.11, _2.10 [error] org.apache.spark:spark-core _2.11, _2.10 –
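(A minimal build.sbt sketch that keeps every Spark artifact on one Scala version, assuming the Spark 1.5.1 release from the question's spark-core entry; %% makes sbt append the project's own Scala suffix, so the _2.10/_2.11 mix above cannot occur:)

name := "sparkJDBC"
version := "0.1"
scalaVersion := "2.11.7"

// %% appends _2.11 automatically, so spark-core and spark-sql
// always resolve against the same Scala binary version
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"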


I removed the Scala version line (scalaVersion := "2.11.7") from the .sbt file, and then it did not give me any conflicting cross-version errors. Now it gives me some SQLContext errors, but it is past the first hurdle. Thanks to Thomas and Alexey for the quick responses. –
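(For anyone landing here with the same SQLContext errors: with scalaVersion removed, sbt falls back to its default Scala 2.10, so %% now pulls _2.10 artifacts, and mixing spark-core 1.5.1 with spark-sql 1.0.0 would likely explain those errors. Also, sqlContext.load("jdbc", ...) has been deprecated since Spark 1.4; a minimal sketch of the replacement DataFrameReader API follows, assuming an sqlContext built as in the question, placeholder credentials, and the Oracle JDBC driver jar on the classpath:)

// sketch: read the two Oracle tables through the jdbc data source
val jdbcUrl = "jdbc:oracle:thin:user/password@//localhost:1521/XE" // placeholder credentials

val chrttype = sqlContext.read.format("jdbc")
  .options(Map(
    "url" -> jdbcUrl,
    "driver" -> "oracle.jdbc.OracleDriver", // assumes the ojdbc jar is on the classpath
    "dbtable" -> "chrt_typ"))
  .load()

val clntlvl1 = sqlContext.read.format("jdbc")
  .options(Map(
    "url" -> jdbcUrl,
    "driver" -> "oracle.jdbc.OracleDriver",
    "dbtable" -> "clnt_lvl1"))
  .load()

// same join as in the question, then print a sample of rows
val join2 = chrttype.join(clntlvl1, chrttype.col("chrt_typ_key") === clntlvl1.col("lvl1_key"))
join2.show()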