2017-05-15

I created the following test Scala program with IntelliJ (an sbt project). Running it fails with: Exception in thread "main" java.sql.SQLException: No suitable driver

import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf 
import java.sql._ 

object ConnTest extends App {
  val conf = new SparkConf()
  val sc = new SparkContext(conf.setAppName("Test").setMaster("local[*]"))
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  val jdbcSqlConn = "jdbc:sqlserver://...;databaseName=...;user=...;password=...;"
  val jdbcDf = sqlContext.read.format("jdbc").options(Map(
    "url" -> jdbcSqlConn,
    "dbtable" -> "table1"
  )).load()
  jdbcDf.show(10)

  sc.stop()
}

However, sbt package gives the following error. I have already downloaded the MS SQL Server driver (C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar) from the Microsoft website. How do I set the jar reference in the sbt project?

 
Exception in thread "main" java.sql.SQLException: No suitable driver 
     at java.sql.DriverManager.getDriver(Unknown Source) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84) 
     at scala.Option.getOrElse(Option.scala:121) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.(JDBCOptions.scala:83) 
     at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.(JDBCOptions.scala:34) 
     at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32) 
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330) 
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152) 
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125) 
     at ConnTest$.delayedEndpoint$ConnTest$1(main.scala:14) 
     at ConnTest$delayedInit$body.apply(main.scala:6) 
     at scala.Function0$class.apply$mcV$sp(Function0.scala:34) 
     at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12) 
     at scala.App$$anonfun$main$1.apply(App.scala:76) 
     at scala.App$$anonfun$main$1.apply(App.scala:76) 
     at scala.collection.immutable.List.foreach(List.scala:381) 
     at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35) 
     at scala.App$class.main(App.scala:76) 
     at ConnTest$.main(main.scala:6) 
     at ConnTest.main(main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) 
     at java.lang.reflect.Method.invoke(Unknown Source) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
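As context for the trace above (a sketch, not part of the original question): the exception is thrown by DriverManager when it cannot find a registered driver for the jdbc:sqlserver URL. Besides putting the jar on the classpath, Spark's JDBC data source also accepts an explicit "driver" option naming the driver class, which forces it to be loaded. The snippet below only builds the options Map; the host/credentials are placeholders, and actually calling load() requires Spark plus the driver jar on the classpath.

```scala
object JdbcOptionsSketch {
  // Placeholder connection string; substitute your own server/database/credentials.
  val jdbcSqlConn = "jdbc:sqlserver://host;databaseName=db;user=u;password=p;"

  // Adding the "driver" key makes Spark load the class explicitly instead of
  // relying on DriverManager's auto-discovery, which is what fails with
  // "No suitable driver".
  val jdbcOptions: Map[String, String] = Map(
    "url"     -> jdbcSqlConn,
    "dbtable" -> "table1",
    "driver"  -> "com.microsoft.sqlserver.jdbc.SQLServerDriver"
  )

  // Usage (requires Spark and sqljdbc42.jar on the classpath):
  //   sqlContext.read.format("jdbc").options(jdbcOptions).load()
}
```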

Put the jar in the lib folder. – prayagupd


There is no folder named "lib" in my project folder tree. There is a root folder "External Libraries" containing two subfolders, "< 1.8 >" and "scala-sdk-2.12.1". – ca9163d9


'External Libraries' is IntelliJ terminology. Create a folder 'lib' via the command line or IntelliJ, then copy in the jar you want. – prayagupd

Answer


Put any external jars in the lib folder; create lib if it does not exist (mkdir -p lib).

build.sbt 
lib/ 
    sqljdbc42.jar 
project/ 
src/ 

Alternatively, you can publish the jar to your local Ivy repository (~/.ivy).

Then you can simply start sbt console and verify that the jar is loaded.
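A third option, not mentioned in the answer above, is to let sbt manage the driver as a regular dependency in build.sbt. The Maven coordinates below are an assumption (Microsoft later published the driver to Maven Central as mssql-jdbc); check Maven Central for the version matching your JRE:

```scala
// build.sbt sketch — coordinates and version are assumptions, verify on Maven Central.
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.2.1.jre8"
```

With a managed dependency, sbt downloads the jar itself and the lib folder is not needed.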


I started with 'spark-shell --driver-class-path C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar'. After adding lib\....jar and rebuilding the program, do I still need to specify '--driver-class-path'? – ca9163d9


By the way, the Scala code does build after adding the lib\...jar file. However, 'spark-submit.cmd --class ConnTest --master local[4] .\target\scala-2.11\mpa_2.11-1.0.jar' still gives the same error? – ca9163d9
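Worth noting for the comment above (a sketch, not a reply from the thread): jars in lib/ go on sbt's compile classpath, but sbt package does not bundle them into the application jar, so at runtime spark-submit still has to be told where the driver is, for example with --jars. The paths below are the ones from the question:

```
rem Ship the JDBC driver to the driver and executors at submit time (Windows cmd).
spark-submit.cmd --class ConnTest --master local[4] ^
  --jars C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar ^
  .\target\scala-2.11\mpa_2.11-1.0.jar
```

An alternative is to build a fat jar (e.g. with sbt-assembly) so the driver classes are inside the application jar itself.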
