2016-07-08 96 views

Solved: the line prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver") must be added to the connection properties. Spark - Exception in thread "main" java.sql.SQLException: No suitable driver

I want to launch Spark locally. I created a jar with all the dependencies using Maven.

This is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?> 
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 

    <groupId>com.agildata</groupId> 
    <artifactId>spark-rdd-dataframe-dataset</artifactId> 
    <packaging>jar</packaging> 
    <version>1.0</version> 

    <properties>  
     <exec-maven-plugin.version>1.4.0</exec-maven-plugin.version> 
     <spark.version>1.6.0</spark.version> 
    </properties> 

    <dependencies> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-core_2.11</artifactId> 
      <version>${spark.version}</version> 
     </dependency> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-sql_2.11</artifactId> 
      <version>${spark.version}</version> 
     </dependency> 

     <dependency> 
      <groupId>com.oracle</groupId> 
      <artifactId>ojdbc7</artifactId> 
      <version>12.1.0.2</version> 
     </dependency> 





    </dependencies> 

    <build> 
     <plugins> 

      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-compiler-plugin</artifactId> 
       <version>3.2</version> 
       <configuration> 
        <source>1.8</source> 
        <target>1.8</target> 
       </configuration> 
      </plugin> 

      <plugin> 
       <groupId>net.alchim31.maven</groupId> 
       <artifactId>scala-maven-plugin</artifactId> 
       <executions> 
        <execution> 
         <id>scala-compile-first</id> 
         <phase>process-resources</phase> 
         <goals> 
          <goal>add-source</goal> 
          <goal>compile</goal> 
         </goals> 
        </execution> 
        <execution> 
         <id>scala-test-compile</id> 
         <phase>process-test-resources</phase> 
         <goals> 
          <goal>testCompile</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 


      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-assembly-plugin</artifactId> 
       <version>2.4.1</version> 
       <configuration> 
        <!-- get all project dependencies --> 
        <descriptorRefs> 
         <descriptorRef>jar-with-dependencies</descriptorRef> 
        </descriptorRefs> 
        <!-- MainClass in mainfest make a executable jar --> 
        <archive> 
         <manifest> 
          <mainClass>example.dataframe.ScalaDataFrameExample</mainClass> 
         </manifest> 
        </archive> 

       </configuration> 
       <executions> 
        <execution> 
         <id>make-assembly</id> 
         <!-- bind to the packaging phase --> 
         <phase>package</phase> 
         <goals> 
          <goal>single</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 


     </plugins> 
    </build> 

</project> 

I run the mvn package command and the build is successful. After that I try to run the job like this:

    GMAC:bin gabor_dev$ sh spark-submit --class example.dataframe.ScalaDataFrameExample --master spark://QGMAC.local:7077 /Users/gabor_dev/IdeaProjects/dataframe/target/spark-rdd-dataframe-dataset-1.0-jar-with-dependencies.jar

but it throws this: Exception in thread "main" java.sql.SQLException: No suitable driver

The full error message:

16/07/08 13:09:22 INFO BlockManagerMaster: Registered BlockManager 
Exception in thread "main" java.sql.SQLException: No suitable driver 
    at java.sql.DriverManager.getDriver(DriverManager.java:315) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50) 
    at scala.Option.getOrElse(Option.scala:120) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnectionFactory(JdbcUtils.scala:49) 
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:120) 
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91) 
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:222) 
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:146) 
    at example.dataframe.ScalaDataFrameExample$.main(ScalaDataFrameExample.scala:30) 
    at example.dataframe.ScalaDataFrameExample.main(ScalaDataFrameExample.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
16/07/08 13:09:22 INFO SparkContext: Invoking stop() from shutdown hook 
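The top frame of the trace is the root cause: Spark's JdbcUtils asks java.sql.DriverManager for a driver matching the JDBC URL, and nothing registered on the driver's classpath accepts it. This can be reproduced standalone, without Spark, with a minimal sketch (class name and URL below are illustrative, not from the original code):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoDriverDemo {
    public static void main(String[] args) {
        try {
            // No Oracle JDBC driver is registered in this JVM, so
            // DriverManager cannot match the thin-driver URL.
            DriverManager.getDriver("jdbc:oracle:thin:@example.host:1526:SIDNAME");
            System.out.println("driver found");
        } catch (SQLException e) {
            // Same message as the Spark stack trace above.
            System.out.println(e.getMessage()); // prints "No suitable driver"
        }
    }
}
```

Having ojdbc7 inside the fat jar is not enough by itself: the driver class must be registered (or named explicitly) in the JVM that opens the connection.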

The interesting thing is that if I run it from IntelliJ IDEA's embedded console like this: mvn package exec:java -Dexec.mainClass=example.dataframe.ScalaDataFrameExample it runs without any error.

This is the relevant part of the Scala code:

    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    val url = "jdbc:oracle:thin:@xxx.xxx.xx:1526:SIDNAME"

    val prop = new java.util.Properties
    prop.setProperty("user", "usertst")
    prop.setProperty("password", "usertst")

    val people = sqlContext.read.jdbc(url, "table_name", prop)
    people.show()

I checked my jar file and it contains all the dependencies. Can anyone help me solve this? Thanks!

Copy-paste? – Vale

Copied. Please check the question. – solarenqu

And it works from IntelliJ IDEA, correct? Are you running this job on a cluster or locally? Locally. – Vale

Answer


So, the missing driver is the JDBC driver, and you have to add it to the Spark SQL configuration. You either do it at application submission, as prescribed by this answer, or you pass it through the Properties object you already use, with this line:

prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver") 
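The submit-time alternative mentioned above can be sketched as follows; the jar path here is illustrative, not taken from the question:

```shell
# Alternative: register the Oracle JDBC jar at submit time instead of in code.
# --driver-class-path puts it on the driver's classpath so DriverManager can
# find it; --jars ships it to the executors as well.
spark-submit \
  --class example.dataframe.ScalaDataFrameExample \
  --master spark://QGMAC.local:7077 \
  --driver-class-path /path/to/ojdbc7.jar \
  --jars /path/to/ojdbc7.jar \
  /Users/gabor_dev/IdeaProjects/dataframe/target/spark-rdd-dataframe-dataset-1.0-jar-with-dependencies.jar
```

Setting the "driver" property in code is usually the more portable fix, since it works regardless of how the application is launched.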
Kudos! Solved my problem :) – bigdatamann
