
I am using IntelliJ 2016.3. Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext

import sbt.Keys._
import sbt._

object ApplicationBuild extends Build {

  object Versions {
    val spark = "1.6.3"
  }

  val projectName = "example-spark"

  val common = Seq(
    version := "1.0",
    scalaVersion := "2.11.7"
  )

  val customLibraryDependencies = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-hive" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming" % Versions.spark % "provided",

    "org.apache.spark" %% "spark-streaming-kafka" % Versions.spark
      exclude("log4j", "log4j")
      exclude("org.spark-project.spark", "unused"),

    "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",

    "org.slf4j" % "slf4j-api" % "1.7.10",

    "org.slf4j" % "slf4j-log4j12" % "1.7.10"
      exclude("log4j", "log4j"),

    "log4j" % "log4j" % "1.2.17" % "provided",

    "org.scalatest" %% "scalatest" % "2.2.4" % "test"
  )
}
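For reference, settings like these are normally attached to a project definition in the old Build-trait style. The snippet above ends before that point, so the wiring below is an assumption, not part of the original file:

// Hypothetical project wiring (sbt 0.13 Build style); not shown in the original snippet.
lazy val example = Project(projectName, file("."))
  .settings(common: _*)
  .settings(libraryDependencies ++= customLibraryDependencies)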

I still get the runtime exception below, even though I have declared all the dependencies correctly as shown above (see the Libraries screen shot).

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext 
    at example.SparkSqlExample.main(SparkSqlExample.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SQLContext 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    ... 6 more 

I investigated this on the web and found that it is usually caused by missing entries in build.sbt or by version mismatches. But in my case everything looks fine, as shown above. Please suggest what I am doing wrong here.


Shouldn't you be using 'spark-sql_2.11' and the like for the other artifacts? – philantrovert


@philantrovert Since we use %% when declaring the dependencies, sbt is smart enough to append the Scala binary version after an underscore. Because we set scalaVersion := "2.11.7" above, sbt takes it as 2.11 and appends it, so the dependency resolves to spark-sql_2.11. – Mahesh
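To illustrate the %% expansion the comment describes (a minimal sketch; the coordinates match the build above):

// With scalaVersion := "2.11.7", these two declarations resolve to the same artifact:
"org.apache.spark" %% "spark-sql" % "1.6.3"       // %% appends the Scala binary version
"org.apache.spark" %  "spark-sql_2.11" % "1.6.3"  // explicit artifact name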

Answer


I guess it is because you marked your dependencies as "provided", but apparently you (or IDEA) do not actually provide them at runtime.

Try removing the "provided" qualifier, or (my preferred way) move the class with the main method to src/test/scala.
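A minimal sketch of the first option, dropping "provided" from the Spark modules so the IDE puts them on the run classpath (same coordinates as the build above):

// Without "provided", the Spark jars are included on the run classpath,
// so SQLContext is found when launching from IDEA. When you later submit
// to a cluster with spark-submit, switch back to "provided" so the
// assembled jar does not bundle Spark itself.
"org.apache.spark" %% "spark-core"      % Versions.spark,
"org.apache.spark" %% "spark-sql"       % Versions.spark,
"org.apache.spark" %% "spark-hive"      % Versions.spark,
"org.apache.spark" %% "spark-streaming" % Versions.spark,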
