Spark unit testing

My entire build.sbt is:
name := """sparktest"""
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.11.8"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8", "-Xexperimental")
parallelExecution in Test := false
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.2",
"org.apache.spark" %% "spark-sql" % "2.0.2",
"org.apache.avro" % "avro" % "1.8.1",
"org.scalatest" %% "scalatest" % "3.0.1" % "test",
"com.holdenkarau" %% "spark-testing-base" % "2.0.2_0.4.7" % "test"
)
I have a simple test. Obviously this is just a starting point, and I want to test more:
package sparktest
import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.FunSuite
class SampleSuite extends FunSuite with DataFrameSuiteBase {
test("simple test") {
assert(1 + 1 === 2)
}
}
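(Beyond that starting point, the kind of test I would like to get to is roughly the sketch below. It assumes the sqlContext and assertDataFrameEquals helpers exposed by DataFrameSuiteBase in spark-testing-base 0.4.7; the class name and sample data are made up for illustration.)

package sparktest

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.FunSuite

class DataFrameSampleSuite extends FunSuite with DataFrameSuiteBase {
  test("two identical DataFrames compare equal") {
    // sqlContext is provided by DataFrameSuiteBase; the import enables .toDF on local Seqs
    val sqlCtx = sqlContext
    import sqlCtx.implicits._

    val expected = Seq(("a", 1), ("b", 2)).toDF("key", "value")
    val result = Seq(("a", 1), ("b", 2)).toDF("key", "value")

    // assertDataFrameEquals is supplied by DataFrameSuiteBase and compares the two row by row
    assertDataFrameEquals(expected, result)
  }
}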
I run sbt clean test and get this failure:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
For my development environment I am using spark-2.0.2-bin-hadoop2.7.tar.gz.
Do I have to configure this environment in some way? Obviously HiveConf is a transitive Spark dependency.
I think you have to explicitly add "org.apache.spark" %% "spark-hive" % "2.0.2" to your dependencies. –
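If that suggestion is right, the dependency block would look roughly like the following (a sketch of the same build.sbt with only the spark-hive line added; spark-hive is what brings the Hive classes, including org.apache.hadoop.hive.conf.HiveConf, onto the test classpath):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.2",
  "org.apache.spark" %% "spark-sql" % "2.0.2",
  // spark-hive pulls in HiveConf, the class the ClassNotFoundException complains about
  "org.apache.spark" %% "spark-hive" % "2.0.2",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test",
  "com.holdenkarau" %% "spark-testing-base" % "2.0.2_0.4.7" % "test"
)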