How do I correctly use Spark in ScalaTest tests?

I have multiple ScalaTest classes that use BeforeAndAfterAll to construct a SparkContext and then stop it, like this:
```scala
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override protected def beforeAll(): Unit = {
    sc = ... // Create SparkContext
  }

  override protected def afterAll(): Unit = {
    sc.stop()
  }

  // my tests follow
}
```
These tests run fine when launched from IntelliJ IDEA, but when I run sbt test I get

WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).

followed by what I suspect are other exceptions related to this problem.
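(This warning typically appears when sbt runs test suites in parallel inside a single JVM, so more than one suite tries to build a SparkContext at the same time. A minimal sketch of build settings that work around this, assuming a standard build.sbt; the diagnosis is my assumption, not something stated above:)

```scala
// build.sbt (sketch): run test suites one at a time so only one
// SparkContext is ever constructed per JVM, and fork the tests
// into their own JVM, separate from sbt itself.
Test / parallelExecution := false
Test / fork := true
```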
How do I use Spark correctly here? Do I have to create one global SparkContext for the entire test suite, and if so, how do I do that?
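For illustration, a minimal sketch of sharing a single SparkContext across all suites via a helper trait; the SharedSparkContext name, the local[*] master, and the app name are assumptions for the sketch, not part of my code:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{FlatSpec, Matchers, Suite}

// Sketch only: SharedSparkContext is a hypothetical helper, not an
// API of Spark or ScalaTest.
object SharedSparkContext {
  // Built lazily, once per JVM, the first time any suite touches it.
  // It is simply left running until the JVM exits.
  lazy val sc: SparkContext = {
    val conf = new SparkConf()
      .setMaster("local[*]")                 // assumption: local test master
      .setAppName("scalatest-shared-spark")  // assumption: arbitrary app name
    new SparkContext(conf)
  }
}

trait SharedSparkContext { self: Suite =>
  def sc: SparkContext = SharedSparkContext.sc
}

class MyTest extends FlatSpec with Matchers with SharedSparkContext {
  "parallelize" should "count the elements of a small collection" in {
    sc.parallelize(1 to 10).count() shouldBe 10L
  }
}
```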
I found [(PDF) Testing Spark Best Practices](https://spark-summit.org/2014/wp-content/uploads/2014/06/Testing-Spark-Best-Practices-Anupama-Shetty-Neil-Marshall.pdf) – Daenyth