2015-10-20

How do I use Spark correctly in ScalaTest tests?

I have multiple ScalaTest classes that use BeforeAndAfterAll to construct a SparkContext and stop it afterwards, like this:

    class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {

        private var sc: SparkContext = null

        override protected def beforeAll(): Unit = {
            sc = ... // Create SparkContext
        }

        override protected def afterAll(): Unit = {
            sc.stop()
        }

        // my tests follow
    }

These tests run fine when launched from IntelliJ IDEA, but when I run `sbt test` I get `WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).`, followed by what I assume are further exceptions caused by the same problem.

How do I use Spark correctly here? Do I have to create one global SparkContext for the entire test suite, and if so, how?
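One common way to share a single context across all suites in the JVM is a lazily initialized singleton that every test class reaches through a mixin. The sketch below shows the pattern itself; it uses a hypothetical stand-in `Context` class instead of a real `SparkContext` so it is self-contained, but `SharedContext.ctx` would hold the `SparkContext` in practice:

```scala
// Sketch of a process-wide shared context. `Context` is a stand-in for
// SparkContext (hypothetical, for illustration only).
final class Context {
  @volatile private var stopped = false
  def isStopped: Boolean = stopped
  def stop(): Unit = stopped = true
}

object SharedContext {
  // Created once per JVM on first access; every suite reuses the same instance.
  lazy val ctx: Context = {
    val c = new Context
    // Stop it when the JVM (i.e. the whole `sbt test` run) exits,
    // instead of in each suite's afterAll.
    sys.addShutdownHook(c.stop())
    c
  }
}

trait SharedContext {
  // What a suite calls instead of constructing its own context.
  def sc: Context = SharedContext.ctx
}
```

A test class would then mix this in (`class MyTest extends FlatSpec with Matchers with SharedContext`) and drop its own `beforeAll`/`afterAll`, so only one context ever exists no matter how many suites run.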


I found [(PDF) Testing Spark Best Practices](https://spark-summit.org/2014/wp-content/uploads/2014/06/Testing-Spark-Best-Practices-Anupama-Shetty-Neil-Marshall.pdf) – Daenyth

Answer


It seems I couldn't see the forest for the trees: I had forgotten the following line in my build.sbt:

parallelExecution in test := false 

With this line, the tests pass.
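For context, the setting sits alongside the other test configuration in build.sbt; a minimal sketch (project name and library versions are illustrative placeholders, not from the question):

```scala
// build.sbt (sketch; names and versions are placeholders)
name := "my-spark-tests"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1" % Test,
  "org.scalatest"    %% "scalatest"  % "2.2.4" % Test
)

// Run test suites sequentially, so at most one SparkContext
// is ever alive in the test JVM at a time.
parallelExecution in test := false
```

Without this setting, sbt runs suites from different classes in parallel, so two `beforeAll` blocks can race to construct a SparkContext in the same JVM, which triggers the SPARK-2243 warning.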
