So, thanks to easily googleable blogs, I tried the following to suppress Spark logging in my unit tests:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger and return the previous levels.
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] = loggers.map { loggerName =>
    val logger = Logger.getLogger(loggerName)
    val prevLevel = logger.getLevel
    logger.setLevel(level)
    loggerName -> prevLevel
  }.toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}
Unfortunately it doesn't work; I still get a lot of Spark output, e.g.:
14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4
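A plausible reason the snippet above still prints INFO lines is that the logger names have to match the packages Spark actually logs under ("org.apache.spark" rather than just "spark"). Below is a minimal sketch, not the original poster's code, assuming log4j 1.x (as bundled with Spark 1.x) and the package names org.apache.spark, org.eclipse.jetty and akka; another commonly cited alternative is to put a log4j.properties file on the test classpath (e.g. under src/test/resources).

import org.apache.log4j.{Level, Logger}

object TestLogging {
  // Quiet the noisiest loggers before the SparkContext is created.
  // The names below are assumptions about what Spark 1.x components log under.
  def quietSparkLogging(): Unit = {
    Seq("org.apache.spark", "org.eclipse.jetty", "akka").foreach { name =>
      Logger.getLogger(name).setLevel(Level.WARN)
    }
    // The root logger catches anything not covered by the names above.
    Logger.getRootLogger.setLevel(Level.WARN)
  }
}

Calling something like TestLogging.quietSparkLogging() at the top of the Specification, before new SparkContext(...), should take effect before Spark starts emitting its startup INFO messages.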
Works with SBT, specs2 – samthebest 2014-12-10 10:36:52
Thanks @Emre. It works like a charm in IntelliJ IDEA with Java. – 2017-01-21 06:46:28