2014-07-20

Apache Spark - java.lang.NoSuchMethodError: breeze.linalg.DenseVector

I'm running into a problem with Apache Spark 1.0.1 inside a Play! application. Specifically, I'm trying to run Spark from within the Play! app and use some of the basic machine learning that ships with Spark.

Here is how my application creates the SparkContext:

def sparkFactory: SparkContext = {
  val logFile = "public/README.md" // Should be some file on your system
  val driverHost = "localhost"
  val conf = new SparkConf(false) // skip loading external settings
    .setMaster("local[4]") // run locally with enough threads
    .setAppName("firstSparkApp")
    .set("spark.logConf", "true")
    .set("spark.driver.host", driverHost)
  new SparkContext(conf)
}
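For context, a minimal sketch of how such a factory might be used from an action (hypothetical; assumes the `sparkFactory` above and that `public/README.md` exists). Stopping the context afterwards matters in a Play! app, since only one active SparkContext is allowed per JVM:

```scala
val sc = sparkFactory
try {
  val lines = sc.textFile("public/README.md")
  println(s"Line count: ${lines.count()}")
} finally {
  sc.stop() // release the context so a later request can create a new one
}
```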

And here is the error I get when I try to do some basic computations on a tall-and-skinny matrix:

[error] o.a.s.e.ExecutorUncaughtExceptionHandler - Uncaught exception in thread Thread[Executor task launch worker-3,5,main] 
java.lang.NoSuchMethodError: breeze.linalg.DenseVector$.dv_v_ZeroIdempotent_InPlaceOp_Double_OpAdd()Lbreeze/linalg/operators/BinaryUpdateRegistry; 
    at org.apache.spark.mllib.linalg.distributed.RowMatrix$$anonfun$5.apply(RowMatrix.scala:313) ~[spark-mllib_2.10-1.0.1.jar:1.0.1] 
    at org.apache.spark.mllib.linalg.distributed.RowMatrix$$anonfun$5.apply(RowMatrix.scala:313) ~[spark-mllib_2.10-1.0.1.jar:1.0.1] 
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:144) ~[scala-library-2.10.4.jar:na] 
    at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:144) ~[scala-library-2.10.4.jar:na] 
    at scala.collection.Iterator$class.foreach(Iterator.scala:727) ~[scala-library-2.10.4.jar:na] 
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) ~[scala-library-2.10.4.jar:na] 

The error above is triggered by the following:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix

def computePrincipalComponents(datasetId: String) = Action {
  val datapoints = DataPoint.listByDataset(datasetId)

  // load the data into spark
  val rows = datapoints.map(_.data).map { row =>
    row.map(_.toDouble)
  }
  val RDDRows = WorkingSpark.context.makeRDD(rows).map { line =>
    Vectors.dense(line)
  }

  val mat = new RowMatrix(RDDRows)
  val result = mat.computePrincipalComponents(mat.numCols().toInt)

  Ok(result.toString)
}

It looks like a dependency problem, but I have no idea where it starts. Any ideas?

Answer

Ah, this was indeed caused by a dependency conflict. Apparently the newer Spark uses Breeze methods that weren't available in the version I was pulling in. By removing Breeze from my Play! build file I was able to run the function above just fine.
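A sketch of what the resulting build.sbt dependencies might look like (hypothetical; the exact coordinates in your build may differ). The point is to declare no explicit `"org.scalanlp" %% "breeze"` dependency and let spark-mllib pull in the Breeze version it was compiled against:

```scala
// build.sbt (sketch) — versions taken from the stack trace above
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.0.1",
  "org.apache.spark" %% "spark-mllib" % "1.0.1"
  // the previously declared Breeze dependency was removed here,
  // which resolved the NoSuchMethodError
)
```

Running `sbt dependencyTree` (via the sbt-dependency-graph plugin) is one way to confirm which Breeze version ends up on the classpath.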

For those interested, here is the output:

-0.23490049167080018 0.4371989078912155 0.5344916752692394 ... (6 total) 
-0.43624389448418854 0.531880914138611  0.1854269324452522 ... 
-0.5312372137092107 0.17954211389001487 -0.456583286485726 ... 
-0.5172743086226219 -0.2726152326516076 -0.36740474569706394 ... 
-0.3996400343756039 -0.5147253632175663 0.303449047782936  ... 
-0.21216780828347453 -0.39301803119012546 0.4943679121187219 ...