Apache Spark: "local class incompatible" error when starting a SparkContext

Hi, I'm getting a "local class incompatible" error while following the self-contained application example at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications.

Spark version of the cluster running the example (as shown on the Spark web UI): 1.4.0

sbt version: 0.13.8

I then ran the command "sbt run" and got the error "java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible".

The application fails at "val sc = new SparkContext(conf)", when the SparkContext is created. I searched around and saw this post, but I'm not using hadoop-client.

Could you take a look? My guess is a version problem in build.sbt. Thank you very much.

Update: I've tried submitting a Python application and it works fine, which means the Spark cluster itself is OK.
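For reference, one way to narrow this down further is to print the Spark version the driver is actually linked against. A minimal sketch, assuming sc.version is available in your Spark build and using a local master so no cluster connection is attempted (the VersionCheck name is just for illustration):

/* VersionCheck.scala -- illustrative sketch only */
import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]) {
    // Local master: runs entirely on the driver, without touching the cluster.
    val conf = new SparkConf().setAppName("Version Check").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // Prints the version of spark-core on the driver's classpath;
    // if it doesn't match the cluster's 1.4.0, that's the mismatch.
    println("Driver Spark version: " + sc.version)
    sc.stop()
  }
}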

The Scala code is below:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

build.sbt is below:

name := "Simple Project" 

version := "1.0" 

scalaVersion := "2.10.4" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" 

The error message is below:

15/08/27 05:23:38 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400 
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400 
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617) 
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622) 
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136) 
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) 
    at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136) 
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.serialization.Serialization.deserialize(Serialization.scala:98) 
    at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63) 
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.serialization.Serialization.deserialize(Serialization.scala:98) 
    at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23) 
    at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58) 
    at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58) 
    at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76) 
    at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937) 
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465) 
    at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415) 
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
    at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
    at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) 
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 

Answer

You say you're running a Spark 1.4.0 cluster, but your build.sbt is building against 1.2.0. That's exactly what the serialVersionUID mismatch in the stack trace means: the driver and the cluster are deserializing two different builds of the same class. Please change this in your build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" 
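For completeness, here is a sketch of the full build.sbt with that change applied (the same file as above, only the Spark version bumped). Note that the %% operator appends your Scala binary version to the artifact name, so scalaVersion should also match what the cluster's Spark was built with; the pre-built 1.4.0 distributions use Scala 2.10, so 2.10.4 is consistent:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// Must match the Spark version deployed on the cluster (1.4.0 here).
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"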

Oh, you're right, thank you so much... shame on me... haha – keypoint


No prob, no point banging your head against the wall ;-) –