2015-01-14

I am currently using Spark Streaming on a project. I run it with "spark-submit", but I hit this error: Spark Streaming StreamingContext.start() - Error starting receiver 0

15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError 
    at org.apache.spark.Logging$class.log(Logging.scala:52) 
    at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66) 
    at org.apache.spark.Logging$class.logInfo(Logging.scala:59) 
    at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66) 
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86) 
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121) 
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106) 
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264) 
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257) 
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121) 
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121) 
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62) 
    at org.apache.spark.scheduler.Task.run(Task.scala:54) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 

This is the code the error comes from; everything runs fine up until this point:

val Array(zkQuorum, group, topics, numThreads) = args
val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(2))
ssc.checkpoint("checkpoint")
.
.
.
ssc.start()
ssc.awaitTermination()

I have run the SparkPi example using "spark-submit" and it ran fine, so I can't figure out what is causing the problem in my application; ssc.start() seems to be where it fails. Any help would be greatly appreciated.


Sounds like a version problem. Check the Spark version on your cluster against the Spark version in your dependencies. – maasg


Everything seems to be in order: the Spark version is 1.1.0, and the spark-core, spark-streaming, and spark-streaming-kafka dependencies are all at 1.1.0 –


What is in the . . . part? If you say it's ssc.start() that fails, then it seems relevant to know what happens between the checkpoint and that call! –

Answer


From the java.lang.AbstractMethodError documentation:

Normally, this error is caught by the compiler; this error can only occur at run time if the definition of some class has incompatibly changed since the currently executing method was last compiled.

這意味着有編譯和運行時的依賴關係之間的版本不兼容。確保你對齊這些版本來解決這個問題。


That is, check your pom.xml or build.sbt and make sure the MLlib version matches your Spark version. – lythic