
Spark Kafka - Issue while running from Eclipse IDE

I am experimenting with the Spark Kafka integration and want to test the code from my Eclipse IDE. However, I am getting the following error:

java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class 
    at kafka.utils.Pool.<init>(Pool.scala:28) 
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<init>(FetchRequestAndResponseStats.scala:60) 
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.<clinit>(FetchRequestAndResponseStats.scala) 
    at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39) 
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:52) 
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:345) 
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers$1.apply(KafkaCluster.scala:342) 
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) 
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35) 
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$$withBrokers(KafkaCluster.scala:342) 
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:125) 
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:112) 
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:403) 
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:532) 
    at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala) 
    at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.executeStreamingCalculations(SparkTelemetryReceiverFromKafkaStream.java:248) 
    at com.capiot.platform.spark.SparkTelemetryReceiverFromKafkaStream.main(SparkTelemetryReceiverFromKafkaStream.java:84) 

UPDATE: the versions I am using are:

  • Scala - 2.11
  • spark-streaming-kafka - 1.4.1
  • Spark - 1.4.1

Can anyone help resolve this? Thanks in advance.
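For context, the stack trace shows the failure happening inside KafkaUtils.createDirectStream while it contacts the brokers for partition metadata, which is when the Kafka classes (and the mismatched Scala collection classes) first get loaded. The OP's class is Java, but a minimal Scala sketch of the equivalent direct-stream setup (the broker address, topic name, and app name below are hypothetical) looks roughly like this:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object DirectStreamSketch {
      def main(args: Array[String]): Unit = {
        // Local mode so the job can run straight from the IDE.
        val conf = new SparkConf().setAppName("KafkaDirectSketch").setMaster("local[2]")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // Hypothetical broker list and topic; adjust to your setup.
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        val topics      = Set("telemetry")

        // This is the call that appears in the stack trace.
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        stream.map(_._2).print()
        ssc.start()
        ssc.awaitTermination()
      }
    }

The error itself is not in this code, though: a NoClassDefFoundError on a scala.collection class typically indicates that the Kafka/Spark-Kafka jars on the classpath were built for a different Scala binary version than the scala-library the project compiles against.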

Answer


This may be too late to help the OP, but when using Kafka streaming with Spark you need to make sure you use the correct jar files.

For example, in my case I have Scala 2.11 (the minimum required for the Spark 2.0 I'm using), and given that Spark-Kafka requires version 2.0.0, I had to use the artifact spark-streaming-kafka-0-8-assembly_2.11-2.0.0-preview.jar.

Notice that my Scala version and the artifact version can both be seen in the 2.11-2.0.0 part of the name.
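As a concrete sketch of keeping those two versions in line (assuming an sbt build; the equivalent coordinates work in Maven), the %% operator appends the project's Scala binary suffix automatically, so the connector cannot silently come in as a _2.10 artifact:

    // build.sbt -- minimal sketch, assuming the answerer's versions
    // (Scala 2.11, Spark 2.0.0, the 0.8 Kafka connector).
    scalaVersion := "2.11.8"

    val sparkVersion = "2.0.0"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"                % sparkVersion % "provided",
      "org.apache.spark" %% "spark-streaming"           % sparkVersion % "provided",
      // %% expands to spark-streaming-kafka-0-8_2.11, matching the
      // 2.11-2.0.0 pairing called out above.
      "org.apache.spark" %% "spark-streaming-kafka-0-8" % sparkVersion
    )

The assembly jar named above is what you would put on a spark-submit classpath; for a run from the IDE, pulling in the plain (non-assembly) artifact through the build tool as sketched here is usually enough.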

Hope this helps (someone).
