
I am trying to run a simple Kafka Spark Streaming example, and this is the error I get.

16/10/02 20:45:43 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
    at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
    at com.application.SparkConsumer.App.main(App.java:27)

I am setting up this example with the pom below. I tried to track down the missing scala.Predef class and added the spark-streaming-kafka-0-8-assembly dependency for it; when I explored that jar, I could see the class.

<dependency> 
    <groupId>org.apache.kafka</groupId> 
    <artifactId>kafka_2.11</artifactId> 
    <version>0.8.2.0</version> 
</dependency> 
<dependency> 
    <groupId>org.apache.kafka</groupId> 
    <artifactId>kafka-clients</artifactId> 
    <version>0.8.2.0</version> 
</dependency> 
<dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-core_2.11</artifactId> 
     <version>2.0.0</version> 
     <scope>provided</scope> 
</dependency> 
<dependency> 
     <groupId>org.apache.spark</groupId> 
     <artifactId>spark-streaming_2.11</artifactId> 
     <version>2.0.0</version> 
     <scope>provided</scope> 
</dependency> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId> 
    <version>2.0.0</version> 
</dependency> 
<dependency> 
    <groupId>org.scala-lang</groupId> 
    <artifactId>scala-library</artifactId> 
    <version>2.11.0</version> 
</dependency> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId> 
    <version>2.0.0</version> 
</dependency> 

I have already tried a simple Spark word-count example and it works fine. The trouble starts when I use spark-streaming-kafka. I tried looking up this error, but had no luck.

Here is the code snippet.

import java.util.HashMap;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

SparkConf sparkConf = new SparkConf().setAppName("someapp").setMaster("local[2]");
// Create the context with a 2-second batch size
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(2000));

int numThreads = Integer.parseInt(args[3]);
Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put("fast-messages", 1);
Map<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", "localhost:9092");
// Receiver-based stream that connects through ZooKeeper ("zoo1");
// note that kafkaParams is not passed to this createStream overload
JavaPairReceiverInputDStream<String, String> messages =
        KafkaUtils.createStream(jssc, "zoo1", "my-consumer-group", topicMap);
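
The snippet above only builds the DStream; nothing runs until the streaming context is started. A minimal sketch of how the rest of main might look (the print() call is just illustrative, and in Spark 2.0 awaitTermination() is declared to throw InterruptedException, so main has to declare or handle it):

    messages.print();          // dump a few records of each batch to stdout
    jssc.start();              // start receiving and processing data
    jssc.awaitTermination();   // block until the job is stopped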

Answer


There seems to be a problem when using the 2.11 build of Kafka 0.8.2.0. After switching to 2.10, it works fine.
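For anyone hitting the same NoSuchMethodError, a sketch of what that switch might look like in the pom above, assuming every _2.11 artifact is replaced by its _2.10 counterpart (the scala-library version 2.10.6 is only an example):

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.10</artifactId>
    <version>2.0.0</version>
</dependency>
<!-- likewise spark-core_2.10, spark-streaming_2.10 and the -assembly artifact -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.6</version>
</dependency>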
