2016-11-23

I am getting a runtime error when running the Java code below, using Kafka with Spark Streaming: NoClassDefFoundError: org.apache.spark.internal.Logging.

Is there some dependency I have to include for logging, such as log4j?

Why doesn't this error appear at compile time? That would make it much easier to fix.
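As background for why this only surfaces at runtime: javac verifies classes against the compile-time classpath, while the JVM resolves each class the first time it is actually needed, so a jar that is present when compiling but missing when running fails only then. A minimal, Spark-free sketch (looking up the same class name, which is expected to be absent here):

```java
// Sketch: the compiler never sees this name at all (it is a string),
// and even a compile-time reference would only fail at run time if the
// jar is missing from the runtime classpath.
public class RuntimeLoadingDemo {
    public static void main(String[] args) {
        try {
            // Present only if a Spark 2.x jar is on the runtime classpath.
            Class.forName("org.apache.spark.internal.Logging");
            System.out.println("class found");
        } catch (ClassNotFoundException e) {
            // The JVM reports the failure only when the class is first needed.
            System.out.println("not found at runtime: " + e.getMessage());
        }
    }
}
```

A compile-time reference to the class behaves the same way, except the JVM throws `NoClassDefFoundError` (wrapping a `ClassNotFoundException`, as in the stack trace below) instead of the checked exception.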

Here is my Java code:

import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

import scala.Tuple2;

SparkConf sparkConf = new SparkConf().setAppName("JavaKafkaWordCount11").setMaster("local[*]");
sparkConf.set("spark.streaming.concurrentJobs", "3");

// Create the context with a 3-second batch interval
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(3000));

Map<String, Object> kafkaParams = new HashMap<>();
kafkaParams.put("bootstrap.servers", "x.xx.xxx.xxx:9092");
kafkaParams.put("key.deserializer", StringDeserializer.class);
kafkaParams.put("value.deserializer", StringDeserializer.class);
kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
kafkaParams.put("auto.offset.reset", "latest");
kafkaParams.put("enable.auto.commit", true);

Collection<String> topics = Arrays.asList("topicName");

final JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(jssc,
        LocationStrategies.PreferConsistent(),
        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

// Keep the result and add an output operation; DStream transformations
// are lazy, so nothing runs without one.
JavaPairDStream<String, String> pairs = stream.mapToPair(
        new PairFunction<ConsumerRecord<String, String>, String, String>() {
            @Override
            public Tuple2<String, String> call(ConsumerRecord<String, String> record) throws Exception {
                System.out.println("file data");
                return new Tuple2<>(record.key(), record.value());
            }
        });
pairs.print();

// The streaming context must be started and kept alive.
jssc.start();
jssc.awaitTermination();

Dependencies used:

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<!--
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.10.0.1</version>
</dependency>
-->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.10</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-twitter_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.restlet.jee</groupId>
    <artifactId>org.restlet</artifactId>
    <version>2.0.10</version>
</dependency>

I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.spark.internal.Logging 
    at java.lang.ClassLoader.defineClassImpl(Native Method) 
    at java.lang.ClassLoader.defineClass(ClassLoader.java:346) 
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:154) 
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:727) 
    at java.net.URLClassLoader.access$400(URLClassLoader.java:95) 
    at java.net.URLClassLoader$ClassFinder.run(URLClassLoader.java:1182) 
    at java.security.AccessController.doPrivileged(AccessController.java:686) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:602) 
    at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:846) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:825) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:325) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:805) 
    at org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe(ConsumerStrategy.scala) 
    at spark.KafkaConsumerDirectStream.main(KafkaConsumerDirectStream.java:45) 
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:607) 
    at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:846) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:825) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:325) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:805) 
    ... 14 more 

This problem usually comes from a dependency missing from the classpath. There are many ways to set the classpath; can you tell us which method you are using? –
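One quick way to check what is actually on the runtime classpath is to print it from the JVM itself and look for the Spark 2.x jars (which are what contain org.apache.spark.internal.Logging). A minimal, Spark-free sketch:

```java
import java.io.File;

// Sketch: dump each runtime classpath entry on its own line so you can
// grep for the spark-core jar and check its version.
public class ClasspathDump {
    public static void main(String[] args) {
        String cp = System.getProperty("java.class.path");
        System.out.println("java.class.path entries:");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println("  " + entry);
        }
    }
}
```

With Maven, inspecting the resolved dependency tree (for example via the dependency plugin) answers the same question before the JVM ever starts.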


The code above is in the main method. –

Answers


The problem was resolved by changing the dependencies mentioned above to the following:

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.10</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>2.0.0</version>
</dependency>
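The root cause was mixing 1.6.2 and 2.0.0 Spark artifacts on one classpath; the fix above aligns them all on 2.0.0. One way to keep them from drifting apart again is a shared version property in the POM. A minimal sketch (the property names here are my own choice, not from the question):

```xml
<properties>
    <spark.version>2.0.0</spark.version>
    <scala.binary.version>2.10</scala.binary.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

Bumping `spark.version` in one place then upgrades every Spark module together.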

import org.apache.spark.Logging is available in Spark version 1.5.2 and later, so I suggest you use Spark 1.5.2 or later.

Which Spark version are you using?

There is another dependency that resolves this problem and is compatible with Spark 2.x.

For SBT, use this dependency:

"org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"


Please see the versions and dependencies in my updated question. –
