Class not found error when integrating Spark Streaming and Apache Ignite

Asked 2015-12-16

I am trying to integrate Apache Ignite into an existing Spark Streaming project, written in Java, that counts the words in a text file. However, when I add the ignite-spark dependency, I get a class-not-found error:

java.lang.ClassNotFoundException: org.spark_project.protobuf.GeneratedMessage 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:270) 
    at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:67) 
    at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:66) 
    at scala.util.Try$.apply(Try.scala:161) 
    at akka.actor.ReflectiveDynamicAccess.getClassFor(DynamicAccess.scala:66) 
    at akka.serialization.Serialization$$anonfun$6.apply(Serialization.scala:181) 
    at akka.serialization.Serialization$$anonfun$6.apply(Serialization.scala:181) 
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722) 
    at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:224) 
    at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:403) 
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721) 
    at akka.serialization.Serialization.<init>(Serialization.scala:181) 
    at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:15) 
    at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:12) 
    at akka.actor.ActorSystemImpl.registerExtension(ActorSystem.scala:713) 
    at akka.actor.ExtensionId$class.apply(Extension.scala:79) 
    at akka.serialization.SerializationExtension$.apply(SerializationExtension.scala:12) 
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:175) 
    at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620) 
    at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617) 
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617) 
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634) 
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:142) 
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:119) 
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121) 
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52) 
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1920) 
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1911) 
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55) 
    at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253) 
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53) 
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:254) 
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194) 
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:450) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:162) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:180) 
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:67) 
    at wordCount_test.TestSparkWordCount.setUp(TestSparkWordCount.java:25) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47) 
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) 
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44) 
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24) 
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) 
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271) 
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70) 
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50) 
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) 
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) 
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) 
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) 
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) 
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309) 
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50) 
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) 
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459) 
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675) 
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382) 
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

Here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>wordCount_Spark</groupId>
    <artifactId>wordCount_Spark</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>wordCount_Spark</name>
    <description>Count words using spark</description>

    <dependencies>
        <!-- Spark Streaming dependencies -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.2</version>
            <exclusions>
                <exclusion>
                    <groupId>com.google.protobuf</groupId>
                    <artifactId>protobuf-java</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.spark-project.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0-spark</version>
        </dependency>
        <!-- Apache Ignite dependencies -->
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-core</artifactId>
            <version>1.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-spring</artifactId>
            <version>1.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-indexing</artifactId>
            <version>1.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-log4j</artifactId>
            <version>1.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-spark</artifactId>
            <version>1.4.0</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.0</version>
            <!-- <scope>test</scope> -->
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                    <useIncrementalCompilation>false</useIncrementalCompilation>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

I have tried switching the order of the Spark and Ignite dependencies in the POM, but the error is still thrown when I run my JUnit test. Here is the test I am running:

package wordCount_test; 

import static org.junit.Assert.*; 

import java.io.File; 

import org.apache.spark.api.java.JavaPairRDD; 
import org.apache.spark.api.java.JavaRDD; 
import org.apache.spark.api.java.JavaSparkContext; 
import org.junit.After; 
import org.junit.Before; 
import org.junit.Test; 

public class TestSparkWordCount {

    JavaSparkContext jsc;
    File txtFile;

    @Before
    public void setUp() throws Exception {
        jsc = new JavaSparkContext("local[2]", "testSparkWordCount");
        txtFile = new File("AIW_WordCount");
        if (txtFile.exists()) {
            txtFile.delete();
        }
    }

    @After
    public void tearDown() throws Exception {
        if (jsc != null) {
            jsc.stop();
            jsc = null;
        }
    }

    @Test
    public void testSparkInit() {
        assertNotNull(jsc.sc());
    }
}
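For reference, the counting that the Spark job is meant to distribute is just word splitting and tallying. A plain-Java sketch of that logic (sequential, stdlib-only; the tokenization rule is illustrative, not taken from the project):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {

    // Tally word frequencies the same way a Spark flatMap/reduceByKey
    // pipeline would, but sequentially and in-memory.
    static Map<String, Integer> countWords(Iterable<String> lines) {
        Map<String, Integer> counts = new HashMap<String, Integer>();
        for (String line : lines) {
            // Split on whitespace; real tokenization may differ.
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    Integer prev = counts.get(word);
                    counts.put(word, prev == null ? 1 : prev + 1);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
            countWords(Arrays.asList("to be or not to be"));
        System.out.println(counts.get("to")); // 2
        System.out.println(counts.get("be")); // 2
    }
}
```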

Is it possible to have both Ignite and Spark Streaming in the same project? Am I missing a dependency? Have I done something wrong or skipped a step?

Any help or guidance you can provide would be greatly appreciated!

Welcome to Stack Overflow. You seem to be missing http://mvnrepository.com/artifact/org.spark-project.protobuf/protobuf-java/2.4.1-shaded in your pom? – Jan

Thanks for the quick reply. I added the spark-project.protobuf dependency to the pom, but I am still seeing the ClassNotFoundException for it. I have updated the question above to reflect my new pom and error. – jencoston

I found a comment on the Spark user list (http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-1-0-and-Hadoop-2-4-java-lang-VerifyError-td16242.html) mentioning that the package in Spark changed from com.google.protobuf_spark.GeneratedMessage to org.protobuf_spark.GeneratedMessage. So I tried adding the exclusion to the pom, but I still see an error message. I have updated the question with the new pom. – jencoston
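A quick stdlib-only way to check which of the relocated protobuf classes is actually visible on the test classpath is a Class.forName probe. The class name below is the one from the stack trace; the helper itself is not part of the original project:

```java
public class ClasspathProbe {

    // Returns true if the named class can be loaded by the current
    // classloader, false if loading raises ClassNotFoundException.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the Akka serializer fails to find in the stack trace:
        System.out.println(isOnClasspath("org.spark_project.protobuf.GeneratedMessage"));
        // Sanity check: always present.
        System.out.println(isOnClasspath("java.lang.String"));
    }
}
```

Running this from the test's own classpath shows immediately whether the shaded protobuf jar made it onto the classpath at all, or whether the exclusion removed both copies.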

Answer


The test works for me if I remove the exclusion of the com.google.protobuf artifact.

My dependency list now looks like this:

<dependencies>
    <!-- Spark Streaming dependencies -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>
    <!-- Apache Ignite dependencies -->
    <dependency>
        <groupId>org.apache.ignite</groupId>
        <artifactId>ignite-core</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.ignite</groupId>
        <artifactId>ignite-spring</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.ignite</groupId>
        <artifactId>ignite-indexing</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.ignite</groupId>
        <artifactId>ignite-log4j</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.ignite</groupId>
        <artifactId>ignite-spark</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.0</version>
        <!-- <scope>test</scope> -->
    </dependency>
</dependencies>
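Since two group IDs (com.google.protobuf and org.spark-project.protobuf) can both supply protobuf classes, it can also help to ask the JVM which jar a class was actually loaded from. A small stdlib-only probe, not part of the original project:

```java
import java.security.CodeSource;

public class JarLocator {

    // Returns the jar or directory a class was loaded from, or null for
    // bootstrap-loaded classes (e.g. java.lang.String on most JDKs).
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // Prints the path this class itself was loaded from.
        System.out.println(locationOf(JarLocator.class));
    }
}
```

Calling `locationOf` on a protobuf class from inside the test would reveal which of the two artifacts won the classpath race, which is exactly the ambiguity the exclusion was trying (and failing) to resolve.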
Thanks, in addition to modifying the pom file, deleting my local Maven repository and re-downloading the jars fixed it. – jencoston
