2015-12-16 114 views

I am trying to do some simple Spark parallelization over data in a large project, but even with the simplest example I get this error: Spark: Exception in thread "main" java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer

Exception in thread "main" java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer overrides final method withResolved.(Lcom/fasterxml/jackson/databind/BeanProperty;Lcom/fasterxml/jackson/databind/jsontype/TypeSerializer;Lcom/fasterxml/jackson/databind/JsonSerializer;)Lcom/fasterxml/jackson/databind/ser/std/AsArraySerializerBase; 

The error shows up for any simple parallelization, even this trivial one. I can't even tell where this error is coming from:

val conf: SparkConf = new SparkConf().setAppName("IEEG Spark").setMaster("local")
val sc: SparkContext = new SparkContext(conf)
val data = Array(1, 2, 3, 4, 5)
val distVals = sc.parallelize(data)
distVals.foreach(println)
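This kind of VerifyError typically means two incompatible Jackson versions ended up on the classpath (e.g. a newer jackson-databind with an older jackson-module-scala). A common way to see which Jackson artifacts Maven actually resolves is the dependency tree, filtered to the Jackson group IDs (a diagnostic sketch, assuming a standard Maven setup):

```shell
# Show every resolved Jackson artifact; mismatched versions between
# jackson-databind and jackson-module-scala_2.10 cause this VerifyError.
mvn dependency:tree -Dincludes=com.fasterxml.jackson.core,com.fasterxml.jackson.module
```

If the tree shows different versions for the core and the Scala module, that mismatch is the thing to fix.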

Any ideas? Below is my Maven pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> 

    <groupId>astiefel</groupId> 
    <artifactId>ieeg-spark</artifactId> 
    <modelVersion>4.0.0</modelVersion> 
    <name>Spark IEEG</name> 
    <parent> 
     <groupId>edu.upenn.cis.ieeg</groupId> 
     <artifactId>ieeg</artifactId> 
     <version>1.15-SNAPSHOT</version> 
    </parent> 
    <properties> 
     <scala.version>2.10.4</scala.version> 
    </properties> 
    <dependencies> 
     <dependency> 
      <groupId>edu.upenn.cis.ieeg</groupId> 
      <artifactId>ieeg-client</artifactId> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-core_2.10</artifactId> 
      <version>1.5.0</version> 
     </dependency> 
     <dependency> 
      <groupId>org.scala-lang</groupId> 
      <artifactId>scala-compiler</artifactId> 
      <version>${scala.version}</version> 
      <scope>compile</scope> 
     </dependency> 
     <dependency> 
      <groupId>org.scalanlp</groupId> 
      <artifactId>breeze_2.10</artifactId> 
      <version>0.10</version> 
     </dependency> 
    </dependencies> 

    <build> 
     <sourceDirectory>src/main/scala</sourceDirectory> 
     <plugins> 
      <plugin> 
       <groupId>org.scala-tools</groupId> 
       <artifactId>maven-scala-plugin</artifactId> 
       <executions> 
        <execution> 
         <goals> 
          <goal>compile</goal> 
          <goal>testCompile</goal> 
         </goals> 
        </execution> 
       </executions> 
       <configuration> 
        <scalaVersion>${scala.version}</scalaVersion> 
        <args> 
         <arg>-target:jvm-1.5</arg> 
        </args> 
       </configuration> 
      </plugin> 
      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-eclipse-plugin</artifactId> 
       <configuration> 
        <downloadSources>true</downloadSources> 
        <buildcommands> 
         <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand> 
        </buildcommands> 
        <additionalProjectnatures> 
         <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature> 
        </additionalProjectnatures> 
        <classpathContainers> 
         <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer> 
         <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer> 
        </classpathContainers> 
       </configuration> 
      </plugin> 
     </plugins> 
    </build> 
    <reporting> 
     <plugins> 
      <plugin> 
       <groupId>org.scala-tools</groupId> 
       <artifactId>maven-scala-plugin</artifactId> 
       <configuration> 
        <scalaVersion>${scala.version}</scalaVersion> 
       </configuration> 
      </plugin> 
     </plugins> 
    </reporting> 
</project> 

I am completely lost with this error. Any hints on where to start?

Answers


You can do some simple Spark parallelization tests with the dependencies below, which use spark-assembly-xxx.jar directly as a jar dependency.

<profile>
    <id>apache-spark/scala</id>
    <dependencies>
        <!--Apache Spark -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-assembly</artifactId>
            <version>1.5.2</version>
            <scope>system</scope>
            <systemPath>${spark.home}/lib/spark-assembly-1.5.2-hadoop2.6.0.jar</systemPath>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.2</version>
        </dependency>
    </dependencies>
</profile>

But I think the problem is that I have to use the Jackson library, and even when I change its version I still get an error – astiefel


Maybe. I have seen a whole bunch of questions about pom dependency problems with other dependencies. The assembly jar is actually a good way to get started, since it is a stable release and contains everything –


My solution was to declare an updated jackson-module dependency in my own pom to override the other versions. That seems to have done the trick – astiefel
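The override described in that comment can be sketched as a pom fragment that pins a consistent Jackson version. This is a minimal sketch, not the poster's actual pom; the version numbers are assumptions (Spark 1.5.x was built against the Jackson 2.4 line, so both artifacts are pinned there):

```xml
<!-- Sketch: pin jackson-databind and jackson-module-scala to matching
     versions so Maven's dependency mediation cannot mix incompatible
     releases. Versions are illustrative, not from the original post. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.4.4</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.10</artifactId>
            <version>2.4.4</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Putting this in the project's own pom (or its parent) forces every module, including transitive dependencies from the `ieeg` parent, to resolve the same Jackson version, which is what the VerifyError is complaining about.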
