Our Java application depends on Spark, which is written in Scala. The build tool is Maven, and we run inside Eclipse. Both the JDK_HOME used to compile the application with Maven on the command line and the JRE used to run it in Eclipse are 1.7.0_15. A Scala package throws java.lang.UnsupportedClassVersionError.
The Maven POM contains the following:
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
...
<configuration>
<scalaVersion>1.10.5</scalaVersion>
<args>
<arg>-target:jvm-1.7</arg>
</args>
</configuration>
</plugin>
As I understand it, Spark uses Scala 2.10.
The Maven dependencies include the following:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-hadoop</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-spark_2.10</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
<dependency>
<groupId>org.scala-lang.modules</groupId>
<artifactId>scala-parser-combinators_2.12.0-M2</artifactId>
<version>1.0.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.3.0</version>
</dependency>
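Note that the artifact IDs above mix three Scala binary versions (`_2.10`, `_2.11`, and `_2.12.0-M2`). For illustration, I believe a consistent set would pin every Scala-compiled artifact to the same `_2.10` suffix, roughly like the sketch below; the exact versions here are my assumptions, not tested values:

```xml
<!-- Hypothetical sketch: all Scala artifacts on the 2.10 binary version
     (the one Spark 1.3 is built against). Versions are assumptions. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.3.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.3.1</version>
</dependency>
<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch-spark_2.10</artifactId>
  <version>2.1.0.Beta4</version>
</dependency>
```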
At runtime, the following exception is thrown:
Exception in thread "main" java.lang.UnsupportedClassVersionError: scala/util/parsing/combinator/PackratParsers : Unsupported major.minor version 52.0
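For reference, "major.minor version 52.0" means the class was compiled for Java 8 (51 is Java 7), so the jar supplying PackratParsers targets a newer JVM than my 1.7.0_15 runtime. A small sketch (illustrative, not part of the build) that reads the class-file header to confirm which version a class on the classpath actually has; the resource name is taken from the stack trace:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersion {
    /** Reads a class-file header and returns the major version (51 = Java 7, 52 = Java 8). */
    static int majorVersion(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        int magic = data.readInt();      // every class file starts with 0xCAFEBABE
        if (magic != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        data.readUnsignedShort();        // minor version (skipped)
        return data.readUnsignedShort(); // major version
    }

    public static void main(String[] args) throws Exception {
        // The class named in the stack trace, looked up on the runtime classpath.
        String resource = "scala/util/parsing/combinator/PackratParsers.class";
        try (InputStream in = ClassVersion.class.getClassLoader().getResourceAsStream(resource)) {
            if (in == null) {
                System.out.println(resource + " not found on classpath");
            } else {
                System.out.println(resource + " -> major version " + majorVersion(in));
            }
        }
    }
}
```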
I could not find a 2.10.* version of the scala-parser-combinators jar.
Can anyone help with a solution? Thanks!