Hadoop 2.4.0 depends on two different versions of BeanUtils, which causes the following error with sbt-assembly:
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] .ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.7.0.jar:org/apache/commons/beanutils/BasicDynaBean.class
[error] .ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:org/apache/commons/beanutils/BasicDynaBean.class
Both of these dependencies are pulled in transitively by Hadoop 2.4.0, as confirmed using the approach in How to access Ivy directly, i.e. access dependency reports or execute Ivy commands?
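As a lighter-weight alternative to reading the Ivy reports directly, the dependency tree can also be inspected from sbt itself. This is only a sketch under the assumption that the sbt-dependency-graph plugin (not part of the original setup) is added:

```scala
// project/plugins.sbt — hypothetical addition, not in the original build
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

// Then, from the sbt shell, print the transitive tree and look for the two
// commons-beanutils artifacts appearing under hadoop-client:
//   > dependency-tree
```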
How can I build an sbt-assembly fat JAR that includes Hadoop 2.4.0?
UPDATE: As requested, here are the build.sbt dependencies:
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided" exclude("org.apache.hadoop", "hadoop-client")
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.8"
libraryDependencies += "commons-io" % "commons-io" % "2.4"
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "provided"
libraryDependencies += "com.sksamuel.elastic4s" %% "elastic4s" % "1.1.1.0"
The exclude is needed because, out of the box, Spark bundles Hadoop 1, which conflicts with Hadoop 2.
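One possible workaround (a sketch, not a verified fix) is to resolve the duplicate classes at assembly time with a merge strategy, or to exclude one of the commons-beanutils artifacts outright. Using the sbt 0.13-era sbt-assembly syntax that matches the build.sbt above:

```scala
import sbtassembly.Plugin._
import AssemblyKeys._

// Pick the first copy whenever the same beanutils class turns up in two jars.
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
  {
    case PathList("org", "apache", "commons", "beanutils", xs @ _*) =>
      MergeStrategy.first
    case x => old(x)
  }
}

// Alternatively, drop the older artifact at the dependency level:
// libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0" exclude("commons-beanutils", "commons-beanutils")
```

Note that `MergeStrategy.first` only papers over the clash; it works here because the 1.8.0 classes are a superset of the 1.7.0 ones, but it is not safe for arbitrary duplicate classes.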
Can you add your dependencies from build.sbt? – lpiepiora
@lpiepiora - Done; can you take a look? – SRobertJames
The problem is that the spark-core in the repo was built against Hadoop 1. Even once you resolve the dependency conflict, you will hit the next problem (I have tested it). Perhaps you could consider cloning Spark and building your own version against Hadoop 2 (the Spark build seems to support it). – lpiepiora
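For reference, the Spark 1.0 build does expose a Hadoop version switch, so building your own spark-core against Hadoop 2 looks roughly like this (a sketch based on the Spark 1.0 build documentation; verify the exact variable name and tag against your checkout):

```shell
git clone https://github.com/apache/spark.git
cd spark
git checkout v1.0.0
# Build the Spark assembly against Hadoop 2.4.0 instead of the default Hadoop 1
SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly
```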