2016-09-22 39 views

I run a Spark job through IntelliJ. The job executes and produces output. I need to get the job's jar file onto a server and run it there, but when I try to run sbt assembly it throws the following error (sbt assembly fails):

[error] Not a valid command: assembly 
[error] Not a valid project ID: assembly 
[error] Expected ':' (if selecting a configuration) 
[error] Not a valid key: assembly 
[error] assembly 

My sbt version is 0.13.8.

Below is my build.sbt file:

import sbt._, Keys._ 
name := "mobilewalla" 
version := "1.0" 
scalaVersion := "2.11.7" 
libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "2.0.0", 
"org.apache.spark" %% "spark-sql" % "2.0.0") 

I added a file assembly.sbt under the project directory. It contains:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3") 
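For reference (an assumption not stated in the question): sbt only loads plugin definitions from `.sbt` files inside the `project/` directory, and the build must be reloaded afterwards; an `assembly.sbt` sitting next to `build.sbt` rather than inside `project/` produces exactly this "Not a valid command: assembly" error. A sketch of the expected layout, using the names from the question:

```
mobilewalla/
├── build.sbt                 # name, version, scalaVersion, libraryDependencies
└── project/
    ├── assembly.sbt          # addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
    └── build.properties      # sbt.version=0.13.8
```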

What am I missing here?


Did you reload after adding sbt-assembly? –


Yes, I did. Is doing it through the build.sbt file correct? Do I need to add anything else? – toofrellik


`sbt package` creates the JAR file you can submit to Spark... https://spark.apache.org/docs/2.0.0/quick-start.html#self-contained-applications –

Answers


In build.sbt:

assemblyMergeStrategy in assembly := { 
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard 
    case x => MergeStrategy.first 
} 
mainClass in assembly := Some("com.SparkMain") 
resolvers += "spray repo" at "http://repo.spray.io" 
assemblyJarName in assembly := "streaming-api.jar" 

Add the lines above, and include these lines in your plugins.sbt file:

addSbtPlugin("io.spray" % "sbt-revolver" % "0.7.2") 

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0") 

Could you let me know why we need the `sbt-revolver` setting in the plugins file? – toofrellik


It's a very useful plugin that runs your project in a forked JVM and reloads it when something changes. – Nilesh


What's missing: to assemble multiple jars into one, you need to add the plugin below to plugins.sbt under the project directory.

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3") 

If you need to customize the assembled jar to run a specific main class, take assembly.sbt as an example:

import sbtassembly.Plugin.AssemblyKeys._ 

Project.inConfig(Compile)(baseAssemblySettings) 

mainClass in (Compile, assembly) := Some("<main application name with package path>") 

jarName in (Compile, assembly) := s"${name.value}-${version.value}-dist.jar" 
// merge strategy below controls which files to include, keep first, or discard 
mergeStrategy in (Compile, assembly) <<= (mergeStrategy in (Compile, assembly)) { 
    (old) => { 
    case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first 
    case "META-INF/MANIFEST.MF" => MergeStrategy.discard 
    case x => old(x) 
    } 
} 
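Note: the snippet above uses the old pre-0.12 sbt-assembly API (`sbtassembly.Plugin`, `baseAssemblySettings`, `<<=`), which does not compile against the 0.14.x plugin mentioned in the question. A rough equivalent with the newer plugin keys, as a sketch only (the main class remains a placeholder), would be:

```scala
// build.sbt — assumes sbt-assembly 0.14.x is declared in project/plugins.sbt
mainClass in assembly := Some("<main application name with package path>")

assemblyJarName in assembly := s"${name.value}-${version.value}-dist.jar"

// merge strategy: decide which duplicate files to keep or discard
assemblyMergeStrategy in assembly := {
  case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first
  case "META-INF/MANIFEST.MF" => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```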

I don't think these lines are necessary for an Apache Spark project –


I added the assembly.sbt file following this reference: [sbt assembly](https://github.com/sbt/sbt-assembly), since my sbt version is 0.13.8 – toofrellik


You should still try using plugins.sbt. If the documentation were perfect, you wouldn't have asked a question on SO in the first place :) – C4stor