Unresolved dependency: Spark library

I followed this tutorial: http://blog.miz.space/tutorial/2016/08/30/how-to-integrate-spark-intellij-idea-and-scala-install-setup-ubuntu-windows-mac/
When I try to compile the project with IntelliJ, sbt complains about an unresolved dependency:

[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.1.1/spark-core-2.1.1.pom
[warn] unresolved dependency path: org.apache.spark:spark-core:2.1.1
My Scala version is 2.12.2 and sparkVersion is 2.1.1.
Here is what my build.sbt looks like:
name := "test" version := "1.0" scalaVersion := "2.12.2"
val sparkVersion = "2.1.1"
libraryDependencies ++= Seq("org.apache.spark" % "spark-core" & sparkVersion)`
Thanks!
You must use Scala 2.10.x or 2.11.x, because Scala 2.12 is not yet supported by Spark – Nonontb
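A minimal build.sbt sketch along those lines, assuming you downgrade to a 2.11.x Scala version (2.11.8 here is illustrative; the project name, version, and sparkVersion are taken from the question). Note the switch from % to %% so that sbt appends the Scala binary suffix to the artifact name:

// Spark 2.1.1 publishes artifacts only for Scala 2.10/2.11,
// so the build must use a matching Scala version
name := "test"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "2.1.1"

// %% makes sbt resolve org.apache.spark:spark-core_2.11:2.1.1,
// which exists on Maven Central, instead of the bare spark-core
// artifact that the warning above shows failing to resolve
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion
)

With the bare % operator, sbt looks for an artifact literally named spark-core, and no such POM exists at the URL in the warning; the published artifacts carry a Scala suffix such as spark-core_2.11.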
The Spark community is working on Scala 2.12 support. Please follow https://issues.apache.org/jira/browse/SPARK-14220 – hadooper