I cannot compile my Scala code; sbt fails with a dependency error and the dependency never resolves.
I also have a sample application that builds fine with the existing setup, but this code does not work on the same machine.
Error log:
sbt compile
[info] Set current project to firstScalaScript (in build file:/home/abu/Current%20Workspace/)
[info] Updating {file:/home/abu/Current%20Workspace/}current-workspace...
[info] Resolving org.apache.spark#spark-core;2.0.1 ...
[warn] module not found: org.apache.spark#spark-core;2.0.1
[warn] ==== local: tried
[warn] /home/abu/.ivy2/local/org.apache.spark/spark-core/2.0.1/ivys/ivy.xml
[warn] ==== public: tried
[warn] https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.0.1/spark-core-2.0.1.pom
[warn] ==== Akka Repository: tried
[warn] http://repo.akka.io/releases/org/apache/spark/spark-core/2.0.1/spark-core-2.0.1.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core;2.0.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
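The log shows sbt looking for org.apache.spark#spark-core;2.0.1 with no Scala version suffix, which suggests the dependency was declared with a single %. On Maven Central the artifact is published as spark-core_2.11 (or _2.10), so the declaration needs %% (or the explicit suffix). A minimal build.sbt sketch of the likely fix, assuming Scala 2.11 and the project name from the log:

name := "firstScalaScript"

version := "1.0"

scalaVersion := "2.11.8"

// %% appends the Scala binary version, so this resolves to spark-core_2.11 on Maven Central
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"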
My code:
import scala.io.Source._
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.log4j.Logger
import org.apache.log4j.Level
import org.apache.spark.rdd.RDD
import org.apache.hadoop.io.compress.GzipCodec

object firstScalaScript {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf())
    val rdd = sc.textFile("e.txt,r.txt").collect()
    // rdd.saveAsTextFile("confirmshot.txt"); sc.stop()
  }
}
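As a side note, a bare new SparkConf() only picks up an app name and master when the class is launched through spark-submit; for a quick local test via sbt run, both can be set in code. A minimal sketch, where the local[*] master and the app name are assumptions for local testing only:

import org.apache.spark.{SparkConf, SparkContext}

object firstScalaScript {
  def main(args: Array[String]): Unit = {
    // App name and master set explicitly so the job can run outside spark-submit.
    val conf = new SparkConf().setAppName("firstScalaScript").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // collect() materializes the RDD into an Array[String] on the driver.
    val lines = sc.textFile("e.txt,r.txt").collect()
    lines.foreach(println)
    sc.stop()
  }
}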