
I am running a Spark word count program, but it fails with the error below. I have already added scala-xml_2.11-1.0.2.jar.

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
    16/12/16 05:14:02 INFO SparkContext: Running Spark version 2.0.2 
    16/12/16 05:14:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
    16/12/16 05:14:03 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.59.132 instead (on interface ens33) 
    16/12/16 05:14:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls to: hadoopusr 
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls to: hadoopusr 
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoopusr); groups with view permissions: Set(); users with modify permissions: Set(hadoopusr); groups with modify permissions: Set() 
    16/12/16 05:14:05 INFO Utils: Successfully started service 'sparkDriver' on port 40559. 
    16/12/16 05:14:05 INFO SparkEnv: Registering MapOutputTracker 
    16/12/16 05:14:05 INFO SparkEnv: Registering BlockManagerMaster 
    16/12/16 05:14:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b830180-ae51-451f-9673-4f98dbaff520 
    16/12/16 05:14:05 INFO MemoryStore: MemoryStore started with capacity 433.6 MB 
    16/12/16 05:14:05 INFO SparkEnv: Registering OutputCommitCoordinator 
    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$; 
     at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44) 
     at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34) 
     at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62) 
     at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:219) 
     at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:161) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:440) 
     at LearnScala.WordCount$.main(WordCount.scala:15) 
     at LearnScala.WordCount.main(WordCount.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
    16/12/16 05:14:05 INFO DiskBlockManager: Shutdown hook called 
    16/12/16 05:14:05 INFO ShutdownHookManager: Shutdown hook called 
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba/userFiles-3656d5f8-25ba-45c4-b2f6-9f654a049bb1 
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba 

I am using the following versions.

build.sbt

name := "SparkApps" 

version := "1.0" 

scalaVersion := "2.11.5" 

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2" 
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "2.0.2" 
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "2.0.2" 
// https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.11 
libraryDependencies += "org.apache.spark" % "spark-yarn_2.10" % "2.0.2" 

Spark version: 2.0.2

Answer


You say: "I am running a word count program in Spark but I am getting the below error. I have added scala-xml_2.11-1.0.2.jar."

And later we can see:

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2" 

Pick one ;) Scala 2.10 or Scala 2.11: either change the scala-xml version to 2.10 or change Spark to 2.11. As of Spark 2.0, Scala 2.11 is the recommended version.

You can easily get the matching Scala version by using %% in build.sbt:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2" 

Secondly, there is no scala-xml dependency in build.sbt; you should add it.
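
For example, assuming the scala-xml module published under org.scala-lang.modules (1.0.2 matches the jar you added by hand), a minimal sketch would be:

// Resolves to scala-xml_2.11 because scalaVersion is 2.11.x.
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.2"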

Finally, you have to make all third-party jars available, either via the --jars option of spark-submit or by building an uber jar; see this question.
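
As a sketch of the uber-jar route, the sbt-assembly plugin can bundle scala-xml into the application jar; the plugin version below is an assumption, so check the current release:

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

Running sbt assembly then produces a single jar under target/scala-2.11/ that can be handed straight to spark-submit (in practice the Spark dependencies are usually marked "provided" so they are not packed into it). Otherwise, keep the plain jar and list the extra jars with --jars.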


I don't understand why this answer has not been accepted yet; it gives exactly the solution that was needed. –
