2015-10-08
6

I am using Spark 1.4.1. I can use spark-submit without any problem, but when I run ~/spark/bin/spark-shell, the Spark shell fails to start.

I get the error below. I have already configured SPARK_HOME and JAVA_HOME; however, this worked fine with Spark 1.2.

15/10/08 02:40:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 
Exception in thread "main" java.lang.AssertionError: assertion failed: null 
     at scala.Predef$.assert(Predef.scala:179) 
     at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) 
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) 
     at org.apache.spark.repl.Main$.main(Main.scala:31) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
+0

Where did you set 'SPARK_HOME'? In your .bashrc? The error you are getting is likely because SPARK_HOME is not set, so 'spark-shell' tries to derive it via 'dirname'. – user1314742
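
For reference, the Spark 1.x launcher scripts resolve the installation directory roughly like this when SPARK_HOME is not exported (a paraphrased sketch, not the exact script source):

if [ -z "$SPARK_HOME" ]; then 
  # bin/spark-shell walks up from its own location to find the Spark root. 
  export SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)" 
fi 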

+0

How should I set my SPARK_HOME? Should it be set as export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0/bin? – lordlabakdas
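
For what it's worth, SPARK_HOME conventionally points at the distribution root (the directory containing bin/ and conf/), not at bin/ itself; for a Homebrew install the root is typically the keg's libexec directory. A minimal sketch, assuming that layout (verify the path on your machine):

# SPARK_HOME is the directory that contains bin/, conf/, etc. 
export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0/libexec 
export PATH="$SPARK_HOME/bin:$PATH" 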

+0

I don't think the problem is SPARK_HOME. An incorrect SPARK_HOME would cause the spark-shell script to fail to find spark-submit. However, I see the same error on my machine even when I make sure SPARK_HOME is set and I invoke 'spark-submit --class org.apache.spark.repl.Main' directly. –

Answers

0

Did you install Scala and SBT, and did Spark build OK?
The log says it did not find the main class.

+0

Do you think this is caused by sbt and scala not being on the PATH? – worldterminator

1

I ran into the same problem running Spark, and I found that the cause was that I had not configured Scala correctly. Make sure you have Java, Scala, and SBT installed and Spark built:

Edit your .bashrc file: vim .bashrc

Set your env variables:

# Adjust these paths to match your own installation. 
export JAVA_HOME=/usr/lib/jvm/java-7-oracle 
export PATH=$JAVA_HOME/bin:$PATH 

export SCALA_HOME=/usr/local/src/scala/scala-2.11.5 
export PATH=$SCALA_HOME/bin:$PATH 

export SPARK_HOME=/usr/local/src/apache/spark.2.0.0/spark 
export PATH=$SPARK_HOME/bin:$PATH 

Source the settings: . .bashrc

Check Scala: scala -version

Make sure the REPL starts: scala

If the REPL starts, try restarting your Spark shell: ./path/to/spark/bin/spark-shell

You should get the Spark REPL.

1

You can try running

spark-shell -usejavacp 

It did not work for me, but it did work for someone in the description of Spark Issue 18778.
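
For context, -usejavacp corresponds to the fix the error message itself suggests for programmatic use of the compiler. A minimal sketch of that Settings-based approach using the generic Scala 2.x REPL API (illustrative only, not Spark's own code):

import scala.tools.nsc.Settings 
import scala.tools.nsc.interpreter.IMain 

val settings = new Settings 
// Let the embedded compiler see the JVM's java.class.path, so classes 
// such as scala.runtime are visible in the compiler mirror. 
settings.usejavacp.value = true 

val repl = new IMain(settings) 
repl.interpret("""println("embedded REPL works")""") 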