
I can't start spark-shell on a Spark 1.4.1 cluster.

This is running on an EMR instance. Last time, with Spark 1.4, I was able to start spark-shell; I'm not sure why it fails this time.

Any suggestions are welcome.

After logging in, I start spark-shell from the terminal:

```

[hadoop@<host> ~]$ spark-shell
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0 
15/11/06 18:30:17 INFO spark.SecurityManager: Changing view acls to: hadoop 
15/11/06 18:30:17 INFO spark.SecurityManager: Changing modify acls to: hadoop 
15/11/06 18:30:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop) 
15/11/06 18:30:17 INFO spark.HttpServer: Starting HTTP Server 
15/11/06 18:30:17 INFO server.Server: jetty-8.y.z-SNAPSHOT 
15/11/06 18:30:17 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50579
15/11/06 18:30:17 INFO util.Utils: Successfully started service 'HTTP class server' on port 50579. 
error: error while loading <root>, zip file is empty 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 
15/11/06 18:30:17 WARN repl.SparkILoop$SparkILoopInterpreter: Warning: compiler accessed before init set up. Assuming no postInit code. 

Failed to initialize compiler: object scala.runtime in compiler mirror not found. 
** Note that as of 2.8 scala does not assume use of the java classpath. 
** For the old behavior pass -usejavacp to scala, or if using a Settings 
** object programatically, settings.usejavacp.value = true. 
Exception in thread "main" java.lang.AssertionError: assertion failed: null 
    at scala.Predef$.assert(Predef.scala:179) 
    at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) 
    at org.apache.spark.repl.Main$.main(Main.scala:31) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
15/11/06 18:30:17 INFO util.Utils: Shutdown hook called 
15/11/06 18:30:17 INFO util.Utils: Deleting directory /tmp/spark-3499c189-36cf-48b0-9c6f-fa97b8d89f86 

```
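The first line of the failure, `error: error while loading <root>, zip file is empty`, suggests that some jar on the REPL's classpath (for example the Spark assembly) is zero-length or corrupt. As a first diagnostic, here is a minimal sketch that scans for such jars, assuming an EMR-style install under /usr/lib/spark (adjust SPARK_HOME for your layout):

```
# Diagnostic sketch: look for empty or corrupt jars on the Spark classpath.
# Assumes an EMR-style layout under /usr/lib/spark; adjust to your install.
SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}

find "$SPARK_HOME" -name '*.jar' | while read -r jar; do
  # A zero-byte jar reproduces "zip file is empty" exactly.
  if [ ! -s "$jar" ]; then
    echo "EMPTY: $jar"
  # 'unzip -t' tests the zip structure without extracting anything.
  elif ! unzip -t "$jar" > /dev/null 2>&1; then
    echo "CORRUPT: $jar"
  fi
done
```

If this flags a jar, re-copying it or re-provisioning the cluster would be the next step.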

Answer


This looks like a PATH problem. Is the directory containing the Scala scripts on your PATH environment variable?

In .bash_profile or .bashrc:

```
export PATH=$PATH:/path/to/scala
```
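To confirm what the shell actually resolves before editing PATH, the standard `which` and PATH-inspection checks apply (nothing here is specific to this cluster):

```
# Show which executables the shell resolves; no output for a name
# means that command is not on PATH at all.
which spark-shell scala

# Print the search path one entry per line to see whether the
# Scala bin directory is present.
echo "$PATH" | tr ':' '\n'
```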

It's surprising that the Scala installation didn't add it to /usr/local/bin, though.
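Separately, the error output itself names a possible workaround: the REPL suggests passing `-usejavacp`. A hedged sketch of forwarding that setting to spark-shell's driver JVM via its system-property form follows; note this may only mask the problem if a jar on the classpath really is corrupt:

```
# Sketch only: forward the -usejavacp setting named in the error text.
# scala.usejavacp is the system-property equivalent of the flag, and
# --driver-java-options passes JVM options to the driver hosting the REPL.
spark-shell --driver-java-options "-Dscala.usejavacp=true"
```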