Error while running spark-shell: ERROR Remoting: Remoting error: [Startup failed]

I am new to Apache Spark. I am trying to install Apache Spark 1.0.2 with Scala 2.10.4 on Windows 7, following the step-by-step guide here:

http://sankalplabs.wordpress.com/2014/08/25/installing-apache-spark-on-windows-step-by-step-approach/

When I start spark-shell, I get the following exception:

ERROR Remoting: Remoting error: [Startup failed] [
akka.remote.RemoteTransportException: Startup failed
    at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
    at akka.remote.Remoting.start(Remoting.scala:194)
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
    at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
    at $line3.$read$$iwC$$iwC.<init>(<console>:8)
    at $line3.$read$$iwC.<init>(<console>:14)
    at $line3.$read.<init>(<console>:16)
    at $line3.$read$.<init>(<console>:20)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: Oleander/192.168.1.7:0
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Success.map(Try.scala:206)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.net.BindException: Cannot assign requested address: bind
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Unknown Source)
    at sun.nio.ch.Net.bind(Unknown Source)
    at sun.nio.ch.ServerSocketChannelImpl.bind(Unknown Source)
    at sun.nio.ch.ServerSocketAdaptor.bind(Unknown Source)
    at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
    at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
]

To me it seems the description in that guide is incomplete. Which steps are missing? Is there a complete (and concise) list of steps for installing Spark on Windows 7?
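For reference, the root cause appears to be the innermost java.net.BindException: Akka resolves the machine's hostname (Oleander/192.168.1.7 in the trace above) and tries to bind a server socket to that address. The failing step can be reproduced outside Spark with a small Scala sketch (the object name BindCheck is my own; port 0 matches the ":0" in the trace):

    import java.net.{InetAddress, InetSocketAddress, ServerSocket}

    // Minimal reproduction of the failing bind, independent of Spark.
    object BindCheck {
      def main(args: Array[String]): Unit = {
        // Resolve the local hostname, as Akka's remoting does on startup.
        val host = InetAddress.getLocalHost
        println(s"Hostname resolves to: $host")
        val socket = new ServerSocket()
        // Port 0 lets the OS pick a free port, matching the stack trace.
        socket.bind(new InetSocketAddress(host, 0))
        println(s"Bound to ${socket.getLocalSocketAddress}")
        socket.close()
      }
    }

On the affected machine this should fail with the same "Cannot assign requested address" error, which suggests a hostname/IP resolution problem rather than a broken Spark installation.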

Your help is much appreciated, Felix

Answers


The exception went away once I added a configuration file spark-env.cmd to the conf folder. In that file I specified my local IP address (obtained via ipconfig /all) as follows:

    set SPARK_LOCAL_IP=192.168.1.111

Alternatively, set:

    set SPARK_LOCAL_IP=localhost
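For completeness, a minimal sketch of the whole conf\spark-env.cmd (192.168.1.111 is just an example address from my network; use the one ipconfig /all reports for your machine, or localhost):

    rem conf\spark-env.cmd -- picked up by the Windows launch scripts if present.
    rem Bind Spark's networking to an explicit local address instead of the
    rem address the machine's hostname resolves to.
    set SPARK_LOCAL_IP=192.168.1.111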


Thanks. You saved my day! – 2014-11-12 09:03:13


For a similar error, it may help to check the configuration of the environment variables, as described here.

In short: by default, conf/spark-env.sh (or conf/spark-env.cmd on Windows) does not exist after installing Spark. So, to modify the environment variables, first copy the template file conf/spark-env.sh.template to conf/spark-env.sh. Then, in spark-env.sh, find the line

# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node 

and change it to

SPARK_LOCAL_IP=LOCALHOST 
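Put together, the steps look roughly like this on Linux/macOS (a sketch, run from the Spark installation directory; on Windows create conf\spark-env.cmd by hand as in the answer above):

    cd conf
    # The template ships with Spark; the copy becomes the active config file.
    cp spark-env.sh.template spark-env.sh
    # Bind to localhost instead of whatever the hostname resolves to.
    echo 'SPARK_LOCAL_IP=localhost' >> spark-env.sh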