
I get an error when launching a standalone Spark driver in cluster mode. According to the documentation, cluster mode is supported as of Spark 1.2.1, but it is not working for me at the moment. Please help me find what is preventing Spark from running properly.
I have a 3-node Spark cluster (node 1, node 2, and node 3), and spark-submit in cluster mode does not work.

I run the command below on node 1 to deploy the driver:

/usr/local/spark-1.2.1-bin-hadoop2.4/bin/spark-submit --class com.fst.firststep.aggregator.FirstStepMessageProcessor --master spark://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:7077 --deploy-mode cluster --supervise file:///home/xyz/sparkstreaming-0.0.1-SNAPSHOT.jar /home/xyz/config.properties 

The driver gets launched on node 2 in the cluster, but node 2 throws an exception because it is trying to bind to node 1's IP address (a configuration note follows the log below):

2015-02-26 08:47:32 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-02-26 08:47:32 INFO Slf4jLogger:80 - Slf4jLogger started 
2015-02-26 08:47:33 ERROR NettyTransport:65 - failed to bind to ec2-xx.xx.xx.xx.compute-1.amazonaws.com/xx.xx.xx.xx:0, shutting down Netty transport 
2015-02-26 08:47:33 WARN Utils:71 - Service 'Driver' could not bind on port 0. Attempting port 1. 
2015-02-26 08:47:33 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-02-26 08:47:33 ERROR Remoting:65 - Remoting error: [Startup failed] [ 
akka.remote.RemoteTransportException: Startup failed 
     at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:136) 
     at akka.remote.Remoting.start(Remoting.scala:201) 
     at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184) 
     at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618) 
     at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615) 
     at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615) 
     at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632) 
     at akka.actor.ActorSystem$.apply(ActorSystem.scala:141) 
     at akka.actor.ActorSystem$.apply(ActorSystem.scala:118) 
     at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121) 
     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54) 
     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1765) 
     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1756) 
     at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56) 
     at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:33) 
     at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala) 
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: ec2-xx-xx-xx.compute-1.amazonaws.com/xx.xx.xx.xx:0 
     at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272) 
     at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393) 
     at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389) 
     at scala.util.Success$$anonfun$map$1.apply(Try.scala:206) 
     at scala.util.Try$.apply(Try.scala:161) 
     at scala.util.Success.map(Try.scala:206) 
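One thing worth checking, given the symptom above (this is an assumption, not something confirmed by the logs): if SPARK_LOCAL_IP or spark.driver.host is hard-coded to node 1's address in a configuration shared across the cluster, a driver launched on node 2 will try to bind to node 1's IP exactly as shown. A minimal sketch for conf/spark-env.sh on each node:

    # conf/spark-env.sh on each node: bind Spark services to this node's own
    # address rather than a hard-coded hostname (hostname -i assumes Linux)
    export SPARK_LOCAL_IP=$(hostname -i)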
Kindly suggest a fix.

Thanks

Try adding -Dspark.driver.port=<port> (or any other port) when submitting the job – 2015-03-02 19:50:43
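For reference, the same setting can also be passed to spark-submit via --conf rather than a raw JVM flag; the port number 7078 below is only an illustrative choice:

    /usr/local/spark-1.2.1-bin-hadoop2.4/bin/spark-submit --conf spark.driver.port=7078 ... (remaining arguments as in the command above)

Alternatively, it can be made persistent in conf/spark-defaults.conf on each node:

    spark.driver.port 7078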

Answer


It should not be possible to bind to port 0; there is an error somewhere in your Spark configuration. Specifically, look at

spark.ui.port

It is probably set to 0.
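A quick way to verify (a sketch; the install path matches the one used in the question) is to grep the configuration directory and, if the port really is 0, pin it to a fixed non-zero value such as the default 4040:

    grep -rn "port" /usr/local/spark-1.2.1-bin-hadoop2.4/conf/

    # then, in conf/spark-defaults.conf:
    spark.ui.port 4040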