SparkR and PySpark throw java.net.BindException on startup, but spark-shell does not?

I have already tried setting SPARK_LOCAL_IP to "127.0.0.1" and checked that the port is not already in use. Here is the full error text:

Launching java with spark-submit command /usr/hdp/2.4.0.0-169/spark/bin/spark-submit "sparkr-shell" /tmp/RtmpZo44il/backend_port998540c56917 
/usr/hdp/2.4.0.0-169/spark/bin/load-spark-env.sh: line 72: export: `load-spark-env.sh': not a valid identifier 
16/06/13 11:28:24 ERROR RBackend: Server shutting down: failed with exception 
java.net.BindException: Cannot assign requested address 
     at sun.nio.ch.Net.bind0(Native Method) 
     at sun.nio.ch.Net.bind(Net.java:433) 
     at sun.nio.ch.Net.bind(Net.java:425) 
     at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
     at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
     at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
     at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
     at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
     at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
     at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
     at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
     at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) 
     at java.lang.Thread.run(Thread.java:745) 
Error in SparkR::sparkR.init() : JVM is not ready after 10 seconds 

The error above appears when launching ./bin/sparkR. spark-shell, on the other hand, starts up without any problem.

Some more information: on startup, spark-shell automatically searches through ports until it finds one it can bind without an exception. Even if I set the default SparkR backend port to an unused port, it still fails.
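For reference, this is roughly what I tried, as a minimal sketch (the HDP install path matches the log above; the port number is just a placeholder for whatever port SparkR reports):

    # Pin the address Spark binds to (appended to conf/spark-env.sh)
    echo 'export SPARK_LOCAL_IP=127.0.0.1' >> /usr/hdp/2.4.0.0-169/spark/conf/spark-env.sh

    # Confirm nothing is already listening on the port in question
    netstat -tln | grep 12345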

Answer


I found the problem. Another user had deleted my /etc/hosts file. I recreated the file with a localhost entry and sparkR now seems to run. I am still curious, though, how spark-shell managed to run without that file.
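For anyone who hits the same thing: the file I recreated was just the standard localhost mapping, roughly like this (a sketch, not my exact file):

    127.0.0.1   localhost localhost.localdomain
    ::1         localhost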
