2016-11-11

Spark not starting remote instance

I launch a remote Spark instance with:

./spark-ec2 --key-pair=octavianKey4 --identity-file=octavianKey4.pem --region=eu-west-1 --zone=eu-west-1c launch my-instance-name --resume 

Then, after making sure I can open a TCP connection on port 7077, I try to connect to it with:

spark-shell --master spark://my-instance-name.eu-west-1.compute.amazonaws.com:7077 

When I do this, I get:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
16/11/11 20:09:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/11/11 20:09:53 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master my-instance-name.eu-west-1.compute.amazonaws.com:7077 
org.apache.spark.SparkException: Exception thrown in awaitResult 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
     at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108) 
     at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 
Caused by: java.io.IOException: Failed to connect to my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228) 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179) 
     at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:191) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187) 
     ... 4 more 
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 
     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) 
     at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257) 
     at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:628) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:552) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:466) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:438) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
     ... 1 more 
16/11/11 20:10:12 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master my-instance-name.eu-west-1.compute.amazonaws.com:7077 
org.apache.spark.SparkException: Exception thrown in awaitResult 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
     at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108) 
     at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 
Caused by: java.io.IOException: Failed to connect to my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228) 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179) 
     at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:191) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187) 
     ... 4 more 
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 
     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) 
     at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257) 
     at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:628) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:552) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:466) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:438) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
     ... 1 more 
16/11/11 20:10:32 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master my-instance-name.eu-west-1.compute.amazonaws.com:7077 
org.apache.spark.SparkException: Exception thrown in awaitResult 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
     at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108) 
     at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 
Caused by: java.io.IOException: Failed to connect to my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228) 
     at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179) 
     at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:191) 
     at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187) 
     ... 4 more 
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: my-instance-name.eu-west-1.compute.amazonaws.com/52.210.171.38:7077 
     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 
     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) 
     at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257) 
     at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:628) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:552) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:466) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:438) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
     ... 1 more 
16/11/11 20:10:52 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up. 
16/11/11 20:10:52 WARN StandaloneSchedulerBackend: Application ID is not initialized yet. 
16/11/11 20:10:52 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master 
16/11/11 20:10:53 ERROR SparkContext: Error initializing SparkContext. 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 
     at scala.Predef$.require(Predef.scala:224) 
     at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:528) 
     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2309) 
     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:843) 
     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:835) 
     at scala.Option.getOrElse(Option.scala:121) 
     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:835) 
     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101) 
     at $line3.$read$$iw$$iw.<init>(<console>:15) 
     at $line3.$read$$iw.<init>(<console>:42) 
     at $line3.$read.<init>(<console>:44) 
     at $line3.$read$.<init>(<console>:48) 
     at $line3.$read$.<clinit>(<console>) 
     at $line3.$eval$.$print$lzycompute(<console>:7) 
     at $line3.$eval$.$print(<console>:6) 
     at $line3.$eval.$print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
     at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) 
     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
     at org.apache.spark.repl.Main$.doMain(Main.scala:68) 
     at org.apache.spark.repl.Main$.main(Main.scala:51) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 
    at scala.Predef$.require(Predef.scala:224) 
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:528) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2309) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:843) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:835) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:835) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101) 
    ... 47 elided 
<console>:14: error: not found: value spark 
     import spark.implicits._ 
      ^
<console>:14: error: not found: value spark 
     import spark.sql 
      ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60) 
Type in expressions to have them evaluated. 
Type :help for more information. 

So this seems contradictory:

1. Some exceptions are thrown saying I cannot connect.

2. Nevertheless, the spark-shell does start.

3. However, in the shell I cannot use the usual Spark variables and methods, such as sc.

So what exactly is going on?

Answers


1) Try netstat -at | grep 7077 and check that the master is running on that same host.

2) Go to the Spark web UI on the default port 8080 and check that the master URL shown there matches the one you pass to --master.
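A quick way to check reachability from the client machine is to probe the master port directly. This sketch uses the hostname from the question and assumes nc (netcat) is installed; substitute your own instance's public DNS:

```shell
# Master URL from the question; substitute your instance's public DNS
MASTER_URL="spark://my-instance-name.eu-west-1.compute.amazonaws.com:7077"

# Split the spark:// URL into host and port
HOSTPORT="${MASTER_URL#spark://}"
HOST="${HOSTPORT%:*}"
PORT="${HOSTPORT##*:}"

# Probe the port; a refused connection means nothing is listening there
# (master not running) or a security group / firewall is rejecting it
nc -z -w 5 "$HOST" "$PORT" && echo "port $PORT reachable" || echo "port $PORT unreachable"
```

"Connection refused" (as in the stack trace above) is an active rejection, which usually points at the master not listening on that address rather than a dropped packet.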


Uh, I'm running on different machines in the AWS cloud, so the master is not running on the same host. – octavian


Check whether a firewall is blocking the connection to the Spark master. –


No, that's not it. – octavian


You haven't provided any evidence that the master is actually running properly; most likely it is not.

Try running, from your own machine:

telnet my-instance-name.eu-west-1.compute.amazonaws.com 7077 

You can also look at the Spark logs on that server to see why the master failed to start properly.
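When reading the master log, the key line is the one stating which URL the master bound to (on a spark-ec2 cluster the logs typically sit under the Spark install's logs/ directory, e.g. /root/spark/logs — an assumption; adjust to your layout). The sample log line below is illustrative:

```shell
# A healthy master prints a line like this one (sample text for illustration)
LOG_LINE="16/11/11 20:05:01 INFO Master: Starting Spark master at spark://ip-172-31-1-1.eu-west-1.compute.internal:7077"

# Extract the URL the master actually bound to; if it is the EC2 *internal*
# hostname, connecting via the public DNS name can be refused even though
# the master is up
BOUND_URL=$(echo "$LOG_LINE" | grep -o 'spark://[^ ]*')
echo "$BOUND_URL"
```

The URL you pass to --master must match what the master bound to, so comparing these two strings is often the fastest diagnosis.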


Check the Spark version on the remote server, as well as on the local machine you are connecting from.

This error is often caused by a version mismatch.
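A minimal sketch of that check, comparing the major.minor parts of the two version banners (the remote value here is hypothetical — read the real one from the server's spark-shell banner or logs):

```shell
# Local version as printed in the banner above; the remote value is a
# hypothetical placeholder for illustration
local_ver="2.1.0"
remote_ver="1.6.2"

# Compare major.minor; standalone clients and masters on different
# major.minor lines generally cannot register with each other
if [ "${local_ver%.*}" = "${remote_ver%.*}" ]; then
  echo "versions compatible: $local_ver / $remote_ver"
else
  echo "version mismatch: local $local_ver vs remote $remote_ver"
fi
```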


If you are sure, that could well be the answer, though some details would be appreciated. –