2015-12-09

Tomcat version: 7.0.47. Apache Spark causes a graceful shutdown of Tomcat

I have a web application that uses Apache Spark; the web application acts as the Apache Spark driver.
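
For reference, the driver-side context is created inside the web application roughly like this. This is a minimal hypothetical sketch, not the actual application code; the class name is illustrative and the master address spark://localhost2:7077 is taken from the log below.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkDriverHolder {

    private static JavaSparkContext sc;

    // Create the driver-side context lazily; called once at webapp startup,
    // e.g. from a ServletContextListener.
    public static synchronized JavaSparkContext getOrCreate() {
        if (sc == null) {
            SparkConf conf = new SparkConf()
                    .setAppName("webapp-spark-driver")
                    .setMaster("spark://localhost2:7077"); // master address from the log below
            sc = new JavaSparkContext(conf);
        }
        return sc;
    }
}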

When the remote standalone Spark cluster is unavailable, the Spark context is shut down, with the log line org.apache.spark.util.Utils - Shutdown hook called.

Soon after this happens, Tomcat also begins a graceful shutdown. The only log I see on the Tomcat side is [exec] Result: 50.

Why does Tomcat shut down when Spark invokes its shutdown hook?

Spark log:

SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/data/downloads/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/data/downloads/spark-1.4.1-bin-hadoop2.6/lib/spark-examples-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
15/12/09 17:11:23 INFO SparkContext: Running Spark version 1.4.1 
15/12/09 17:11:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
15/12/09 17:11:24 WARN Utils: Your hostname, pesamara-mobl-vm1 resolves to a loopback address: 127.0.0.1; using 10.30.9.107 instead (on interface eth0) 
15/12/09 17:11:24 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 
15/12/09 17:11:25 INFO SecurityManager: Changing view acls to: pes 
15/12/09 17:11:25 INFO SecurityManager: Changing modify acls to: pes 
15/12/09 17:11:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pes); users with modify permissions: Set(pes) 
15/12/09 17:11:26 INFO Slf4jLogger: Slf4jLogger started 
15/12/09 17:11:27 INFO Remoting: Starting remoting 
15/12/09 17:11:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:55740] 
15/12/09 17:11:27 INFO Utils: Successfully started service 'sparkDriver' on port 55740. 
15/12/09 17:11:27 INFO SparkEnv: Registering MapOutputTracker 
15/12/09 17:11:27 INFO SparkEnv: Registering BlockManagerMaster 
15/12/09 17:11:27 INFO DiskBlockManager: Created local directory at /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/blockmgr-3226ed7e-f8e5-40a2-bfb1-ffabb51cd0e0 
15/12/09 17:11:28 INFO MemoryStore: MemoryStore started with capacity 491.5 MB 
15/12/09 17:11:28 INFO HttpFileServer: HTTP File server directory is /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/httpd-7f2572c2-5677-446e-a80a-6f9d05ee2891 
15/12/09 17:11:28 INFO HttpServer: Starting HTTP Server 
15/12/09 17:11:28 INFO Utils: Successfully started service 'HTTP file server' on port 45047. 
15/12/09 17:11:28 INFO SparkEnv: Registering OutputCommitCoordinator 
15/12/09 17:11:28 INFO Utils: Successfully started service 'SparkUI' on port 4040. 
15/12/09 17:11:28 INFO SparkUI: Started SparkUI at http://10.30.9.107:4040 
15/12/09 17:11:29 INFO FairSchedulableBuilder: Created default pool default, schedulingMode: FIFO, minShare: 0, weight: 1 
15/12/09 17:11:29 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master... 
15/12/09 17:11:29 WARN AppClient$ClientActor: Could not connect to akka.tcp://[email protected]:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://[email protected]:7077 
15/12/09 17:11:29 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://[email protected]:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error 
15/12/09 17:11:49 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master... 
15/12/09 17:11:49 WARN AppClient$ClientActor: Could not connect to akka.tcp://[email protected]:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://[email protected]:7077 
15/12/09 17:11:49 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://[email protected]:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error 
15/12/09 17:12:09 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master... 
15/12/09 17:12:09 WARN AppClient$ClientActor: Could not connect to akka.tcp://[email protected]:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://[email protected]:7077 
15/12/09 17:12:09 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://[email protected]:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error 
15/12/09 17:12:29 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up. 
15/12/09 17:12:29 WARN SparkDeploySchedulerBackend: Application ID is not initialized yet. 
15/12/09 17:12:29 INFO SparkUI: Stopped Spark web UI at http://10.30.9.107:4040 
15/12/09 17:12:29 INFO DAGScheduler: Stopping DAGScheduler 
15/12/09 17:12:29 INFO SparkDeploySchedulerBackend: Shutting down all executors 
15/12/09 17:12:29 INFO SparkDeploySchedulerBackend: Asking each executor to shut down 
15/12/09 17:12:29 ERROR OneForOneStrategy: 
java.lang.NullPointerException 
    at org.apache.spark.deploy.client.AppClient$ClientActor$$anonfun$receiveWithLogging$1.applyOrElse(AppClient.scala:160) 
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33) 
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33) 
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25) 
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:59) 
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42) 
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118) 
    at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42) 
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465) 
    at org.apache.spark.deploy.client.AppClient$ClientActor.aroundReceive(AppClient.scala:61) 
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516) 
    at akka.actor.ActorCell.invoke(ActorCell.scala:487) 
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238) 
    at akka.dispatch.Mailbox.run(Mailbox.scala:220) 
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393) 
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 
15/12/09 17:12:29 INFO AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master... 
15/12/09 17:12:29 WARN AppClient$ClientActor: Could not connect to akka.tcp://[email protected]:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://[email protected]:7077 
15/12/09 17:12:29 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://[email protected]:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: localhost2: unknown error 
15/12/09 17:12:29 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54184. 
15/12/09 17:12:29 INFO NettyBlockTransferService: Server created on 54184 
15/12/09 17:12:29 INFO BlockManagerMaster: Trying to register BlockManager 
15/12/09 17:12:29 INFO BlockManagerMasterEndpoint: Registering block manager 10.30.9.107:54184 with 491.5 MB RAM, BlockManagerId(driver, 10.30.9.107, 54184) 
15/12/09 17:12:29 INFO BlockManagerMaster: Registered BlockManager 
15/12/09 17:12:30 ERROR SparkContext: Error initializing SparkContext. 
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext 
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103) 
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503) 
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543) 
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61) 
    at org.apache.spark.examples.sql.SparkContextTest.main(SparkContextTest.java:32) 
15/12/09 17:12:30 INFO SparkContext: SparkContext already stopped. 
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext 
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103) 
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503) 
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543) 
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61) 
    at org.apache.spark.examples.sql.SparkContextTest.main(SparkContextTest.java:32) 
15/12/09 17:12:30 INFO DiskBlockManager: Shutdown hook called 
15/12/09 17:12:30 INFO Utils: path = /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/blockmgr-3226ed7e-f8e5-40a2-bfb1-ffabb51cd0e0, already present as root for deletion. 
15/12/09 17:12:30 INFO Utils: Shutdown hook called 
15/12/09 17:12:30 INFO Utils: Deleting directory /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8/httpd-7f2572c2-5677-446e-a80a-6f9d05ee2891 
15/12/09 17:12:30 INFO Utils: Deleting directory /tmp/spark-30d61b03-0b1c-4250-b68e-c2404c7884a8 
Which version of Spark are you using? – Sumit

I am using Spark version 1.4.1 – era

Could you turn on debug logging for Tomcat and Spark and post the complete trace? Spark has its own shutdown hooks, which it runs in any case to stop the Spark context and clean up resources. – Sumit
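
For illustration only, this is the general shape of a JVM shutdown hook of the kind Sumit refers to. Spark registers a comparable hook internally (that internal hook is what prints the "Shutdown hook called" line in the log above); this sketch is not application or Spark code.

public class ShutdownHookExample {
    public static void main(String[] args) {
        // Register a JVM shutdown hook; the JVM runs it on normal exit
        // or on SIGTERM/SIGINT before the process terminates.
        Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
            @Override
            public void run() {
                System.out.println("Shutdown hook called");
                // In Spark's case the internal hook stops the SparkContext
                // and deletes its temporary directories, as seen in the log.
            }
        }));
        System.out.println("main finished; the shutdown hook runs next");
    }
}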

Answer


Use spark://hostname:7077 for the Spark master if you are currently using an IP address in place of the hostname when setting the master in SparkConf.
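
A minimal sketch of the suggested change, assuming a standalone master reachable by hostname; "spark-master-host" below is a placeholder for the actual resolvable hostname of the master, not a value from the question.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class HostnameMasterExample {
    public static void main(String[] args) {
        // Use the master's resolvable hostname rather than its IP address.
        SparkConf conf = new SparkConf()
                .setAppName("webapp-spark-driver")
                .setMaster("spark://spark-master-host:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... submit jobs ...
        sc.stop();
    }
}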

I was facing the same issue, and it was resolved by using the hostname of the Spark master rather than its IP address.

Regards, Hokam

I am using the hostname in the Spark conf. No luck. – era