
Spark 1.5.1 standalone cluster - wrong Akka remote configuration?

Taking my first steps with Spark, I am running into a problem when submitting a job to the cluster from application code. Digging through the logs, I noticed some periodic WARN messages in the master log:

15/10/08 13:00:00 WARN remote.ReliableDeliverySupervisor: Association with remote system [akka.tcp://[email protected]:64014] has failed, address is now gated for [5000] ms. Reason: [Disassociated] 

The problem is that this IP address does not exist on our network and is not configured anywhere. The same wrong IP shows up on the workers when they try to launch executors (the wrong IP is passed as the --driver-url):

15/10/08 12:58:21 INFO worker.ExecutorRunner: Launch command: "/usr/java/latest//bin/java" "-cp" "/path/spark/spark-1.5.1-bin-hadoop2.6/sbin/../conf/:/path/spark/spark-1.5.1-bin-hadoop2.6/lib/spark-assembly-1.5.1-hadoop2.6.0.jar:/path/spark/spark-1.5.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/path/spark/spark-1.5.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/path/spark/spark-1.5.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/path/hadoop/2.6.0//etc/hadoop/" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.port=64014" "-Dspark.driver.port=53411" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "akka.tcp://[email protected]:64014/user/CoarseGrainedScheduler" "--executor-id" "39" "--hostname" "192.168.10.214" "--cores" "16" "--app-id" "app-20151008123702-0003" "--worker-url" "akka.tcp://[email protected]:37625/user/Worker"
15/10/08 12:59:28 INFO worker.Worker: Executor app-20151008123702-0003/39 finished with state EXITED message Command exited with code 1 exitStatus 1 
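
Is explicitly pinning the driver address the right way to deal with this? Below is a rough sketch of what I have in mind (the master URL, hostnames, and app name are placeholders, not my real setup), setting spark.driver.host on the SparkConf before creating the context:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: "master-host", "driver-host" and the app name are placeholders.
val conf = new SparkConf()
  .setAppName("my-first-spark-app")
  .setMaster("spark://master-host:7077")
  // Tell executors explicitly which address to use to reach the driver,
  // instead of letting Spark pick up a local IP on its own.
  .set("spark.driver.host", "driver-host")

val sc = new SparkContext(conf)

Or would setting SPARK_LOCAL_IP in conf/spark-env.sh on the driver machine be the preferred approach here?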

Any idea what I am doing wrong, and how can this be fixed?

The Java version is 1.8.0_20, and I am using the prebuilt Spark binaries.

Thanks!

Answer