2015-05-04

Spark on Mesos - unable to fetch binaries

I am trying to run a Spark job on a Mesos cluster, but I get an error when the slaves try to fetch the binaries.

I tried keeping the binaries on the slaves in:

  1. HDFS
  2. the local file system.

I used the following paths for SPARK_EXECUTOR_URI:

Local filesystem path - file://home/labadmin/spark-1.2.1.tgz

I0501 10:27:19.302435 30510 fetcher.cpp:214] Fetching URI 'file://home/labadmin/spark-1.2.1.tgz' 
Failed to fetch: file://home/labadmin/spark-1.2.1.tgz 
Failed to synchronize with slave (it's probably exited) 

HDFS path without a port - hdfs://ipaddress/spark/spark-1.2.1.tgz
I0427 09:23:21.616092 4842 fetcher.cpp:214] Fetching URI 'hdfs://ipaddress/spark/spark-1.2.1.tgz' 
E0427 09:23:24.710765 4842 fetcher.cpp:113] HDFS copyToLocal failed: /usr/lib/hadoop/bin/hadoop fs -copyToLocal 'hdfs://ipaddress/spark/spark-1.2.1.tgz' '/tmp/mesos/slaves/20150427-054938-2933394698-5050-1030-S0/frameworks/20150427-054938-2933394698-5050-1030-0002/executors/20150427-054938-2933394698-5050-1030-S0/runs/5c13004a-3d8c-40a4-bac4-9c07249e1923/spark-1.2.1.tgz' 
copyToLocal: Call From sclq174.lss.emc.com/ipaddress to sclq174.lss.emc.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused 

HDFS path with port 50070 - hdfs://ipaddress:50070/spark/spark-1.2.1.tgz

I0427 13:34:25.295554 16633 fetcher.cpp:214] Fetching URI 'hdfs://ipaddress:50070/spark/spark-1.2.1.tgz' 
E0427 13:34:28.438596 16633 fetcher.cpp:113] HDFS copyToLocal failed: /usr/lib/hadoop/bin/hadoop fs -copyToLocal 'hdfs://ipaddress:50070/spark/spark-1.2.1.tgz' '/tmp/mesos/slaves/20150427-054938-2933394698-5050-1030-S0/frameworks/20150427-054938-2933394698-5050-1030-0008/executors/20150427-054938-2933394698-5050-1030-S0/runs/2fc7886a-cfff-4cb2-b2f6-25988ca0f8e3/spark-1.2.1.tgz' 
copyToLocal: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: 

Any idea why it isn't working?


Have you set up SSH keys between this node and the rest of the cluster? – Gillespie

Answer


Spark supports different ways of fetching binaries:

  • file: - absolute paths and file:/ URIs are served by the driver's HTTP file server, and every executor pulls the file directly from the driver's HTTP server.
  • hdfs:, http:, https:, ftp: - these pull down files and JARs from the URI as expected.
  • local: - a URI starting with local:/ is expected to exist as a local file on each worker node.
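The scheme-based dispatch described above can be mirrored in a minimal sketch (this is not Spark's actual code, just an illustration of how each scheme is treated):

```shell
#!/bin/sh
# Hypothetical illustration of how a fetch URI is handled by scheme.
classify_uri() {
  case "$1" in
    local:*)                     echo "read from each worker's local disk" ;;
    file:*)                      echo "served by the driver's HTTP file server" ;;
    hdfs:*|http:*|https:*|ftp:*) echo "downloaded directly from the URI" ;;
    *)                           echo "unknown scheme" ;;
  esac
}

classify_uri 'hdfs://192.168.1.1:50070/spark/spark-1.2.1.tgz'
# prints: downloaded directly from the URI
```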

    1. file://home/labadmin/spark-1.2.1.tgz is not accessible from the driver. You probably want a local:/ URI instead.
    2. There is most likely no HDFS server running on sclq174.lss.emc.com:8020.
    3. The URI format is not recognized by Hadoop; you should replace the hostname with an actual IP address to make it work, e.g. 192.168.1.1:50070.
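Putting the fixes above together, the configuration might look like the sketch below. The path and the 192.168.1.1 address are placeholders, not values from this cluster; the tarball must already exist on every slave for the local: variant to work.

```shell
#!/bin/sh
# Option 1: local: URI - each Mesos slave reads the tarball from its
# own filesystem (copy it to this path on every slave first).
export SPARK_EXECUTOR_URI='local:/home/labadmin/spark-1.2.1.tgz'

# Option 2: HDFS URI - use an IP address the slaves can actually reach,
# and verify the file is visible before pointing Mesos at it, e.g.:
#   /usr/lib/hadoop/bin/hadoop fs -ls hdfs://192.168.1.1:50070/spark/spark-1.2.1.tgz

echo "$SPARK_EXECUTOR_URI"
```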