2016-03-30

I am running a Spark application that uses spark-cassandra-connector. The application fails with: Received signal 15: SIGTERM.

Below are my spark-submit options:

--class com.mobi.vserv.driver.Query5kPids1
--num-executors 4
--executor-memory 4G
--executor-cores 2
--driver-memory 4G
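Put together as a full command, the submission looks roughly like this (a sketch: the `--master` value and the application jar name are assumptions, not given in the question):

```shell
# Hypothetical full spark-submit invocation; master and jar name are assumed.
spark-submit \
  --class com.mobi.vserv.driver.Query5kPids1 \
  --master yarn-cluster \
  --num-executors 4 \
  --executor-memory 4G \
  --executor-cores 2 \
  --driver-memory 4G \
  query5kpids.jar
```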

But I keep getting the following error:

16/03/30 11:57:07 ERROR executor.CoarseGrainedExecutorBackend: Driver 10.225.46.84:60637 disassociated! Shutting down.

Also, Cassandra gets connected and then disconnected:

INFO Cluster: New Cassandra host /10.229.84.123:9042 added 

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.229.84.123 (us-east) 

INFO Cluster: New Cassandra host /10.229.19.210:9042 added -> this is the seed node 
(the "INFO LocalNodeFirstLoadBalancingPolicy: Added host ..." message does not appear for the seed node) 

INFO Cluster: New Cassandra host /10.95.215.249:9042 added 

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.95.215.249 (us-east) 

INFO Cluster: New Cassandra host /10.43.182.167:9042 added 

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.43.182.167 (us-east) 

INFO Cluster: New Cassandra host /10.155.34.67:9042 added 

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.155.34.67 (us-east) 

INFO Cluster: New Cassandra host /10.237.235.209:9042 added 

INFO LocalNodeFirstLoadBalancingPolicy: Added host 10.237.235.209 (us-east) 

INFO CassandraConnector: Connected to Cassandra cluster: dmp Cluster 

INFO CassandraConnector: Disconnected from Cassandra cluster: dmp Cluster 

Finally, YARN kills the Application Master:

ERROR ApplicationMaster: RECEIVED SIGNAL 15: SIGTERM

I also added:

--conf spark.yarn.executor.memoryOverhead=1024
--conf spark.yarn.driver.memoryOverhead=1024
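For context, the YARN container size Spark requests is executor memory plus this overhead; on Spark 1.x the default overhead is 10% of executor memory with a 384 MB floor. A minimal sketch of that sizing arithmetic (the helper name is mine, not a Spark API):

```python
def yarn_container_mb(executor_memory_mb, overhead_mb=None):
    """Approximate YARN container request for a Spark executor.

    If no explicit spark.yarn.executor.memoryOverhead is set, Spark 1.x
    defaults to max(384 MB, 10% of executor memory).
    """
    if overhead_mb is None:
        overhead_mb = max(384, int(executor_memory_mb * 0.10))
    return executor_memory_mb + overhead_mb

print(yarn_container_mb(4096))        # default overhead: 4096 + 409 = 4505
print(yarn_container_mb(4096, 1024))  # explicit 1024 MB overhead: 5120
```

If the container exceeds this requested size, YARN kills it, which is one common source of SIGTERM-style failures (though, as the answer below shows, that was not the cause here).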

But then the application keeps running forever.

I don't see what the problem could be here, because the application has been run before and ran successfully.

The POM used is:

<dependency> 
  <groupId>org.apache.spark</groupId> 
  <artifactId>spark-core_2.10</artifactId> 
  <version>1.6.0</version> 
</dependency> 

<dependency> 
  <groupId>com.datastax.spark</groupId> 
  <artifactId>spark-cassandra-connector_2.10</artifactId> 
  <version>1.4.0-M1</version> 
</dependency> 

<dependency> 
  <groupId>com.datastax.cassandra</groupId> 
  <artifactId>cassandra-driver-core</artifactId> 
  <version>2.1.6</version> 
</dependency> 

<dependency> 
  <groupId>com.datastax.spark</groupId> 
  <artifactId>spark-cassandra-connector-java_2.10</artifactId> 
  <version>1.4.0-M1</version> 
</dependency> 

Answer

Found the solution: it was caused by a bug in spark-cassandra-connector 1.4.0-M1, reported here: https://datastax-oss.atlassian.net/browse/SPARKC-214

So when I used the next version, 1.4.0-M2, it ran fine.
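Concretely, the fix amounts to bumping both connector artifacts in the POM (the other dependencies stay as they were):

```xml
<dependency> 
  <groupId>com.datastax.spark</groupId> 
  <artifactId>spark-cassandra-connector_2.10</artifactId> 
  <version>1.4.0-M2</version> 
</dependency> 

<dependency> 
  <groupId>com.datastax.spark</groupId> 
  <artifactId>spark-cassandra-connector-java_2.10</artifactId> 
  <version>1.4.0-M2</version> 
</dependency> 
```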

But what still seems strangest is that this had worked earlier with 1.4.0-M1.