dsetool status looks healthy, but spark-submit fails to connect to the DSE resource manager
DC: dc1 Workload: Cassandra Graph: no
======================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
-- Address Load Owns VNodes Rack Health [0,1]
UN 192.168.1.130 810.47 MiB ? 256 2a 0.90
UN 192.168.1.131 683.53 MiB ? 256 2a 0.90
UN 192.168.1.132 821.33 MiB ? 256 2a 0.90
DC: dc2 Workload: Analytics Graph: no Analytics Master: 192.168.2.131
=========================================================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
-- Address Load Owns VNodes Rack Health [0,1]
UN 192.168.2.130 667.05 MiB ? 256 2a 0.90
UN 192.168.2.131 845.48 MiB ? 256 2a 0.90
UN 192.168.2.132 887.92 MiB ? 256 2a 0.90
When I try to start a spark-submit job:
dse -u user -p password spark-submit --class com.sparkLauncher test.jar prf
I get the following error (edited):
ERROR 2017-09-14 20:14:14,174 org.apache.spark.deploy.rm.DseAppClient$ClientEndpoint: Failed to connect to DSE resource manager
java.io.IOException: Failed to register with master: dse://?
....
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: The method DseResourceManager.registerApplication does not exist. Make sure that the required component for that method is active/enabled
....
ERROR 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application has been killed. Reason: Failed to connect to DSE resource manager: Failed to register with master: dse://?
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Failed to connect to DSE resource manager: Failed to register with master: dse://?
....
WARN 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application ID is not initialized yet.
ERROR 2017-09-14 20:14:14,384 org.apache.spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
ERROR 2017-09-14 20:14:14,387 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
I can confirm that I have granted the permissions mentioned in this document: https://docs.datastax.com/en/dse/5.1/dse-admin/datastax_enterprise/security/secAuthSpark.html
I am trying this on AWS, in case that makes a difference, and I can confirm that routing between the nodes is fully open. I can start the Spark shell from any Spark node, can bring up the Spark UI, and can retrieve the Spark master address via cqlsh.
Any pointers would be helpful. Thanks in advance!
@DataStax I am running spark-submit from the master node; is that what you are referring to? – avinash
You can run it from any Analytics-enabled node. If you are still getting that message, the Analytics module is not running. I would check the system log. – RussS
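To make the suggestion above concrete: the stack trace shows the driver registering against `dse://?`, which resolves to the local node, so running spark-submit from a node in the Cassandra-only DC (dc1) would produce exactly this failure. A sketch of two things worth trying, assuming the dc2 addresses from the status output above and standard DSE 5.1 CLI behavior; adjust hosts and credentials for your cluster:

```shell
# Ask DSE which node is currently the Spark master
# (run on an Analytics-enabled node in dc2).
dse client-tool spark master-address

# Submit against an Analytics (dc2) node explicitly instead of
# relying on the default dse://? local resolution. The IP is taken
# from the question's dsetool output and is only illustrative.
dse -u user -p password spark-submit \
  --master dse://192.168.2.131 \
  --class com.sparkLauncher test.jar prf
```

If the explicit master URL still fails with `DseResourceManager.registerApplication does not exist`, that points back to the Analytics workload not actually being active on dc2, which is what the system logs on those nodes should confirm.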