
Spark: How to get all configuration parameters

I would like to know which configuration parameters my Spark application is running with. Is there a way to get all of them, including the defaults?

For example, executing "set;" in the Hive console lists the complete Hive configuration. I am looking for a similar action/command for Spark.
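
For reference, the closest one-liner I have found in the Scala shell is SparkConf.toDebugString, but it also seems to print only explicitly set values rather than the defaults:

println(sc.getConf.toDebugString) // prints every explicitly set key/value pair, one per line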

UPDATE: I tried the solution proposed by karthik manchala and got the results below. As far as I can tell, this is not the complete set of parameters. For example, spark.shuffle.memoryFraction (and others) is missing.

scala> println(sc.getConf.getAll.deep.mkString("\n")); 
(spark.eventLog.enabled,true) 
(spark.dynamicAllocation.minExecutors,1) 
(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS,...) 
(spark.repl.class.uri,http://...:54157) 
(spark.tachyonStore.folderName,spark-46d43c17-b0b3-4b61-a017-a186075849ca) 
(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES,http://...) 
(spark.driver.host,...l) 
(spark.yarn.jar,local:/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-assembly.jar) 
(spark.yarn.historyServer.address,http://...:18088) 
(spark.dynamicAllocation.executorIdleTimeout,60) 
(spark.serializer,org.apache.spark.serializer.KryoSerializer) 
(spark.authenticate,false) 
(spark.fileserver.uri,http://...:33681) 
(spark.app.name,Spark shell) 
(spark.dynamicAllocation.maxExecutors,30) 
(spark.dynamicAllocation.initialExecutors,3) 
(spark.ui.filters,org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter) 
(spark.driver.port,46781) 
(spark.shuffle.service.enabled,true) 
(spark.master,yarn-client) 
(spark.eventLog.dir,hdfs://.../user/spark/applicationHistory) 
(spark.app.id,application_1449242356422_80431) 
(spark.driver.appUIAddress,http://...:4040) 
(spark.driver.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native) 
(spark.dynamicAllocation.schedulerBacklogTimeout,1) 
(spark.shuffle.service.port,7337) 
(spark.executor.id,<driver>) 
(spark.jars,) 
(spark.dynamicAllocation.enabled,true) 
(spark.executor.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native) 
(spark.yarn.am.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native) 
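
As far as I can tell, getAll only returns entries that were explicitly set. A value that Spark resolves internally to its default, such as spark.shuffle.memoryFraction, can still be read per key by supplying the fallback yourself; a minimal sketch, assuming a Spark 1.x shell where sc is the SparkContext and 0.2 is that parameter's documented default:

// read a single key, supplying the fallback default yourself
println(sc.getConf.get("spark.shuffle.memoryFraction", "0.2"))
// getOption returns None when the key was never explicitly set
println(sc.getConf.getOption("spark.shuffle.memoryFraction"))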

Answer


You can do the following:

sparkContext.getConf().getAll(); 
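
In the Scala shell, where the context is already bound to sc, the same call can be printed like this (a minimal sketch; note it returns only the explicitly set entries, so defaults such as spark.shuffle.memoryFraction will not appear):

// getAll returns an Array[(String, String)] with the explicitly set entries
sc.getConf.getAll
  .sortBy(_._1)                                  // sort by key for readability
  .foreach { case (k, v) => println(s"$k = $v") }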