I installed CDH 5.4.7 on a 3-node cluster. After running my first job on Spark, I checked the Application History page. It reported the following, as if the Spark History Server had not started:
Event log directory: hdfs://****:8020/user/spark/applicationHistory
No completed applications found!
Did you specify the correct logging directory? Please verify your setting of
spark.history.fs.logDirectory and whether you have the permissions to access
it. It is also possible that your application did not run to completion or
did not stop the SparkContext.
I checked HDFS and found that /user/spark/applicationHistory already existed, but the directory had no entries in it, which means no event logs were being written. I searched Cloudera's documentation pages and found the article Managing the Spark History Server at the following link.
As described there, I added the Spark History Server role and started it, and executed the following two commands for my user:
$ sudo -u hdfs hadoop fs -chown -R spark:spark /user/spark
$ sudo -u hdfs hadoop fs -chmod 1777 /user/spark/applicationHistory
However, when I tried to execute the following command, it gave a "no such file or directory" error:
$ cp /etc/spark/conf/spark-defaults.conf.template /etc/spark/conf/spark-defaults.conf
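For context, once that copy succeeds, the properties I expect to set in spark-defaults.conf (going by the Cloudera article) look roughly like this; the namenode host and port below are placeholders for my actual values:

```
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://<namenode>:8020/user/spark/applicationHistory
spark.history.fs.logDirectory    hdfs://<namenode>:8020/user/spark/applicationHistory
```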
So I went to the path /etc/spark and listed the files inside it. It showed this:
conf -> /etc/alternatives/spark-conf
I can neither create a directory named conf, since it already exists, nor change the /etc/spark/conf directory, since it is a symlink.
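As far as I understand, the symlink itself should not prevent the copy; a quick sketch with throwaway /tmp paths (standing in for /etc/spark and /etc/alternatives on my cluster) shows that a cp through such a link simply lands in the link's target:

```shell
# Stand-in directories for /etc/alternatives/spark-conf and /etc/spark
mkdir -p /tmp/demo/alternatives/spark-conf /tmp/demo/etc/spark
ln -sfn /tmp/demo/alternatives/spark-conf /tmp/demo/etc/spark/conf

# A template file, as would be shipped in the real conf directory
touch /tmp/demo/etc/spark/conf/spark-defaults.conf.template

# Copying through the symlink writes into the alternatives target
cp /tmp/demo/etc/spark/conf/spark-defaults.conf.template \
   /tmp/demo/etc/spark/conf/spark-defaults.conf

ls /tmp/demo/alternatives/spark-conf
```

So I presume my "no such file or directory" error means the template file itself is missing from /etc/alternatives/spark-conf, not that the symlink is broken.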
Also, the command
$ service spark-history-server start
gives an "unrecognized service" error.
Please help! Thanks in advance.