Oozie workflow with Spark application reports out of memory

I am trying to run a Spark program as a single step of an Oozie workflow. The same code, packaged as a JAR, runs successfully with spark-submit or spark-shell:
spark-submit --packages com.databricks:spark-csv_2.10:1.5.0 --master yarn-client --class "SimpleApp" /tmp/simple-project_2.10-1.1.jar
The application should not need much in the way of resources: it loads a small CSV file (< 10 MB) into Hive using Spark.
- Spark version: 1.6.0
- Oozie version: 4.1.0
The workflow was created with Hue's Oozie workflow editor:
<workflow-app name="Spark_test" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-589f"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark-589f">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapreduce.map.java.opts</name>
                    <value>-XX:MaxPermSize=2g</value>
                </property>
            </configuration>
            <master>yarn</master>
            <mode>client</mode>
            <name>MySpark</name>
            <jar>simple-project_2.10-1.1.jar</jar>
            <spark-opts>--packages com.databricks:spark-csv_2.10:1.5.0</spark-opts>
            <file>/user/spark/oozie/jobs/simple-project_2.10-1.1.jar#simple-project_2.10-1.1.jar</file>
        </spark>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
After running the workflow I got the following logs:
stdout:
Invoking Spark class now >>> Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(), PermGen space
stderr:
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Yarn application state monitor" Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(), PermGen space
syslog:
2017-03-14 12:31:19,939 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: PermGen space
Please suggest which configuration parameters should be increased.
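One detail that may matter: Oozie only forwards configuration properties to its launcher job when they carry the oozie.launcher. prefix, so the mapreduce.map.java.opts entry in the workflow above may never reach the launcher JVM, which is the process that reports "PermGen space" in the syslog. A minimal sketch of how that configuration block could look with the prefix, assuming the launcher container is indeed the one running out of PermGen; the 512m value is only an illustrative guess, not a tested setting:

<configuration>
    <property>
        <!-- Properties prefixed with oozie.launcher. are applied to the Oozie launcher job itself -->
        <name>oozie.launcher.mapreduce.map.java.opts</name>
        <value>-XX:MaxPermSize=512m</value>
    </property>
</configuration>

This is offered as a starting point rather than a confirmed fix; whether it resolves the error in this particular setup is untested.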
CDH 5.9.0 ships with Java 7 and I do not want to change that. Would this not happen with Java 8? – caruso