
I am using HDP 2.6, where Oozie 4.2 and Spark2 are installed. Adding multiple jars in an Oozie Spark action

I followed the Hortonworks guide at https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_spark-component-guide/content/ch_oozie-spark-action.html to add the Spark2 libraries to the Oozie 4.2 sharelib.

Afterwards, I submitted the job with this property added:

oozie.action.sharelib.for.spark=spark2 
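For context, this property belongs in the job's `job.properties` file. A minimal sketch might look like the following; the host names and application path are placeholders for illustration, not values from the question:

```properties
# Hypothetical job.properties - hosts and paths are placeholders.
nameNode=hdfs://<namenode-host>:8020
jobTracker=<resourcemanager-host>:8050
oozie.wf.application.path=${nameNode}/user/admin/workflows/spark2-app
oozie.use.system.libpath=true
# Resolve the Spark action against the spark2 sharelib instead of the default one:
oozie.action.sharelib.for.spark=spark2
```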

The error I get is this:

2017-07-19 12:36:53,271 WARN SparkActionExecutor:523 - SERVER[] USER[admin] GROUP[-] TOKEN[] APP[Workflow2] JOB[0000012-170717153234639-oozie-oozi-W] ACTION[[email protected]_1] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Attempt to add (hdfs://:8020/user/oozie/share/lib/lib_20170613110051/oozie/aws-java-sdk-core-1.10.6.jar) multiple times to the distributed cache. 
    2017-07-19 12:36:53,275 WARN SparkActionExecutor:523 - SERVER[] USER[admin] GROUP[-] TOKEN[] APP[Workflow2] JOB[0000012-170717153234639-oozie-oozi-W] ACTION[[email protected]_1] Launcher exception: Attempt to add (hdfs://:8020/user/oozie/share/lib/lib_20170613110051/oozie/aws-java-sdk-core-1.10.6.jar) multiple times to the distributed cache. 
    java.lang.IllegalArgumentException: Attempt to add (hdfs://:8020/user/oozie/share/lib/lib_20170613110051/oozie/aws-java-sdk-core-1.10.6.jar) multiple times to the distributed cache. 
     at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$13$$anonfun$apply$8.apply(Client.scala:629) 
     at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$13$$anonfun$apply$8.apply(Client.scala:620) 
     at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) 
     at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$13.apply(Client.scala:620) 
     at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$13.apply(Client.scala:619) 
     at scala.collection.immutable.List.foreach(List.scala:381) 
     at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:619) 
     at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:892) 
     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:171) 
     at org.apache.spark.deploy.yarn.Client.run(Client.scala:1228) 
     at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1287) 
     at org.apache.spark.deploy.yarn.Client.main(Client.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:745) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
     at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:311) 
     at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:232) 
     at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58) 
     at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:62) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:239) 
     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54) 
     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453) 
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343) 
     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) 
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164) 

I have read that Oozie will not work with Spark 2.1 (via Oozie, anyway) because of how Spark handles duplicate files found in the distributed cache, as mentioned here: see here

Keep in mind that I am using Ambari and HDP 2.6. How can I deal with this?

Answer


You need to compare the contents of the oozie directory and the spark2 directory in the Oozie sharelib. If a jar is present in both, remove it from one of them and try again. Also, run the `oozie admin -sharelibupdate` command to refresh the sharelib.
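The duplicate check above can be sketched as follows. This is a minimal illustration that compares two jar listings with `comm(1)`; the listings are hard-coded samples here, whereas on a real cluster you would capture them from `hdfs dfs -ls` on the two sharelib directories (the sharelib path and Oozie URL below are placeholders):

```shell
#!/usr/bin/env bash
# Hypothetical listings; on a cluster, obtain them with something like:
#   hdfs dfs -ls /user/oozie/share/lib/lib_<timestamp>/oozie  | awk '{print $NF}' | xargs -n1 basename
#   hdfs dfs -ls /user/oozie/share/lib/lib_<timestamp>/spark2 | awk '{print $NF}' | xargs -n1 basename
oozie_jars="aws-java-sdk-core-1.10.6.jar
json-simple-1.1.jar"
spark2_jars="aws-java-sdk-core-1.10.6.jar
spark-core_2.11-2.1.0.jar"

# Jars present in BOTH directories are what trigger the
# "multiple times to the distributed cache" error; comm -12 prints
# only the lines common to both sorted inputs.
dupes=$(comm -12 <(echo "$oozie_jars" | sort) <(echo "$spark2_jars" | sort))
echo "$dupes"

# For each duplicate, delete it from ONE of the two directories, e.g.:
#   hdfs dfs -rm /user/oozie/share/lib/lib_<timestamp>/oozie/aws-java-sdk-core-1.10.6.jar
# then refresh the sharelib so Oozie picks up the change:
#   oozie admin -sharelibupdate -oozie http://<oozie-host>:11000/oozie
```

The actual deletion and sharelib refresh are left as comments since the sharelib timestamp directory differs per cluster.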

Hope this helps.


I tried deleting every jar that exists in both the 'oozie' and 'spark2' folders, but that alone did not work. One thing I did not do, though, was update the sharelib after deleting them before testing again. I will try updating the sharelib as you suggested and let you know how it goes :) Thanks btw :) –


After my vacation I finally tried what you said, and yes, it works now. Thank you so much :) –