Apache Spark - java.lang.NoClassDefFoundError

I have a Maven-based mixed Scala/Java application that submits Spark jobs. My application jar "myapp.jar" has some nested jars in its lib folder; one of them is "common.jar". I defined the Class-Path attribute in the manifest file as Class-Path: lib/common.jar. When the application is submitted in yarn-client mode, the Spark executors throw java.lang.NoClassDefFoundError: com/myapp/common/myclass. The class (com/myapp/common/myclass.class) and the jar (common.jar) are both there, nested inside my main myapp.jar. The fat jar is built with the spring-boot-maven-plugin, which embeds the other jars inside the parent jar's lib folder. I don't want to build a shaded flat jar, because that causes other problems. Is there any way the Spark executor JVM can load the nested jars here?
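For context on why this fails: a stock JVM URLClassLoader sees a nested jar only as an opaque resource entry inside the outer jar; it cannot load classes or resources from inside it (Spring Boot's own launcher uses a special classloader for this, which Spark executors do not use). A minimal stdlib-only sketch demonstrating this, with illustrative file and entry names standing in for myapp.jar and lib/common.jar:

```java
import java.io.*;
import java.net.*;
import java.nio.file.*;
import java.util.jar.*;

public class NestedJarDemo {
    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("nested-jar-demo");

        // Build an inner jar holding a dummy resource (stand-in for common.jar).
        Path inner = dir.resolve("common.jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(inner))) {
            out.putNextEntry(new JarEntry("com/myapp/common/marker.txt"));
            out.write("hello".getBytes("UTF-8"));
            out.closeEntry();
        }

        // Build the outer "fat" jar embedding the inner jar under lib/.
        Path outer = dir.resolve("myapp.jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(outer))) {
            out.putNextEntry(new JarEntry("lib/common.jar"));
            Files.copy(inner, out);
            out.closeEntry();
        }

        // A plain URLClassLoader pointed at the outer jar can see lib/common.jar
        // as an entry, but cannot reach anything *inside* that nested jar.
        try (URLClassLoader cl = new URLClassLoader(new URL[] { outer.toUri().toURL() }, null)) {
            System.out.println("nested jar visible as entry: "
                + (cl.findResource("lib/common.jar") != null));              // true
            System.out.println("resource inside nested jar visible: "
                + (cl.findResource("com/myapp/common/marker.txt") != null)); // false
        }
    }
}
```

This matches the symptom below: classes at the top level of myapp.jar resolve, while classes that live only inside lib/common.jar do not.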
EDIT: Spark (the JVM classloader) can find all classes that are inside myapp.jar itself, i.e. com/myapp/abc.class, com/myapp/xyz.class, etc.
EDIT 2: The Spark executor classloader can also find some classes from the nested jars, but it throws NoClassDefFoundError for some other classes from the same nested jar! The error:
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, host4.local): java.lang.NoClassDefFoundError: com/myapp/common/myclass
at com.myapp.UserProfileRDD$.parse(UserProfileRDDInit.scala:111)
at com.myapp.UserProfileRDDInit$$anonfun$generateUserProfileRDD$1.apply(UserProfileRDDInit.scala:87)
at com.myapp.UserProfileRDDInit$$anonfun$generateUserProfileRDD$1.apply(UserProfileRDDInit.scala:87)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:249)
at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:172)
at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:79)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.ClassNotFoundException: com.myapp.common.myclass
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 14 more
I submit myapp.jar with sparkConf.setJars(new String[] {"myapp.jar"}).
I also tried setting spark.yarn.executor.extraClassPath.
編輯3 作爲一種變通方法,我提取myapp.jar和手動設置sparkConf.setJar(String[] {"myapp.jar","lib/common.jar"})
和錯誤消失了,但顯然我必須爲所有不希望的嵌套jar做到這一點。
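The EDIT 3 workaround could plausibly be automated with plain java.util.jar: extract every lib/*.jar entry from the fat jar into a scratch directory and pass the resulting paths (plus the fat jar itself) to sparkConf.setJars. A hedged, stdlib-only sketch, not a definitive fix; the class and file names are placeholders, and the main method just builds a dummy fat jar to demonstrate:

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;
import java.util.jar.*;

public class ExtractNestedJars {
    /** Extracts every lib/*.jar entry of fatJar into destDir and returns their
     *  paths, suitable for passing to sparkConf.setJars along with the fat jar. */
    static List<String> extract(Path fatJar, Path destDir) throws IOException {
        List<String> paths = new ArrayList<>();
        try (JarFile jar = new JarFile(fatJar.toFile())) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry e = entries.nextElement();
                if (!e.isDirectory()
                        && e.getName().startsWith("lib/")
                        && e.getName().endsWith(".jar")) {
                    Path out = destDir.resolve(
                        Paths.get(e.getName()).getFileName().toString());
                    try (InputStream in = jar.getInputStream(e)) {
                        Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
                    }
                    paths.add(out.toString());
                }
            }
        }
        return paths;
    }

    public static void main(String[] args) throws Exception {
        // Demo: build a tiny fat jar with one nested jar entry, then extract it.
        Path dir = Files.createTempDirectory("extract-demo");
        Path fat = dir.resolve("myapp.jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(fat))) {
            out.putNextEntry(new JarEntry("lib/common.jar"));
            out.write(new byte[] {0x50, 0x4b}); // placeholder bytes
            out.closeEntry();
        }
        List<String> jars = extract(fat, dir);
        System.out.println(jars.size() + " nested jar(s) extracted");
        System.out.println(jars.get(0).endsWith("common.jar"));
    }
}
```

The extracted paths could then be combined with the fat jar itself, e.g. something like sparkConf.setJars(allPaths.toArray(new String[0])), so new nested jars do not require manual listing.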
Whoever downvoted this, please care to justify. – nir