2017-06-15 98 views

I have some code that parses Thrift objects out of Snappy-compressed files. This code runs fine standalone (outside of a Spark environment). However, when run from within Spark, I get an IllegalAccessError from the org.iq80.snappy package. Has anyone else seen this error, and do you have any suggestions? Any pointers appreciated. Detailed exception below; Spark may be conflicting with org.iq80.snappy:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalAccessError: tried to access class org.iq80.snappy.BufferRecycler from class org.iq80.snappy.AbstractSnappyInputStream 
    at org.iq80.snappy.AbstractSnappyInputStream.<init>(AbstractSnappyInputStream.java:91) 
    at org.iq80.snappy.SnappyFramedInputStream.<init>(SnappyFramedInputStream.java:38) 
    at DistMatchMetric$1.call(DistMatchMetric.java:131) 
    at DistMatchMetric$1.call(DistMatchMetric.java:123) 
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1015) 
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
    at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
    at scala.collection.TraversableOnce$class.reduceLeft(TraversableOnce.scala:172) 
    at scala.collection.AbstractIterator.reduceLeft(Iterator.scala:1157) 
    at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$14.apply(RDD.scala:1011) 
    at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$14.apply(RDD.scala:1009) 
    at org.apache.spark.SparkContext$$anonfun$36.apply(SparkContext.scala:1951) 
    at org.apache.spark.SparkContext$$anonfun$36.apply(SparkContext.scala:1951) 
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) 
    at org.apache.spark.scheduler.Task.run(Task.scala:89) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 

Having the same problem here and no solution so far. Did you ever find a fix? –

Answer


Just solved the same problem. It happens because the Spark environment itself ships a conflicting Snappy dependency, so execution inside Spark fails even though local execution works. You should use your build system to shade the org.iq80.snappy package; that should fix your problem.
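For a Maven build, the shading could look like the sketch below, which relocates org.iq80.snappy inside the application jar so it no longer collides with the copy on Spark's classpath. The plugin version and the `myapp.shaded` relocation prefix are illustrative assumptions; adjust them to your project.

```xml
<!-- pom.xml excerpt (sketch): maven-shade-plugin relocates org.iq80.snappy
     so the application's bundled copy cannot clash with the one in the
     Spark runtime. Version and shadedPattern prefix are illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.iq80.snappy</pattern>
            <shadedPattern>myapp.shaded.org.iq80.snappy</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, the shaded jar contains the snappy classes under the relocated package, and bytecode references in your own classes are rewritten to match, so the Spark-provided org.iq80.snappy classes are never mixed with yours.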