How can I use GzipCodec or BZip2Codec for shuffle spill compression with the Spark shell?

When starting the spark shell I pass -Dspark.io.compression.codec=org.apache.hadoop.io.compress.GzipCodec. Since our cluster has limited space, I would like to use a more aggressive compression codec, but how can I use BZip2Codec and avoid the exception below? Is it even possible?
java.lang.NoSuchMethodException: org.apache.hadoop.io.compress.BZip2Codec.<init>(org.apache.spark.SparkConf)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getConstructor(Class.java:1718)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:48)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:42)
at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:106)
at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcast.scala:70)
at org.apache.spark.broadcast.BroadcastManager.initialize(Broadcast.scala:81)
at org.apache.spark.broadcast.BroadcastManager.<init>(Broadcast.scala:68)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:175)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:141)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:956)
at $iwC$$iwC.<init>(<console>:8)
at $iwC.<init>(<console>:14)
at <init>(<console>:16)
at .<init>(<console>:20)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:772)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1040)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:609)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:640)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:604)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:795)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:840)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:752)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:119)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:118)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:258)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:118)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:55)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:912)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:140)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:55)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:102)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:55)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:929)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:883)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:883)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:883)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:981)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
Running hadoop checknative gives:
14/06/13 17:41:24 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
14/06/13 17:41:24 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/lib/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib/x86_64-linux-gnu/libz.so.1
snappy: true /usr/lib/hadoop/lib/native/libsnappy.so.1
lz4: true revision:99
bzip2: true /lib/x86_64-linux-gnu/libbz2.so.1
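For what it's worth, the NoSuchMethodException above indicates that Spark instantiates whatever class is named in spark.io.compression.codec through a constructor that takes a SparkConf, which Hadoop codecs such as BZip2Codec do not have. A minimal, untested sketch of a wrapper that might bridge the two — assuming org.apache.spark.io.CompressionCodec is extendable in this Spark version, and with the class name SparkBZip2Codec being purely hypothetical:

```scala
import java.io.{InputStream, OutputStream}

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.compress.BZip2Codec
import org.apache.spark.SparkConf
import org.apache.spark.io.CompressionCodec

// Hypothetical adapter: exposes Hadoop's BZip2Codec through the
// SparkConf-taking constructor that Spark's codec loader expects.
class SparkBZip2Codec(conf: SparkConf) extends CompressionCodec {

  // Hadoop codecs are configured via a Hadoop Configuration, not SparkConf.
  private val hadoopCodec = new BZip2Codec()
  hadoopCodec.setConf(new Configuration())

  override def compressedOutputStream(s: OutputStream): OutputStream =
    hadoopCodec.createOutputStream(s)

  override def compressedInputStream(s: InputStream): InputStream =
    hadoopCodec.createInputStream(s)
}
```

If something like this works, you would put the compiled class on the driver/executor classpath and start the shell with -Dspark.io.compression.codec=SparkBZip2Codec (fully qualified). This is only a sketch of the idea, not a verified solution.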
Did you check that the Hadoop you are using includes the bzip2 libraries? Could you update your question with the result of the 'hadoop checknative' command? – eliasah
Updated @eliasah – samthebest
Could this be a version conflict issue? Which Hadoop version are you using in your cluster? – user1314742