
I have a Spark cluster with six slaves, and the default Spark example pi.py runs successfully in my environment. However, I get a ClassNotFoundException when Python runs the Spark kmeans example.

Running the default Spark example kmeans.py fails as follows:

./bin/spark-submit --master spark://master_ip:7077 examples/src/main/python/mllib/kmeans.py data/mllib/kmeans_data.txt 2 

The error message is:

17/03/03 10:21:21 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.xx.xx.xx:42586 (size: 6.5 KB, free: 366.3 MB) 
17/03/03 10:21:22 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 7772374377312901948 
java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveSparkAppConfig$ 
     at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
     at java.lang.Class.forName0(Native Method) 
     at java.lang.Class.forName(Class.java:348) 
     at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67) 
     at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620) 
     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521) 
     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781) 
     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353) 
     at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018) 
     at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942) 
     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808) 
     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353) 
     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373) 
     at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75) 
     at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108) 
     at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259) 
     at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) 
     at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308) 
     at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258) 
     at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) 
     at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:578) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:563) 
     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:159) 
     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) 
     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
     at java.lang.Thread.run(Thread.java:745) 

The example kmeans.py is written in Python and runs in a Python environment, yet the error I get is a Java one. Any help with this issue?


This was resolved by removing the one slave that caused the error. I still don't know the root cause, since other non-MLlib programs run fine on the cluster. – ybdesire

Answer


I ran into the same error; it turned out to be a version mismatch between the slaves and the master. A few details:

  • Master (Spark 2.0.2)
  • Slave1 (Spark 2.0.2)
  • Slave2 (Spark 2.0.2 / Spark 2.1.1)

In short, here is the oddity: Slave2 had two versions of Spark installed. If a process tried to use the Spark 2.1.1 install on its own, I got your error and the job could not finish (it had to be killed).

Conversely, if a process ran with the correct 2.0.2 version, everything worked fine. Moreover, if a 2.0.2 Spark process was already running, the 2.1.1 version could then also run without errors (strange).

Basically, my "answer" is to check the versions everywhere, including both Spark and Python. Hope that helps.
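To make that check concrete, here is a minimal sketch (not from the original answer) that submits one tiny PySpark job and has every worker report its hostname, Python version, and SPARK_HOME, so a mismatched node stands out. The app name and the partition multiplier are arbitrary choices:

import os
import socket
import sys

from pyspark import SparkContext

sc = SparkContext(appName="version-check")  # hypothetical app name

def report(_):
    # Runs on an executor: report which host and environment handled this task.
    yield (socket.gethostname(),
           sys.version.split()[0],
           os.environ.get("SPARK_HOME", "unknown"))

# Oversubscribe the partition count so every slave is likely to get a task.
num_parts = sc.defaultParallelism * 4
rows = sc.parallelize(range(num_parts), num_parts).mapPartitions(report).collect()
for host, py_version, spark_home in sorted(set(rows)):
    print(host, py_version, spark_home)

print("driver Spark version:", sc.version)
sc.stop()

Submit it with the same spark-submit invocation as above. Any node whose SPARK_HOME or Python version differs from the rest is a likely source of the mismatch, which would be consistent with a worker failing to deserialize RetrieveSparkAppConfig$, a driver-side message class that an older Spark build may not have.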
