First off, I'll say the only place I've seen this issue addressed is here: Spark 1.6.1 SASL. However, even with the Spark and YARN authentication configuration added, it still doesn't work. Below is the Spark configuration I'm using with spark-submit on Amazon EMR, on a YARN cluster:
SparkConf sparkConf = new SparkConf().setAppName("secure-test");
sparkConf.set("spark.authenticate.enableSaslEncryption", "true");
sparkConf.set("spark.network.sasl.serverAlwaysEncrypt", "true");
sparkConf.set("spark.authenticate", "true");
sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
sparkConf.set("spark.kryo.registrator", "org.nd4j.Nd4jRegistrator");
try {
    sparkConf.registerKryoClasses(new Class<?>[]{
        Class.forName("org.apache.hadoop.io.LongWritable"),
        Class.forName("org.apache.hadoop.io.Text")
    });
} catch (ClassNotFoundException e) {
    // Fail fast instead of silently swallowing the exception
    throw new RuntimeException("Could not register Kryo classes", e);
}
sparkContext = new JavaSparkContext(sparkConf);
sparkContext.hadoopConfiguration().set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem");
sparkContext.hadoopConfiguration().set("fs.s3a.enableServerSideEncryption", "true");
sparkContext.hadoopConfiguration().set("spark.authenticate", "true");
Note that I set spark.authenticate on the sparkContext's Hadoop configuration in code rather than in core-site.xml (which I assume I'm allowed to do, since other settings set this way work as well).
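For context, my understanding (an assumption on my part, pieced together from the Spark-on-YARN docs, not something I've verified) is that YarnShuffleService runs inside the NodeManager and reads spark.authenticate from the NodeManager's own Hadoop configuration, so the cluster side would need something like this in yarn-site.xml:

```xml
<!-- yarn-site.xml on each NodeManager; my assumption of what the
     shuffle-service side needs for SASL to be negotiated -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
<property>
  <!-- YarnShuffleService reads this from the NodeManager config,
       not from the application's SparkConf -->
  <name>spark.authenticate</name>
  <value>true</value>
</property>
```

If that assumption is right, setting spark.authenticate only in my application would explain a mismatch with an unauthenticated shuffle service.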
Looking here: https://github.com/apache/spark/blob/master/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java it seems that both spark.authenticate settings are necessary. When I run this application, I get the following stack trace.
17/01/03 22:10:23 INFO storage.BlockManager: Registering executor with local external shuffle service.
17/01/03 22:10:23 ERROR client.TransportClientFactory: Exception while bootstrapping client after 178 ms
java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22
    at org.apache.spark.network.shuffle.protocol.BlockTransferMessage$Decoder.fromByteBuffer(BlockTransferMessage.java:67)
    at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.receive(ExternalShuffleBlockHandler.java:71)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:745)
In Spark's documentation, it says:
For Spark on YARN deployments, configuring spark.authenticate to true will automatically handle generating and distributing the shared secret. Each application will use a unique shared secret.
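To rule out how I'm submitting the job, this is the shape of my spark-submit invocation (the jar name and class here are placeholders, not my real ones; the --conf flags mirror the SparkConf settings above):

```
# Placeholders: secure-test.jar / com.example.SecureTest stand in for my real artifact
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.authenticate=true \
  --conf spark.authenticate.enableSaslEncryption=true \
  --conf spark.network.sasl.serverAlwaysEncrypt=true \
  --class com.example.SecureTest \
  secure-test.jar
```

So per the quoted documentation, this alone should be enough for YARN to generate and distribute the shared secret.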
That seems wrong given the comments in the YARN file above, but in troubleshooting this problem I'm still at a loss as to where I should look to get it working. Am I missing something obvious that's documented somewhere?