
Spark is unable to delete its temporary directory. I am trying to submit a Spark program from CMD on Windows 10 with the command below:

spark-submit --class abc.Main --master local[2] C:\Users\arpitbh\Desktop\AmdocsIDE\workspace\Line_Count_Spark\target\Line_Count_Spark-0.0.1-SNAPSHOT.jar 

but after running it, I get the following error:

17/05/02 11:56:57 INFO ShutdownHookManager: Deleting directory C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9 
17/05/02 11:56:57 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9 
java.io.IOException: Failed to delete: C:\Users\arpitbh\AppData\Local\Temp\spark-03f14dbe-1802-40ca-906c-af8de0f462f9 
     at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010) 
     at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65) 
     at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62) 
     at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) 
     at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186) 
     at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62) 
     at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188) 
     at scala.util.Try$.apply(Try.scala:192) 
     at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188) 
     at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178) 
     at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54) 

I have also checked the Apache Spark JIRA; this defect has been marked as resolved, but no solution is mentioned there. Please help. My code is below.

package abc; 

import org.apache.spark.SparkConf; 
import org.apache.spark.api.java.JavaRDD; 
import org.apache.spark.api.java.JavaSparkContext; 


public class Main { 

    /** 
    * @param args 
    */ 
    public static void main(String[] args) { 
     // TODO Auto-generated method stub 

     SparkConf conf = new SparkConf().setAppName("Line_Count").setMaster("local[2]"); 
     JavaSparkContext ctx = new JavaSparkContext(conf); 

     JavaRDD<String> textLoadRDD = ctx.textFile("C:/spark/README.md"); 
     System.out.println(textLoadRDD.count()); 
     // note: this reads java.io.tmpdir but the returned value is never used 
     System.getProperty("java.io.tmpdir"); 

    } 

} 
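As an aside, the driver above never calls ctx.stop(), so all cleanup is left to the JVM shutdown hook that logs the deletion error. Below is a minimal sketch (not from the original post) of the same job that stops the context explicitly and points Spark's scratch space at a dedicated directory via the standard spark.local.dir property; C:/tmp/spark is an arbitrary example path. This does not guarantee the temp directory can be deleted on Windows, but it makes any leftover spark-* directories easier to find and clean up.

package abc;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {

    public static void main(String[] args) {
     // same job as above, with an explicit scratch directory and an explicit stop()
     SparkConf conf = new SparkConf()
       .setAppName("Line_Count")
       .setMaster("local[2]")
       .set("spark.local.dir", "C:/tmp/spark"); // example path; any writable directory works

     JavaSparkContext ctx = new JavaSparkContext(conf);
     try {
      JavaRDD<String> textLoadRDD = ctx.textFile("C:/spark/README.md");
      System.out.println(textLoadRDD.count());
     } finally {
      ctx.stop(); // release file handles before the shutdown hook runs
     }
    }

}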

Welcome to Stack Overflow. Please see [how to format code](https://stackoverflow.com/editing-help) –


Can you provide your code? –


I have updated my question with the properly formatted code. Please check. –

Answer


This is probably because you do not have SPARK_HOME or HADOOP_HOME set, which is what allows the program to find winutils.exe in the bin directory when instantiating the SparkContext. I found that when I changed from

SparkConf conf = new SparkConf(); 
JavaSparkContext sc = new JavaSparkContext(conf); 

to

JavaSparkContext sc = new JavaSparkContext("local[*]", "programname", 
    System.getenv("SPARK_HOME"), System.getenv("JARS")); 

the error went away.
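For completeness, here is a sketch of what that alternative constructor can look like in a full main method. SPARK_HOME should point at the Spark installation, and JARS is assumed to be a user-defined environment variable holding the path of the application jar (it is not a standard Spark variable); the app name and input path are just the examples used earlier in this question.

package abc;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {

    public static void main(String[] args) {
     // master, app name, Spark home, and the jar to ship to executors
     JavaSparkContext sc = new JavaSparkContext("local[*]", "Line_Count",
       System.getenv("SPARK_HOME"), System.getenv("JARS"));
     try {
      JavaRDD<String> lines = sc.textFile("C:/spark/README.md");
      System.out.println(lines.count());
     } finally {
      sc.stop();
     }
    }

}

On Windows, winutils.exe is usually expected under %HADOOP_HOME%\bin, so setting HADOOP_HOME (and putting %HADOOP_HOME%\bin on PATH) before running spark-submit is also worth trying; whether that makes the temp-directory error disappear depends on the environment.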