
Spark's RDD has a saveAsTextFile function. But how can I open a file and write a simple string to Hadoop storage? In other words, how do I write a file to Hadoop from Spark without an RDD?

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

val sparkConf: SparkConf = new SparkConf().setAppName("example")
val sc: SparkContext = new SparkContext(sparkConf)

sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "...")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "...")

val lines: RDD[String] = sc.textFile("s3n://your-output-bucket/lines.txt")
val lengths: RDD[Int] = lines.map(_.length)
lengths.saveAsTextFile("s3n://your-output-bucket/lengths.txt")

val numLines: Long = lines.count 
val resultString: String = s"numLines: $numLines" 
// how to save resultString to "s3n://your-output-bucket/result.txt" 

sc.stop() 

Answers


Why not do the following?

val strings = sc.parallelize(Seq("hello", "there"), <numPartitions>) 
strings.saveAsTextFile("<path-to-file>") 
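Note that saveAsTextFile writes a directory of part files (part-00000, part-00001, ...) rather than a single file. Applied to the question's resultString, a minimal sketch with a single partition so only one part file is produced:

val sketch = sc.parallelize(Seq(resultString), 1) // one partition => one part-00000 file
sketch.saveAsTextFile("s3n://your-output-bucket/result.txt") // result.txt is a directory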

Otherwise, you may need to look at the Hadoop API to write a file and call that code explicitly from your driver.


Assuming you have a SparkContext bound to sc:

import java.io.{BufferedWriter, OutputStreamWriter}

// Obtain the FileSystem from the SparkContext's Hadoop configuration
val hdfs = org.apache.hadoop.fs.FileSystem.get(sc.hadoopConfiguration)

val outputPath =
    new org.apache.hadoop.fs.Path("hdfs://localhost:9000/tmp/hello.txt")

val overwrite = true

// Open an output stream on HDFS and write the string directly
val bw =
    new BufferedWriter(new OutputStreamWriter(hdfs.create(outputPath, overwrite)))
bw.write("Hello, world")
bw.close()

Note: for simplicity, there is no code to close the writer in case of an exception.
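For reference, a minimal sketch of the same write with the writer closed in a finally block:

val bw =
    new BufferedWriter(new OutputStreamWriter(hdfs.create(outputPath, overwrite)))
try {
  bw.write("Hello, world")
} finally {
  bw.close() // closes the stream even if write throws
}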


Thanks. Can I use an "s3n://your-output-bucket/result.txt" URL instead of "hdfs://localhost:9000/tmp/hello.txt"?
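Yes, provided the FileSystem is resolved for the s3n scheme rather than the default one. A minimal sketch, assuming the s3n credentials are already set on sc.hadoopConfiguration as in the question:

import java.io.{BufferedWriter, OutputStreamWriter}

// Resolving the FileSystem from the Path picks the implementation for
// its scheme (s3n here) instead of the configured default FileSystem.
val path = new org.apache.hadoop.fs.Path("s3n://your-output-bucket/result.txt")
val fs = path.getFileSystem(sc.hadoopConfiguration)

val bw = new BufferedWriter(new OutputStreamWriter(fs.create(path, true)))
try {
  bw.write(resultString)
} finally {
  bw.close()
}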
