2016-11-15 219 views
3
test.csv 
name,key1,key2 
A,1,2 
B,1,3 
C,4,3 

I want to transform this data into the following (as a Dataset or RDD) in Spark with Scala.

whatIwant.csv 
name,key,newkeyname 
A,1,KEYA 
A,2,KEYB 
B,1,KEYA 
B,3,KEYB 
C,4,KEYA 
C,3,KEYB 

I load the .csv data with the read method:

val df = spark.read 
      .option("header", true) 
      .option("charset", "euc-kr") 
      .csv(csvFilePath) 

I can load each pair, (name, key1) or (name, key2), as a separate dataset and then union them, but I'd like to do this in a single Spark session. Any ideas?
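The union approach described above can be sketched on plain Scala collections, with no Spark required (column positions assumed from test.csv; in Spark SQL the same plan would be `df.select(...).union(df.select(...))`):

```scala
// Sample rows from test.csv as (name, key1, key2) tuples.
val rows = Seq(("A", 1, 2), ("B", 1, 3), ("C", 4, 3))

// Project (name, key1) tagged KEYA, project (name, key2) tagged KEYB,
// then union the two projections and sort to get the desired layout.
val keyA = rows.map { case (name, k1, _) => (name, k1, "KEYA") }
val keyB = rows.map { case (name, _, k2) => (name, k2, "KEYB") }
val result = (keyA ++ keyB).sortBy(r => (r._1, r._3))
// result matches whatIwant.csv: (A,1,KEYA), (A,2,KEYB), (B,1,KEYA), ...
```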


These don't work:

val df2 = df.select(df("TAG_NO"), df.map { x => (x.getAs[String]("MK_VNDRNM"), x.getAs[String]("WK_ORD_DT")) }) 

val df2 = df.select(df("TAG_NO"), Seq(df("TAG_NO"), df("WK_ORD_DT"))) 
+0

Have you tried the `explode` function on the DataFrame? – Shankar

+0

Nope. I'll try explode. Thanks :) –

+0

Since key1 and key2 are not in a single column, I don't think explode by itself is the right answer. –

Answer

2

This can be done with explode and a udf:

scala> val df = Seq(("A", 1, 2), ("B", 1, 3), ("C", 4, 3)).toDF("name", "key1", "key2") 
df: org.apache.spark.sql.DataFrame = [name: string, key1: int ... 1 more field] 

scala> df.show 
+----+----+----+ 
|name|key1|key2| 
+----+----+----+ 
| A| 1| 2| 
| B| 1| 3| 
| C| 4| 3| 
+----+----+----+ 

scala> val explodeUDF = udf((v1: String, v2: String) => Vector((v1, "Key1"), (v2, "Key2"))) 
explodeUDF: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function2>,ArrayType(StructType(StructField(_1,StringType,true), StructField(_2,StringType,true)),true),Some(List(StringType, StringType))) 

scala> val df = df.withColumn("TMP", explode(explodeUDF($"key1", $"key2"))).drop("key1", "key2") 
df: org.apache.spark.sql.DataFrame = [name: string, TMP: struct<_1: string, _2: string>] 

scala> val df = df.withColumn("key", $"TMP".apply("_1")).withColumn("new key name", $"TMP".apply("_2")) 
df: org.apache.spark.sql.DataFrame = [name: string, TMP: struct<_1: string, _2: string> ... 2 more fields] 

scala> val df = df.drop("TMP") 
df: org.apache.spark.sql.DataFrame = [name: string, key: string ... 1 more field] 

scala> df.show 
+----+---+------------+ 
|name|key|new key name| 
+----+---+------------+ 
| A| 1|  Key1| 
| A| 2|  Key2| 
| B| 1|  Key1| 
| B| 3|  Key2| 
| C| 4|  Key1| 
| C| 3|  Key2| 
+----+---+------------+ 
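Stripped of Spark, the udf plus explode above is just a one-row-to-two-rows flatMap; a minimal sketch on plain Scala collections (using the Key1/Key2 labels from the answer's output) is:

```scala
// Sample rows from test.csv as (name, key1, key2) tuples.
val rows = Seq(("A", 1, 2), ("B", 1, 3), ("C", 4, 3))

// Each wide row explodes into two long rows, mirroring what
// explodeUDF followed by explode does in the answer above.
val long = rows.flatMap { case (name, k1, k2) =>
  Seq((name, k1, "Key1"), (name, k2, "Key2"))
}
```

In Spark itself the udf can also be avoided entirely by exploding an array of structs built with the built-in `array`, `struct`, and `lit` functions from `org.apache.spark.sql.functions`.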
+1

It works! It's a bit different from my original question, but it gets the job done. Thanks a lot :) –