I am working with Spark RDDs, and I need to append/concatenate two RDDs of type Set in Apache Spark.
scala> var ek: RDD[Set[Int]] = sc.parallelize(Seq(Set(7)))
ek: org.apache.spark.rdd.RDD[Set[Int]] = ParallelCollectionRDD[31] at parallelize at <console>:32
scala> val vi: RDD[Set[Int]] = sc.parallelize(Seq(Set(3,5)))
vi: org.apache.spark.rdd.RDD[Set[Int]] = ParallelCollectionRDD[32] at parallelize at <console>:32
scala> val z = vi.union(ek)
z: org.apache.spark.rdd.RDD[Set[Int]] = UnionRDD[34] at union at <console>:36
scala> z.collect
res15: Array[Set[Int]] = Array(Set(3, 5), Set(7))
scala> val t = vi ++ ek
t: org.apache.spark.rdd.RDD[Set[Int]] = UnionRDD[40] at $plus$plus at <console>:36
scala> t.collect
res30: Array[Set[Int]] = Array(Set(3, 5), Set(7))
I have tried both operators, union and ++, but neither produces the expected result; both give:
Array(Set(3, 5), Set(7))
The expected result should look like this:
scala> val u = Set(3,5)
u: scala.collection.immutable.Set[Int] = Set(3, 5)
scala> val o = Set(7)
o: scala.collection.immutable.Set[Int] = Set(7)
scala> u.union(o)
res28: scala.collection.immutable.Set[Int] = Set(3, 5, 7)
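For reference, here is a minimal sketch (my assumption, not necessarily the idiomatic way) of how the merged Set could be obtained from the unioned RDD, using RDD.reduce to combine the element Sets pairwise:
// minimal sketch (assumption): collapse the unioned RDD of Sets into a single
// Set by merging the element Sets pairwise; note this returns a plain Set[Int]
// on the driver rather than an RDD
val merged: Set[Int] = vi.union(ek).reduce(_ ++ _)
// merged should be Set(3, 5, 7)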
Can anyone point me in the right direction on how to achieve this?