How to merge two RDDs into one RDD [duplicate]

Posted 2019-07-22 12:11

Question:

This question already has an answer here:

  • Concatenating datasets of different RDDs in Apache spark using scala (2 answers)

Help, I have two RDDs and I want to merge them into one RDD. This is my code.

val us1 = sc.parallelize(Array(("3L"), ("7L"),("5L"),("2L")))
val us2 = sc.parallelize(Array(("432L"), ("7123L"),("513L"),("1312L")))

Answer 1:

Just use union:

val merged = us1.union(us2)

Documentation is here

A shortcut in Scala is:

val merged = us1 ++ us2
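
Applied to the RDDs from the question, a minimal sketch of the result (collect preserves partition order, so for RDDs built with parallelize the elements of us1 simply come before those of us2):

val merged = us1 ++ us2   // equivalent to us1.union(us2)
merged.collect()
// Array[String] = Array(3L, 7L, 5L, 2L, 432L, 7123L, 513L, 1312L)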


Answer 2:

You need RDD.union. These don't join on a key; union doesn't really do anything itself, so it is low overhead. Note that the combined RDD will have all the partitions of the original RDDs, so you may want to coalesce after the union.

val x = sc.parallelize(Seq( (1, 3), (2, 4) ))
val y = sc.parallelize(Seq( (3, 5), (4, 7) ))
val z = x.union(y)
z.collect
res0: Array[(Int, Int)] = Array((1,3), (2,4), (3,5), (4,7))
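
To illustrate the note about partitions, a minimal sketch (the counts shown assume parallelize split each input into 2 partitions, which depends on the cluster's default parallelism):

x.getNumPartitions      // e.g. 2
y.getNumPartitions      // e.g. 2
z.getNumPartitions      // 4: the union keeps every partition of both inputs
val zc = z.coalesce(2)  // optionally reduce the partition count after the union
zc.getNumPartitions     // 2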

API

def ++(other: RDD[T]): RDD[T]

Return the union of this RDD and another one.

def ++ API

def union(other: RDD[T]): RDD[T]

Return the union of this RDD and another one. Any identical elements will appear multiple times (use .distinct() to eliminate them).

def union API
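
A minimal sketch of the duplicate behaviour mentioned above (the small RDDs are just illustrative):

val a = sc.parallelize(Seq(1, 2, 3))
val b = sc.parallelize(Seq(3, 4))
a.union(b).collect()            // Array(1, 2, 3, 3, 4) -- the shared element 3 appears twice
a.union(b).distinct().collect() // Array(1, 2, 3, 4) -- distinct removes it (order may vary after the shuffle)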