In Scala I can flatten a collection using :
val array = Array(List("1,2,3").iterator,List("1,4,5").iterator)
//> array : Array[Iterator[String]] = Array(non-empty iterator, non-empty iterator)
array.toList.flatten //> res0: List[String] = List(1,2,3, 1,4,5)
But how can I do something similar in Spark?
Reading the API doc (http://spark.apache.org/docs/0.7.3/api/core/index.html#spark.RDD), there does not seem to be a method that provides this functionality.
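My guess is that `flatMap` could serve the same purpose, since in plain Scala `flatMap(identity)` is equivalent to `flatten`. I haven't verified this against the 0.7.3 RDD API, so the Spark part below is only a sketch (`sc` is assumed to be a `SparkContext`):

```scala
// Plain Scala: flatMap(identity) behaves the same as flatten
val nested = List(List("1", "2", "3"), List("1", "4", "5"))
val flat = nested.flatMap(identity)
// flat: List[String] = List(1, 2, 3, 1, 4, 5)

// Untested sketch of the same idea in Spark:
// val rdd = sc.parallelize(Seq(List("1", "2", "3"), List("1", "4", "5")))
// val flatRdd = rdd.flatMap(identity)   // RDD[String]
```

Is this the idiomatic way to do it, or is there a dedicated flatten-style operation I'm missing?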