I have a Spark DataFrame of the form org.apache.spark.sql.DataFrame = [user_key: string, field1: string]. When I use saveAsTextFile to save it to HDFS, the output lines look like [12345,xxxxx]. I don't want the opening and closing brackets written to the output file. Even if I convert it to an RDD with .rdd, the brackets are still present in the output.
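For reference, here is a minimal sketch that reproduces the bracketed output (hypothetical data and output path, Spark 1.x-era API):

import org.apache.spark.sql.SQLContext

// Hypothetical setup: sc is an existing SparkContext
val sqlContext = new SQLContext(sc)
val df = sqlContext.createDataFrame(Seq(("12345", "xxxxx"))).toDF("user_key", "field1")

// Each Row is written using its toString, which wraps the fields in brackets,
// so the file contains lines like [12345,xxxxx]
df.rdd.saveAsTextFile("hdfs:///some/output/path")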
Thanks
Just concatenate the values and store strings:
import org.apache.spark.sql.functions.{concat_ws, col}

// Join all columns into a single comma-separated string column
val expr = concat_ws(",", df.columns.map(col): _*)
df.select(expr).map(_.getString(0)).saveAsTextFile("some_path")
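If you're on Spark 2.x, DataFrame.map returns a Dataset instead of an RDD, so go through .rdd before saving (a sketch under that assumption, reusing the same placeholder path):

// Spark 2.x: .rdd exposes saveAsTextFile
df.select(expr).rdd.map(_.getString(0)).saveAsTextFile("some_path")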
Or even better, use spark-csv:
df.write
  .format("com.databricks.spark.csv")
  .option("header", "false")
  .save("some_path")
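On Spark 2.0+ the CSV writer is built in, so the external spark-csv package isn't needed (sketch, same placeholder path):

df.write
  .option("header", "false")
  .csv("some_path")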
Another approach is to simply map:
df.rdd.map(_.toSeq.map(_.toString).mkString(","))
and save afterwards, as in the sketch below.
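Putting it together (sketch, same placeholder path; note this assumes no null fields, since calling toString on a null would throw):

df.rdd
  .map(_.toSeq.map(_.toString).mkString(","))
  .saveAsTextFile("some_path")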