Convert Array of String column to multiple columns

Posted 2020-02-06 04:47

Question:

I have a DataFrame with the following schema:

id         : int,
emp_details: Array[String]

Some sample data:

1, Array(empname=xxx,city=yyy,zip=12345)
2, Array(empname=bbb,city=bbb,zip=22345)
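
For reference, here is a minimal sketch of how such a DataFrame (called df1 in the answer below) could be built for testing; the sqlContext name and the toDF call are assumptions for a Spark 1.6 setup:

// Assumption: a Spark 1.6 SQLContext named sqlContext is already in scope.
import sqlContext.implicits._

val df1 = Seq(
  (1, Array("empname=xxx", "city=yyy", "zip=12345")),
  (2, Array("empname=bbb", "city=bbb", "zip=22345"))
).toDF("id", "emp_details")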

This data is in a DataFrame, and I need to read emp_details from the array and assign its values to new columns as below, i.e. split the array into multiple columns named empname, city and zip:

.withColumn("empname", xxx)
.withColumn("city", yyy)
.withColumn("zip", 12345)

Could you please guide me on how to achieve this using Spark 1.6 with Scala?

Really appreciate your help...

Thanks a lot

Answer 1:

You can use withColumn and split to extract the required data:

import org.apache.spark.sql.functions.split

// emp_details(0) is the string "empname=xxx"; splitting it on "=" and taking index 1 yields the value.
df1.withColumn("empname", split($"emp_details"(0), "=")(1))
  .withColumn("city", split($"emp_details"(1), "=")(1))
  .withColumn("zip", split($"emp_details"(2), "=")(1))

Output:

+---+----------------------------------+-------+----+-----+
|id |emp_details                       |empname|city|zip  |
+---+----------------------------------+-------+----+-----+
|1  |[empname=xxx, city=yyy, zip=12345]|xxx    |yyy |12345|
|2  |[empname=bbb, city=bbb, zip=22345]|bbb    |bbb |22345|
+---+----------------------------------+-------+----+-----+

UPDATE:
If the order of the values in the array is not fixed, you can use a UDF that converts the array to a map and extracts the fields by name:

import org.apache.spark.sql.functions.udf

// Build a key -> value map from the "key=value" strings and return the three fields as a tuple.
val getColumnsUDF = udf((details: Seq[String]) => {
  val detailsMap = details.map(_.split("=")).map(x => (x(0), x(1))).toMap
  (detailsMap("empname"), detailsMap("city"), detailsMap("zip"))
})

Now use the UDF:

df1.withColumn("emp",getColumnsUDF($"emp_details"))
 .select($"id", $"emp._1".as("empname"), $"emp._2".as("city"), $"emp._3".as("zip"))
 .show(false)

Output:

+---+-------+----+-----+
|id |empname|city|zip  |
+---+-------+----+-----+
|1  |xxx    |yyy |12345|
|2  |bbb    |bbb |22345|
+---+-------+----+-----+
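
As a possible variant (not part of the original answer): instead of returning a tuple, the UDF could return the whole map and the fields could be picked by key with getItem, which avoids the positional _1/_2/_3 names. The toMapUDF and emp_map names below are made up for illustration:

import org.apache.spark.sql.functions.udf

// Hypothetical variant: keep the whole key -> value map in one column and
// select the fields of interest by key.
val toMapUDF = udf((details: Seq[String]) =>
  details.map(_.split("=", 2)).collect { case Array(k, v) => (k, v) }.toMap)

df1.withColumn("emp_map", toMapUDF($"emp_details"))
  .select($"id",
    $"emp_map".getItem("empname").as("empname"),
    $"emp_map".getItem("city").as("city"),
    $"emp_map".getItem("zip").as("zip"))
  .show(false)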

Hope this helps!