In a DataFrame object in Apache Spark (I'm using the Scala interface), if I'm iterating over its Row objects, is there any way to extract structure values by name?
I am using the code below to extract columns by name, but I am stuck on how to read the struct value. If the value were a string, we could have done this:
val resultDF = joinedDF.rdd.map { row =>
  val id     = row.getAs[Long]("id")
  val values = row.getAs[String]("slotSize")
  val fields = row.getAs[String](values)
  (id, values, fields)
}.toDF("id", "values", "fields")
But in my case, values has the schema below:
v1: struct (nullable = true)
| |-- level1: string (nullable = true)
| |-- level2: string (nullable = true)
| |-- level3: string (nullable = true)
| |-- level4: string (nullable = true)
| |-- level5: string (nullable = true)
Given that the value has the above struct schema, what should I replace this line with to make the code work?
row.getAs[String](values)
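For what it's worth, one approach I have seen is to read the struct column back as a nested Row and then extract its fields by name. A minimal sketch, assuming the dynamic column name in values points at the struct column (the level1/level2 field names come from the schema above):

```scala
import org.apache.spark.sql.Row

val resultDF = joinedDF.rdd.map { row =>
  val id     = row.getAs[Long]("id")
  val values = row.getAs[String]("slotSize")
  // A struct column comes back as a nested Row, not a String.
  val struct = row.getAs[Row](values)
  // Fields of the nested Row can be read by name (or by ordinal).
  val level1 = struct.getAs[String]("level1")
  val level2 = struct.getAs[String]("level2")
  (id, values, level1, level2)
}.toDF("id", "values", "level1", "level2")
```

Equivalently, row.getStruct(index) returns the nested Row when you know the column's ordinal position instead of its name.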