I have a nested JSON structure loaded into a DataFrame in Spark. It contains several layers of arrays, and I'm trying to figure out how to query this structure by values in the innermost arrays.
Example: consider the following structure (a directors.json file):
[
  {
    "director": "Steven Spielberg",
    "films": [
      {
        "name": "E.T",
        "actors": ["Henry Thomas", "Drew Barrymore"]
      },
      {
        "name": "The Goonies",
        "actors": ["Sean Astin", "Josh Brolin"]
      }
    ]
  },
  {
    "director": "Quentin Tarantino",
    "films": [
      {
        "name": "Pulp Fiction",
        "actors": ["John Travolta", "Samuel L. Jackson"]
      },
      {
        "name": "Kill Bill: Vol. 1",
        "actors": ["Uma Thurman", "Daryl Hannah"]
      }
    ]
  }
]
Let's say I want to run a query that returns all the films a specific actor has participated in. Something like this:
val directors = spark.read.json("directors.json")
directors.select($"films.name").where($"films.actors" === "Henry Thomas")
When I run this in the Spark shell I get an exception:
org.apache.spark.sql.AnalysisException: cannot resolve '(`films`.`actors` = 'Henry Thomas')' due to data type mismatch: differing types in '(`films`.`actors` = 'Henry Thomas')' (array<array<string>> and string).;;
'Project [name#128]
+- 'Filter (films#92.actors = Henry Thomas)
+- AnalysisBarrier
+- Project [films#92.name AS name#128, films#92]
+- Relation[director#91,films#92] json
How do I properly make such a query?
Are there different alternatives? If so, what are the pros and cons?
Thanks
Edit
@thebluephantom this still doesn't work; I get a similar exception. I think it's because I have an array nested inside another array. This is the exception:
org.apache.spark.sql.AnalysisException: cannot resolve 'array_contains(`films`.`actors`, 'Henry Thomas')' due to data type mismatch: Arguments must be an array followed by a value of same type as the array members;;
'Filter array_contains(films#7.actors, Henry Thomas)
+- AnalysisBarrier
+- Project [director#6, films#7]
+- Relation[director#6,films#7] json
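For reference, this is the kind of query I've been experimenting with: flattening the nested `films` array with `explode` first, so that `film.actors` becomes a plain `array<string>` and `array_contains` can apply. This is only a sketch of what I'm attempting, not a confirmed solution (I'm assuming the `multiLine` read option is needed here, since the file is a single top-level JSON array spanning multiple lines):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// multiLine is needed when the whole file is one JSON array
val directors = spark.read.option("multiLine", true).json("directors.json")

// Flatten one level: one row per (director, film) pair
val films = directors.select($"director", explode($"films").as("film"))

// After the explode, film.actors is array<string>, so array_contains applies
films
  .where(array_contains($"film.actors", "Henry Thomas"))
  .select($"film.name")
  .show()
```

Is this explode-then-filter approach the idiomatic way, or is there a way to filter directly on the nested `array<array<string>>` column?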