In Spark 1.6.0 / Scala, is there a way to get collect_list("colC") or collect_set("colC").over(Window.partitionBy("colA").orderBy("colB"))?
Given that you have a DataFrame with columns colA, colB, and colC, you can use Window functions by applying collect_list($"colC").over(Window.partitionBy($"colA").orderBy($"colB")). Because the window specification includes orderBy, each row receives the running list of colC values accumulated up to that row within its colA partition. Note that in Spark 1.6 collect_list and collect_set are backed by Hive UDAFs, so a HiveContext is typically required.
The result is similar for collect_set as well, except that duplicates are removed and the order of elements in the final set will not match the order preserved by collect_list.
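The original answer's Spark snippets were lost in extraction. As a plain-Scala sketch of the semantics (no Spark required; the rows and column layout colA/colB/colC are hypothetical, matching the question), the cumulative behavior of collect_list over an ordered window can be modeled with groupBy and scanLeft:

```scala
// Rows as (colA, colB, colC) tuples -- a stand-in for the DataFrame.
val rows = Seq(("a", 1, "x"), ("a", 2, "y"), ("a", 3, "z"), ("b", 1, "p"))

// Model of collect_list($"colC").over(Window.partitionBy($"colA").orderBy($"colB")):
// within each colA partition, sort by colB, then build the running list of colC
// values seen so far -- this mirrors an ordered window frame ending at the current row.
val windowed: Map[String, Seq[List[String]]] =
  rows.groupBy(_._1).map { case (key, part) =>
    val orderedC = part.sortBy(_._2).map(_._3)
    // scanLeft yields the cumulative lists; tail drops the empty seed.
    key -> orderedC.scanLeft(List.empty[String])(_ :+ _).tail
  }

// windowed("a") == Seq(List("x"), List("x", "y"), List("x", "y", "z"))
```

Each element of `windowed("a")` corresponds to one row of partition "a", in colB order, carrying the list that an ordered-window collect_list would attach to that row.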
If you remove orderBy from the window specification, the frame covers the entire partition, so every row in a partition receives the same complete collection rather than a running one.
I hope the answer is helpful.
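Again as a plain-Scala sketch (same hypothetical row layout, not Spark code): with no orderBy, each row of a partition sees the whole partition's values, and collect_set additionally de-duplicates:

```scala
// Rows as (colA, colB, colC) tuples; note the duplicate colC value "x" in partition "a".
val rows = Seq(("a", 1, "x"), ("a", 2, "y"), ("a", 2, "x"), ("b", 1, "p"))

// Without orderBy the window frame is the entire partition, so every row of a
// partition receives the same, complete list -- modeled here per partition key.
val fullList: Map[String, List[String]] =
  rows.groupBy(_._1).map { case (k, part) => k -> part.map(_._3).toList }

// collect_set behaves the same way but de-duplicates, with no guaranteed order.
val fullSet: Map[String, Set[String]] =
  rows.groupBy(_._1).map { case (k, part) => k -> part.map(_._3).toSet }

// fullList("a") == List("x", "y", "x");  fullSet("a") == Set("x", "y")
```

This is why, without orderBy, the per-row results stop being cumulative: there is only one frame per partition.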