I have a graph in Spark collected from different data sources. Is there a simple way to export a Spark GraphX graph to Gephi for visualization using Scala? Any common data formats?
As far as I know, the only way you can export the graph directly is to use some variation of CSV. All other formats supported by Gephi cannot be easily written in parallel.
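For example, a minimal sketch of that parallel CSV export, assuming `graph` is your `Graph[VD, ED]` (Gephi's spreadsheet importer expects `Source`/`Target` columns for edges, so you may need to add headers or map the columns when importing):

```scala
import org.apache.spark.graphx.Graph

// Write vertices and edges as plain CSV, one directory each, in parallel.
// Adjust the string formatting to your actual vertex/edge attribute types.
def saveAsCsv[VD, ED](graph: Graph[VD, ED], outputDir: String): Unit = {
  // One line per vertex: id,attribute
  graph.vertices
    .map { case (id, attr) => s"$id,$attr" }
    .saveAsTextFile(s"$outputDir/vertices")

  // One line per edge: source,target,attribute
  graph.edges
    .map(e => s"${e.srcId},${e.dstId},${e.attr}")
    .saveAsTextFile(s"$outputDir/edges")
}
```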
The problem with basic CSV is that it doesn't support attributes. Since the amount of data you can visualize in Gephi is rather limited anyway, a better approach could be to simply collect the edges and vertices and create a local file in a format that suits your needs, for example GEXF with gexf4j (see the sketch below).
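If the graph fits in driver memory, you can collect it and write GEXF yourself. The sketch below builds the XML by hand rather than going through gexf4j's API (which I won't reproduce from memory), and it skips XML-escaping of attribute values:

```scala
import java.io.PrintWriter
import org.apache.spark.graphx.Graph

// Collect the graph on the driver and render a minimal GEXF document.
// Only safe for graphs small enough to fit in driver memory.
def toGexf[VD, ED](g: Graph[VD, ED]): String = {
  val nodes = g.vertices
    .map { case (id, attr) => s"""      <node id="$id" label="$attr" />""" }
    .collect()
    .mkString("\n")
  val edges = g.edges
    .map(e => s"""      <edge source="${e.srcId}" target="${e.dstId}" label="${e.attr}" />""")
    .collect()
    .mkString("\n")

  s"""<?xml version="1.0" encoding="UTF-8"?>
     |<gexf xmlns="http://www.gexf.net/1.2draft" version="1.2">
     |  <graph mode="static" defaultedgetype="directed">
     |    <nodes>
     |$nodes
     |    </nodes>
     |    <edges>
     |$edges
     |    </edges>
     |  </graph>
     |</gexf>""".stripMargin
}

// Usage: write the result to a local file on the driver, then open it in Gephi.
// new PrintWriter("graph.gexf") { write(toGexf(graph)); close() }
```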
A mixed approach is to export the data with properties to CSV files, import them into Neo4j, and visualize using the Neo4j plugin.
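One way to prepare the CSV for that step, assuming you load it with Neo4j's bulk importer (`neo4j-admin import`), which takes header files separately: write value-only CSVs whose columns match headers such as `id:ID,attr` for nodes and `:START_ID,:END_ID,:TYPE,attr` for relationships.

```scala
import org.apache.spark.graphx.Graph

// Write node and relationship CSVs shaped for Neo4j's bulk importer.
// Header files (kept separately) would be:
//   nodes:         id:ID,attr
//   relationships: :START_ID,:END_ID,:TYPE,attr
def saveForNeo4j[VD, ED](graph: Graph[VD, ED], outputDir: String): Unit = {
  graph.vertices
    .map { case (id, attr) => s"$id,$attr" }
    .saveAsTextFile(s"$outputDir/nodes")

  graph.edges
    // "LINKS" is a placeholder relationship type; pick one that fits your data
    .map(e => s"${e.srcId},${e.dstId},LINKS,${e.attr}")
    .saveAsTextFile(s"$outputDir/relationships")
}
```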