I have a graph with 690,024 nodes and 7,398,042 edges and want to visualize it. I have tried Graphviz's dot and Gephi, but both terminate because they run out of memory. Is there any way to fix this?
Gephi has a limit based on the amount of memory allocated to its JVM. See http://gephi.org/users/requirements/ for how to change it.
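Concretely, you raise the -J-Xmx heap setting in Gephi's startup configuration. The file location and values below are only what a typical desktop install looks like and may differ by Gephi version and OS:

```
# etc/gephi.conf inside the Gephi install directory (path and defaults vary)
# Raise -J-Xmx to give the JVM more heap, e.g. 8 GB:
default_options="--branding gephi -J-Xms256m -J-Xmx8g"
```

Even with more heap, a 690k-node / 7.4M-edge graph may still be more than Gephi can lay out interactively on a single machine.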
I would suggest using the Stanford Network Analysis Platform (SNAP).
The core SNAP library is written in C++ and optimized for maximum performance and compact graph representation.
It easily scales to massive networks with hundreds of millions of nodes, and billions of edges. It efficiently manipulates large graphs, calculates structural properties, generates regular and random graphs, and supports attributes on nodes and edges. Besides scalability to large graphs, an additional strength of SNAP is that nodes, edges and attributes in a graph or a network can be changed dynamically during the computation.
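A minimal sketch using Snap.py, the Python interface (pip install snap-stanford); the file name and the choice of an undirected graph are assumptions about your data:

```python
import snap  # Snap.py, the Python bindings for SNAP

# Load an edge list with one "src dst" pair per line (file name is a placeholder);
# PUNGraph = undirected graph, columns 0 and 1 hold the edge endpoints.
G = snap.LoadEdgeList(snap.PUNGraph, "edges.txt", 0, 1)
print(G.GetNodes(), G.GetEdges())

# Structural properties scale to graphs of this size, e.g. an approximate
# effective diameter estimated from 100 BFS start nodes.
print(snap.GetBfsEffDiam(G, 100))
```

Note that SNAP itself is an analysis library; for an actual picture of a graph this size you would still pair it with one of the large-scale layout/rendering tools mentioned in the other answers.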
We've been building http://www.github.com/graphistry/pygraphistry to make this possible from most browsers and notebooks. The idea is to use WebGL to render the big graphs (pan/zoom/etc.) and offload most of the real-time compute (layout, filtering, etc.) to a GPU cloud. It's similar to Gephi and Cytoscape, but with more of a focus on scaling to big graphs, streamlining data analysis, and integrating into web apps and notebook environments.
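A rough sketch of what that looks like from a notebook; the credentials, server, and column names are placeholders, and the exact register() arguments depend on your PyGraphistry version:

```python
import pandas as pd
import graphistry

# Placeholder account details; point this at Graphistry Hub or a private server.
graphistry.register(api=3, username="YOUR_USER", password="YOUR_PASS")

# Assumed CSV with 'src' and 'dst' columns listing the edge endpoints.
edges = pd.read_csv("edges.csv")

# Bind the edge columns and ship the graph off for GPU layout and WebGL rendering;
# plot() opens/embeds an interactive view you can pan, zoom, and filter.
graphistry.bind(source="src", destination="dst").edges(edges).plot()
```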
You may want to consider Hive plots if you need to visualize relationships among all edges for a graph of this size. If you need to construct a graph where nodes and edges are displayed, you may want to try igraph, which has Python and R interfaces. I've constructed some very large graphs using igraph through R, but I can't recall the memory requirements (they probably depend on the data anyway). See the sketch below.
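If you go the igraph route, DrL is the layout intended for graphs of this size. A minimal Python sketch (file name, image size, and styling are assumptions; plotting to a file needs pycairo or cairocffi installed):

```python
import igraph as ig

# Assumed: an edge list file with one "src dst" pair per line.
g = ig.Graph.Read_Edgelist("edges.txt", directed=False)

# DrL is igraph's force-directed layout designed for very large graphs; it is far
# cheaper than Fruchterman-Reingold at this scale, but still CPU/memory hungry.
layout = g.layout_drl()

# Render straight to a file; with ~7.4M edges you need tiny vertices and very
# thin edges for the picture to be legible at all.
ig.plot(g, target="graph.png", layout=layout, bbox=(4000, 4000),
        vertex_size=1, edge_width=0.1)
```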
LGL used to work in such cases for static visualizations - at least it did some years ago :)
http://lgl.sourceforge.net/ http://www.ncbi.nlm.nih.gov/pubmed/15184029?dopt=Abstract
I used it, for example, for this figure: https://static-content.springer.com/image/art%3A10.1186%2F1471-2105-7-276/MediaObjects/12859_2006_Article_1015_Fig2_HTML.jpg
in this paper: http://bmcbioinformatics.biomedcentral.com/articles/10.1186/1471-2105-7-276 , with data having a comparable number of nodes and edges.
Try Tulip. I downloaded the source and rebuilt it; it's easy once you have the Qt SDK installed.
Edit: I think the Graphviz layout engine to use for large graphs is sfdp; see the first discussion (loading a very large graph failed).
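For reference, a typical sfdp invocation looks something like this (file names are placeholders; overlap handling and output format are things you will want to tune for a graph this large):

```
sfdp -Goverlap=scale -Tpng graph.dot -o graph.png
```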