[graph-tool] graph-tool memory usage
nikhrao at umich.edu
Tue Feb 9 00:12:34 CET 2021
Thanks for the response! Yes, sorry for the confusion. I think I have a key
misunderstanding (I haven't used python before). When you refer to
compilation, does python compile the code I write as I write it? Or rather,
is the code compiled once I decide to run it, and it's at this step that the
memory usage is high due to filtering?
For context, the networks that I'm working with have around 400,000 vertices
and 300,000 edges. If I tried fairly rudimentary tasks such as computing the
degree for each vertex or computing eigenvector centrality, would these
require large amounts of RAM (say, 90-100 GB) due to the filtering capabilities
that graph-tool has? I'm just trying to get an idea of the upper bound of
memory requirements that I may need on my server.
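For a rough sense of scale, here is a back-of-envelope memory estimate for a graph of this size. The per-vertex and per-edge byte counts below are assumed constants for illustration, not graph-tool's actual internal sizes; the point is only that a 400,000-vertex, 300,000-edge graph plus one double-valued property map per vertex lands in the tens of megabytes, nowhere near 90-100 GB.

```python
# Hypothetical back-of-envelope estimate; the byte constants are
# assumptions, not graph-tool internals.
N_VERTICES = 400_000
N_EDGES = 300_000

BYTES_PER_VERTEX = 64    # assumed adjacency-list overhead per vertex
BYTES_PER_EDGE = 32      # assumed overhead per edge
BYTES_PER_PROP = 8       # one double per vertex, e.g. a centrality score

graph_bytes = N_VERTICES * BYTES_PER_VERTEX + N_EDGES * BYTES_PER_EDGE
prop_bytes = N_VERTICES * BYTES_PER_PROP
total_mb = (graph_bytes + prop_bytes) / 1e6
print(f"rough estimate: {total_mb:.0f} MB")  # tens of MB, not tens of GB
```

Even with generous constants the total stays orders of magnitude below 90-100 GB, so tasks like per-vertex degree or eigenvector centrality on a graph of this size should be comfortably within an ordinary workstation's RAM.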
Sent from: https://nabble.skewed.de/