[graph-tool] Computation cost of creating a graph
gucko.gucko at googlemail.com
Sat Mar 22 10:28:06 CET 2014
Are there ready-made functions in graph-tool for reading edge lists? I
tried loading a graph with 4000 vertices; it took 60 seconds to load and
ended up consuming 600 MB of memory. Is that normal? I'm not sure whether
my way of loading the graph is the correct, common approach, so I'd
appreciate it if someone could take a look. I would be really thankful!
On Friday, March 21, 2014 5:03:33 PM UTC+1, Giuseppe Profiti wrote:
> if your primary concern is speed/code efficiency, I strongly suggest
> you use a profiler, either one of those listed in
> http://docs.python.org/2/library/profile.html or something else. I
> personally like kernprof; you can find out more here
>> Sorry for asking too many questions about efficiency today, but it seems
>> today is not my lucky day. I'm trying to build a graph by reading from a
>> text file. The graph has 1034 vertices and 53498 edges. Creating it takes
>> 6 seconds, so I was wondering whether it truly takes that long or whether
>> something in my way of creating the graph is inefficient. I have a text
>> file that contains the edges in the format: vertex_label_x vertex_label_y.
>> What I'm basically doing is maintaining a dictionary that maps vertex
>> labels in the text file to graph-tool vertex indices. As I read the file,
>> I check whether a label is already in the dictionary; if so, I retrieve
>> its index from the dictionary, otherwise I create a new vertex and store
>> its index under the corresponding label in the dictionary. Then I create
>> the edge using the indices of the source and target. I was wondering
>> whether this is the wrong way to do it.
>> This is my code: http://pastie.org/private/n9j7geizaosafmlimujca
>> Many thanks in advance!
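The label-to-index bookkeeping described above can be sketched in plain Python, independent of graph-tool (function and variable names here are illustrative, not from the poster's code):

```python
def read_edge_list(lines):
    """Map string labels to consecutive integer indices while collecting
    edges, using one dictionary lookup per label as described above."""
    index = {}   # label -> vertex index
    edges = []   # (source index, target index) pairs
    for line in lines:
        src, dst = line.split()
        # setdefault assigns the next free index the first time a
        # label is seen, and returns the stored index afterwards.
        s = index.setdefault(src, len(index))
        d = index.setdefault(dst, len(index))
        edges.append((s, d))
    return index, edges

index, edges = read_edge_list(["a b", "b c", "a c"])
print(index)   # -> {'a': 0, 'b': 1, 'c': 2}
print(edges)   # -> [(0, 1), (1, 2), (0, 2)]
```

With this bookkeeping factored out, the resulting integer pairs can be handed to the graph library in bulk rather than one edge at a time, which is usually where the per-edge Python overhead goes.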