External memory data structure to replace a vector of maps

I am doing an iterative calculation on a flow network, during which I need to record how much each source contributes to the flow on each edge. The flow on any one edge is determined by, on average, 2% of the sources, so I define vector< map<int, double> > flow , where flow[e][s] = f means that the flow on edge e due to source s is f . At each iteration, every f in flow is updated.
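For concreteness, a minimal sketch of the in-memory layout described above (the edge count and the update rule are placeholders, not taken from the question):

```cpp
#include <cstddef>
#include <map>
#include <vector>

int main() {
    const std::size_t num_edges = 100000;               // hypothetical edge count
    std::vector< std::map<int, double> > flow(num_edges);

    // flow[e][s] = f : contribution f of source s to the flow on edge e
    flow[42][7] = 0.125;

    // Each iteration rewrites every recorded contribution.
    for (std::size_t e = 0; e < flow.size(); ++e)
        for (std::map<int, double>::iterator it = flow[e].begin(); it != flow[e].end(); ++it)
            it->second *= 0.5;                           // placeholder update rule
}
```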

Peak memory usage of the program approaches 4 GB. This works on (32-bit) Linux and OS X, but it crashes on Windows (which apparently imposes a 2 GB limit per process).

How can I implement a disk-backed data structure with the interface of vector< map<int, double> > (or otherwise get around this problem)?

+6
2 answers

I have used STXXL for similar scenarios. It may be worth a look.
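For illustration, a minimal sketch of a disk-backed flow table using stxxl::vector, assuming the per-edge maps are flattened into plain (edge, source, value) records, since STXXL containers want trivially copyable elements; the record layout, names, and update rule are illustrative, not from the question:

```cpp
#include <stxxl/vector>
#include <cstddef>

struct FlowRecord {
    unsigned edge;    // edge index e
    int      source;  // source id s
    double   value;   // contribution f
};

int main() {
    // Disk-backed vector: STXXL keeps only a small block cache in RAM and
    // pages the rest out to the disks configured in its .stxxl config file.
    typedef stxxl::VECTOR_GENERATOR<FlowRecord>::result flow_vector;
    flow_vector flow;

    FlowRecord r = { 42u, 7, 0.125 };
    flow.push_back(r);

    // Sequential passes are cheap; random access pays for block I/O,
    // so it helps to keep the records sorted by edge.
    for (flow_vector::size_type i = 0; i < flow.size(); ++i)
        flow[i].value *= 0.5;    // placeholder per-iteration update
    return 0;
}
```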

+2

If the vector of maps is what consumes all the memory, do you absolutely need a double for the data field? Switching to float could help.
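Illustratively, the change amounts to the following (keeping in mind that std::map's per-node bookkeeping, rather than the payload, is often the larger cost, so the saving may be modest):

```cpp
#include <map>
#include <vector>

int main() {
    // Same layout as before, but with float values: halves the payload per entry.
    std::vector< std::map<int, float> > flow(100000);   // hypothetical edge count
    flow[42][7] = 0.125f;
}
```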

Otherwise, you could use a memory-mapped file, although making it cross-platform will take a bit of work, especially with the nested data structure you have for your mappings.
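As one way to read that suggestion, here is a minimal POSIX sketch that memory-maps a flat file of (edge, source, value) records; the file name, record layout, and update rule are assumptions, and on Windows the equivalents are CreateFileMapping/MapViewOfFile:

```cpp
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstddef>

struct FlowRecord {
    unsigned edge;    // edge index e
    int      source;  // source id s
    double   value;   // contribution f
};

int main() {
    const char* path = "flow.bin";                    // hypothetical data file
    int fd = open(path, O_RDWR);
    if (fd < 0) return 1;

    struct stat st;
    if (fstat(fd, &st) != 0) { close(fd); return 1; }

    // Map the whole file; the OS pages records in and out on demand. In a
    // 32-bit process a multi-gigabyte file would have to be mapped a window
    // at a time rather than all at once.
    void* base = mmap(0, st.st_size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (base == MAP_FAILED) { close(fd); return 1; }

    FlowRecord* records = static_cast<FlowRecord*>(base);
    std::size_t count = st.st_size / sizeof(FlowRecord);

    for (std::size_t i = 0; i < count; ++i)
        records[i].value *= 0.5;                      // placeholder update

    munmap(base, st.st_size);
    close(fd);
    return 0;
}
```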

0

Source: https://habr.com/ru/post/918383/
