What is the time complexity of scipy.linalg.solve (LAPACK gesv) on a large matrix?

If I use scipy.linalg.solve (which, I believe, calls LAPACK's gesv) on a problem with ~12,000 unknowns (i.e. a dense, asymmetric matrix of ~12,000 x 12,000) on my workstation, I get a good answer in 10-15 minutes.
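For reference, a minimal sketch of the call in question, with random data standing in for the real system (the size and data here are placeholders, not the actual problem):

```python
import numpy as np
from scipy.linalg import solve

# Small stand-in for the real system: a dense, asymmetric N x N matrix.
rng = np.random.default_rng(0)
n = 500
a = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# scipy.linalg.solve wraps LAPACK's *gesv: LU factorization with
# partial pivoting, followed by forward/back substitution.
x = solve(a, b)

# Sanity check: the solution should reproduce the right-hand side.
print(np.allclose(a @ x, b))
```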

Just to explore the limits of the possible (note that I do not say "useful"), I doubled the resolution of my underlying problem, which means I now need to solve for ~50,000 unknowns. While this would technically still run on my workstation, once it is paging tens of gigabytes to swap it seemed wiser to use hardware with ample RAM, so I launched the job on a quadruple-extra-large AWS EC2 instance... where it has now been grinding away for the last 14 hours (hey, spot instances are cheap) with no way to tell how far along it is.

Unfortunately, I don't know the time complexity of the solver (my google-fu failed me here). If it is O(N^2), I would expect the run to finish in about 4 hours; if it is O(N^3), then perhaps 16 hours. That is interpreting N as the number of unknowns, which roughly quadrupled; the number of elements in the matrix grew 16-fold!
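The back-of-the-envelope extrapolation above can be written out explicitly (the baseline time is the midpoint of the observed 10-15 minutes; the hour figures are rough, matching the ~4 h and ~16 h guesses in spirit):

```python
base_n = 12_000        # unknowns in the run that completed
base_min = 12.5        # observed wall time in minutes (midpoint of 10-15)
target_n = 50_000      # unknowns after doubling the resolution

ratio = target_n / base_n          # ~4.17x more unknowns

# Scale the observed time under each hypothesized complexity in N,
# the number of unknowns (N^2 elements in the matrix).
quadratic_h = base_min * ratio**2 / 60   # if O(N^2): roughly 3.6 hours
cubic_h     = base_min * ratio**3 / 60   # if O(N^3): roughly 15 hours

print(f"O(N^2): {quadratic_h:.1f} h, O(N^3): {cubic_h:.1f} h")
```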

Any tips that would help me determine whether this run has a chance of completing within my lifetime would be gratefully received!

Additional Information:

Sparse solvers are of little interest here (my matrix is dense, and in any case scipy does not handle more than 2**31 non-zero elements, even on 64-bit builds).

I am using the distribution-packaged scipy on Debian Squeeze (the workstation) and Ubuntu 12.04 (EC2). Obviously, both are 64-bit.

1 answer

The time complexity of DGESV for an NxN matrix is O(N^3). See Table 3.13: http://www.netlib.org/lapack/lug/node71.html
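Since GESV is an LU factorization plus two triangular solves, the dominant cost is about (2/3)N^3 floating-point operations, which turns the O(N^3) bound into a rough wall-time estimate. A sketch, where the sustained flop rate is a made-up placeholder (measure your machine's actual rate with a small benchmark rather than trusting this number):

```python
n = 50_000

# LU factorization with partial pivoting costs about (2/3) * N^3 flops;
# the forward/back substitution adds only O(N^2), negligible at this size.
flops = (2.0 / 3.0) * n**3

# Hypothetical sustained rate for a multithreaded BLAS; real hardware
# varies by an order of magnitude either way.
gflops_per_s = 10.0

seconds = flops / (gflops_per_s * 1e9)
print(f"{flops:.2e} flops, ~{seconds / 3600:.1f} h at {gflops_per_s:.0f} GFLOP/s")
```

The same arithmetic also explains why doubling the resolution hurts so much: quadrupling N multiplies the flop count by 64.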


Source: https://habr.com/ru/post/1437020/
