Fast approximate solution of linear equations?

I need to solve a system of N linear equations as an intermediate step in a numerical optimizer. AFAIK the straightforward algorithms for this are O(N^3) (although I once saw a mathematical paper describing a very complicated method that does it in roughly O(N^2.8), with a huge constant). In some cases N is large, on the order of a few thousand.

Is there a good way to get a decent approximate solution to a system of linear equations in less than O(N^3) time?

Edit:

Here are some details, if that helps at all.

  • My matrix is symmetric and not sparse.

  • It is the matrix of second derivatives (the Hessian) from a Newton-Raphson step. I'm trying to optimize something in a roughly 2,000-dimensional space.

3 answers

Iterative methods exist, such as Jacobi, Gauss-Seidel, conjugate gradient (CG), GMRES, and so on.
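
For instance, here is a minimal sketch of calling one of these (GMRES) through SciPy; the matrix A and right-hand side b below are made-up test data, not anything from the question:

```python
import numpy as np
from scipy.sparse.linalg import gmres

rng = np.random.default_rng(0)
n = 500                                  # small stand-in for the ~2000 unknowns in the question
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)              # symmetric positive definite test matrix
b = rng.standard_normal(n)

x, info = gmres(A, b)                    # info == 0 means the solver converged
print(info, np.linalg.norm(A @ x - b))   # residual norm of the approximate solution
```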


For a symmetric matrix, the conjugate gradient method is simple to implement and will beat most other iterative methods (e.g. Gauss-Seidel, SOR). The main loop consists of one matrix-vector multiplication plus a handful of other vector operations.
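
To illustrate, here is a minimal textbook CG sketch in NumPy, assuming A is also positive definite (which a Newton-Raphson Hessian near a minimum is); note that each iteration performs exactly one matrix-vector product:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Solve A x = b for symmetric positive-definite A.
    Each iteration costs one matrix-vector product plus O(N) vector operations."""
    n = len(b)
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol: # stop once the residual is small enough
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

If an acceptable residual is reached after k iterations, the total cost is roughly k dense matrix-vector products, i.e. O(k N^2) rather than O(N^3).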

Once you have that working, you can add preconditioning to improve convergence further.
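
As a concrete sketch (the choice of a plain Jacobi/diagonal preconditioner here is just an illustrative assumption), SciPy's CG solver accepts a preconditioner through its M argument:

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)              # symmetric positive definite test matrix
b = rng.standard_normal(n)

# Jacobi preconditioner: apply 1/diag(A), a cheap approximation of A^{-1}.
inv_diag = 1.0 / np.diag(A)
M = LinearOperator(A.shape, matvec=lambda v: inv_diag * v)

x, info = cg(A, b, M=M)                  # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b))
```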


Yes, if the matrix you build from the coefficients is sparse. For example, there is the "sweep" method (I know it by its Bulgarian name and am not sure of the exact translation; it appears to be the tridiagonal matrix, or Thomas, algorithm), which solves a tridiagonal system in O(N). There are other algorithms that are formally still O(N^3) but perform dramatically better when the matrix satisfies some structural invariant they exploit (sparse, banded, diagonal, triangular, etc.).
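
For reference, here is a sketch of that O(N) tridiagonal solver (the Thomas algorithm); the function and argument names are mine:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(N).
    a: sub-diagonal (length n-1), b: main diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n)."""
    n = len(b)
    cp = np.empty(n - 1)   # modified super-diagonal
    dp = np.empty(n)       # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal.
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```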

If you are tied to a specific method by the structure of your matrix, the only remaining way to speed things up is multithreading.

Try this search.


Source: https://habr.com/ru/post/1338646/

