If you want to write your own solver, Gauss-Seidel is hard to beat in terms of simplicity. The update step is one line, and it parallelizes easily. Successive over-relaxation (SOR) is only a little more complicated and converges much faster.
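To make the one-line update concrete, here is a minimal NumPy sketch for the 2D Poisson problem -&Delta;u = f on a unit-square grid with zero Dirichlet boundaries (the grid setup and function name are my own assumptions, not from the answer). Setting `omega=1` recovers plain Gauss-Seidel; `omega` between 1 and 2 gives SOR.

```python
import numpy as np

def sor_poisson(f, h, omega=1.7, iters=500):
    """Solve -Laplace(u) = f on an (n, n) grid with spacing h and
    zero Dirichlet boundaries. omega=1 is plain Gauss-Seidel;
    1 < omega < 2 is successive over-relaxation (SOR)."""
    u = np.zeros_like(f)
    n = f.shape[0]
    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Gauss-Seidel update for the 5-point Laplacian,
                # blended with the old value by the relaxation factor.
                gs = 0.25 * (u[i - 1, j] + u[i + 1, j] +
                             u[i, j - 1] + u[i, j + 1] + h * h * f[i, j])
                u[i, j] = (1 - omega) * u[i, j] + omega * gs
    return u
```

Because each update reads the latest neighbor values in place, red-black ordering is the usual trick for parallelizing it: update all "red" points (i+j even) simultaneously, then all "black" points.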
Conjugate gradient is also easy to code and should converge much faster than the other iterative methods. It is important to note that you never need to form the full matrix A; you only need to compute matrix-vector products A * x. Beyond that, you can improve the convergence rate by adding a preconditioner such as SSOR (symmetric SOR).
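Here is a sketch of that idea (my own illustration, assuming the same Poisson setup with zero Dirichlet boundaries): `apply_A` computes the product with the 5-point Laplacian directly on the grid, so the matrix is never stored, and `cg` is a textbook unpreconditioned conjugate gradient on top of it.

```python
import numpy as np

def apply_A(u):
    """Matrix-free product with the 5-point Laplacian (scaled by h^2)
    on the interior of a square grid, zero Dirichlet boundaries."""
    Au = 4.0 * u
    Au[1:, :] -= u[:-1, :]   # subtract north neighbor
    Au[:-1, :] -= u[1:, :]   # subtract south neighbor
    Au[:, 1:] -= u[:, :-1]   # subtract west neighbor
    Au[:, :-1] -= u[:, 1:]   # subtract east neighbor
    return Au

def cg(b, iters=200, tol=1e-10):
    """Plain conjugate gradient; the matrix enters only through
    matrix-vector products, never as an explicit array."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rr = np.sum(r * r)
    for _ in range(iters):
        Ap = apply_A(p)
        alpha = rr / np.sum(p * Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = np.sum(r * r)
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x
```

With the h^2 scaling in `apply_A`, solving A u = h^2 f gives the discrete Poisson solution; a preconditioner such as SSOR would slot in where the raw residual `r` is currently used in the search direction.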
Probably the fastest solver that is still sensible to write yourself is a Fourier-based one. Essentially it amounts to taking the FFT of the right-hand side, multiplying each value by a function of its coordinates, and applying the inverse FFT. You can use an FFT library such as FFTW, or roll your own.
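A minimal sketch of the "transform, scale, transform back" recipe, assuming periodic boundaries (the case where a plain FFT diagonalizes the discrete Laplacian; Dirichlet boundaries would use a sine transform instead). NumPy's FFT stands in for FFTW here; the structure is identical.

```python
import numpy as np

def fft_poisson_periodic(f, h):
    """Solve -Laplace(u) = f with periodic boundaries by diagonalizing
    the 5-point Laplacian: FFT the right-hand side, divide each mode
    by the operator's eigenvalue for that mode, inverse FFT."""
    n = f.shape[0]
    fh = np.fft.fft2(f)
    k = np.arange(n)
    # Eigenvalues of the discrete 5-point Laplacian per Fourier mode.
    lam = (4.0 / h**2) * (np.sin(np.pi * k / n)**2)[:, None] \
        + (4.0 / h**2) * (np.sin(np.pi * k / n)**2)[None, :]
    lam[0, 0] = 1.0    # zero mode: solution only defined up to a constant
    uh = fh / lam
    uh[0, 0] = 0.0     # pick the zero-mean solution
    return np.real(np.fft.ifft2(uh))
```

The whole solve is two FFTs plus one elementwise scaling, so it runs in O(n^2 log n) for an n-by-n grid.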
A good reference for all of these is A First Course in the Numerical Analysis of Differential Equations by Arieh Iserles.