Lapackpp vs Boost BLAS

For starters, I'm new to C++.

I am writing a program for my master's thesis, part of which involves recursive regression.

I would like to solve:

Ax = y 

Computation speed matters in my case, so I would like to know whether computing

 x = (A^TA)^{-1}A^Ty 

with Boost::uBLAS will take less computation time than using Lapackpp (I am on Gentoo).

P.S. I was able to find the Lapackpp project site, but no examples. Can someone provide some examples, or tell me whether LAPACK is faster than Boost::uBLAS?

thanks

+4
4 answers

From a numerical-analysis standpoint, you never want to write code that

  • explicitly inverts a matrix, or
  • forms the normal-equations matrix ( A^TA ) for a regression.

Both do more work and are less accurate (and probably less stable) than alternatives that solve the same problem directly.

Whenever the math shows a matrix inverse, read it as "solve a system of linear equations": rather than inverting and multiplying, use a factorization to solve the system. Both BLAS and LAPACK have routines for this.

Similarly, for regression, call a library routine that computes the regression, or read up on how to do it yourself; the normal-equations method is a textbook construction, not the right way to compute it in practice.
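As a rough illustration of "solve, don't invert": the sketch below uses LAPACK's dgels driver (QR factorization) through the LAPACKE C interface to compute the least-squares solution of Ax = y without ever forming A^TA. The data values and the link flags are made up for the example; the routine itself is standard LAPACK.

    // A minimal sketch, assuming the LAPACKE C interface is installed
    // (on Gentoo, e.g. the lapack/lapacke packages) and linked with
    // something like -llapacke -llapack -lblas (flag names vary).
    // dgels solves min ||Ax - y|| via QR, never forming A^T A.
    #include <lapacke.h>
    #include <cstdio>

    int main() {
        // Overdetermined system: 4 equations, 2 unknowns, row-major storage.
        double A[4 * 2] = {1.0, 1.0,
                           1.0, 2.0,
                           1.0, 3.0,
                           1.0, 4.0};
        double y[4] = {6.0, 5.0, 7.0, 10.0};   // made-up sample data

        // On exit, the first 2 entries of y hold the least-squares solution x.
        lapack_int info = LAPACKE_dgels(LAPACK_ROW_MAJOR, 'N',
                                        4 /*m*/, 2 /*n*/, 1 /*nrhs*/,
                                        A, 2 /*lda*/, y, 1 /*ldb*/);
        if (info != 0) {
            std::fprintf(stderr, "dgels failed, info = %d\n", (int)info);
            return 1;
        }
        std::printf("x = (%g, %g)\n", y[0], y[1]);
        return 0;
    }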

+4

Do you really need to implement this in C++? Would Python/NumPy, for example, be an alternative for you? And for recursive regression (least squares), I recommend looking at Professor Strang's MIT lectures on linear algebra and/or his books.

+3

High-level interface and low-level optimization are two different things.

LAPACK and uBLAS provide a high-level interface and an unoptimized low-level implementation. Hardware-optimized low-level routines (or bindings to them) have to come from somewhere else. Once they are provided, LAPACK and uBLAS can use the optimized low-level routines instead of their own unoptimized implementations.

For example, ATLAS provides optimized low-level routines but only a limited high-level interface (level-3 BLAS, etc.). You can link ATLAS into LAPACK, and LAPACK will then use ATLAS for the low-level work. Think of LAPACK as a senior manager who delegates the technical work to experienced engineers (ATLAS). The same goes for uBLAS: you can link uBLAS against MKL and end up with an optimized C++ library. Check the documentation and search online to figure out how to do this.
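To make the division of labor concrete, here is a rough sketch calling the level-3 BLAS routine dgemm (C = alpha*A*B + beta*C) through the CBLAS interface. Linked against the reference BLAS, the same call runs the generic implementation; linked against ATLAS or MKL, it runs the hardware-optimized kernel. The data and the suggested link flags are only illustrative.

    // A minimal sketch, assuming a CBLAS header is available; link against
    // the reference BLAS (-lcblas -lblas), ATLAS (-lcblas -latlas), or MKL.
    // The source code stays the same; only the low-level kernel changes.
    #include <cblas.h>
    #include <cstdio>

    int main() {
        const int m = 2, k = 3, n = 2;
        double A[m * k] = {1, 2, 3,
                           4, 5, 6};          // 2x3, row-major
        double B[k * n] = { 7,  8,
                            9, 10,
                           11, 12};           // 3x2, row-major
        double C[m * n] = {0, 0, 0, 0};       // 2x2 result

        // C = 1.0 * A * B + 0.0 * C
        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    m, n, k, 1.0, A, k, B, n, 0.0, C, n);

        std::printf("C = [%g %g; %g %g]\n", C[0], C[1], C[2], C[3]);
        return 0;
    }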

+2

Armadillo wraps BLAS and LAPACK in a nice C++ interface and provides the following Matlab-like functions directly relevant to your problem (a short sketch follows the list):

  • solve() to solve a system of linear equations
  • pinv(), the pseudo-inverse (which uses SVD internally)
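A minimal sketch, assuming Armadillo is installed and linked with -larmadillo; solve() returns the least-squares solution for an overdetermined A without forming (A^TA)^{-1}, and the data values are made up:

    #include <armadillo>

    int main() {
        // 4x2 overdetermined system with made-up sample data.
        arma::mat A = {{1.0, 1.0},
                       {1.0, 2.0},
                       {1.0, 3.0},
                       {1.0, 4.0}};
        arma::vec y = {6.0, 5.0, 7.0, 10.0};

        arma::vec x = arma::solve(A, y);   // least-squares solution via LAPACK
        // arma::mat Ap = arma::pinv(A);   // pseudo-inverse (SVD), if needed

        x.print("x =");
        return 0;
    }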
+2


