Low-rank matrix approximation

Suppose I have a 3x3 matrix A of rank 3. I want to create the matrix of rank 2 that is closest to A in the $\ell_2$ (spectral) / Frobenius norm. Call this matrix F.

This is easy to achieve with the SVD: if $A = U S V^H$ is the SVD of A, then $F = U \hat{S} V^H$, where $\hat{S}$ equals $S$ with the last singular value zeroed.
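For concreteness, here is a minimal MATLAB sketch of that truncated-SVD construction (my own illustration, not part of the original question):

 % Best rank-2 approximation of a 3x3 matrix (Eckart-Young).
 A = rand(3);
 [U,S,V] = svd(A);
 S(3,3) = 0;    % singular values come sorted, so S(3,3) is the smallest
 F = U*S*V';    % closest rank-2 matrix to A in the 2-norm and Frobenius norm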

The question is: is there a less computationally intensive way of creating F than computing the SVD?

Thanks.

+4
3 answers

If you KNOW that the matrix has rank 3, then exactly 3 Householder transformations suffice to reduce an nxm matrix to a 3xm matrix. You can then easily convert this into an eigenvalue problem: compute the characteristic polynomial. For example, consider this matrix (I will use MATLAB to do the work):

 >> A = rand(10,3);
 >> B = A*rand(3,6);

Clearly B will have rank 3, and if you do not trust me, rank confirms it:

 >> rank(B)
 ans =
      3
 >> size(B)
 ans =
     10     6

So, B is a 10x6 matrix with rank 3. SVD confirms this fact.

 >> svd(B)
 ans =
          6.95958965358531
          1.05904552889497
         0.505730235622534
      7.37626877572817e-16
      2.36322066654691e-16
      1.06396598411356e-16

I feel too lazy to write the Householder transformations myself. (I have code for it, but...) So I will use qr to do the job:

 >> [Q,R] = qr(B,0);
 >> C = Q(:,1:3)'*B
 C =
      -2.0815      -1.7098      -3.7897      -1.6186      -3.6038      -3.0809
       0.0000      0.91329      0.78347      0.44597    -0.072369      0.54196
       0.0000       0.0000      -0.2285     -0.43721     -0.85949     -0.41072
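Incidentally, a single Householder reflector is easy to construct directly; this is the textbook construction (my own aside, not code from the answer):

 % Reflector that zeros every entry of x below the first one:
 x = randn(5,1);
 v = x;
 v(1) = v(1) + sign(x(1))*norm(x);  % choose the sign to avoid cancellation
 v = v/norm(v);
 H = eye(5) - 2*(v*v');             % H*x = [-sign(x(1))*norm(x); 0; 0; 0; 0]

Applying H is just a rank-1 update, which is why the cost stays low.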

The product Q(:,1:3)'*B shows what we would see after three Householder transformations. See that it is upper triangular, as you would expect. Next we compute the characteristic polynomial. I will do it symbolically here using my own tools, but the computation is just a bit of algebra.

 >> sympoly lambda
 >> P = det(C*C' - lambda*eye(3))
 P =
     13.8942 - 66.9996*lambda + 49.8132*lambda^2 - lambda^3
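If you do not have symbolic tools handy, the coefficients of that cubic follow from standard trace and determinant identities for a 3x3 matrix. A sketch (my own aside, reusing the variables above):

 % det(M - lambda*I) = -lambda^3 + c2*lambda^2 - c1*lambda + c0 for 3x3 M
 M  = C*C';
 c2 = trace(M);                     % sum of the eigenvalues
 c1 = (trace(M)^2 - trace(M*M))/2;  % sum of the pairwise products
 c0 = det(M);                       % product of the eigenvalues
 p  = [-1, c2, -c1, c0];            % coefficient vector, ready for roots()

roots(p) would return the same three values as the symbolic route.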

What are the roots of P? They are the eigenvalues of C*C':

 >> r = roots(P)
 r =
       48.436
       1.1216
      0.25576

We know that these eigenvalues must be the squares of the singular values, so a good test is to check whether we have recovered the singular values found by svd. Widening the display format, we see that it worked nicely:

 >> sqrt(roots(P))
 ans =
      6.95958965358531
      1.05904552889497
     0.505730235622533

Math can be fun when it works. What can we do with this information? Once I know a specific eigenvalue, I can compute the corresponding eigenvector. Essentially, we need to solve the 3x3 homogeneous linear system

 (C*C' - eye(3)*r(3)) * X = 0 

Again, I will be lazy here and find the solution without writing any code. Gaussian elimination would do it, though:

 >> V = null(C*C' - eye(3)*r(3))
 V =
     -0.171504758161731
     -0.389921448437349
      0.904736084157367
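As a cheap alternative to elimination in the 3x3 case: since C*C' - eye(3)*r(3) has rank 2, its null vector is orthogonal to the row space, so the cross product of two independent rows produces it directly (my own aside, not the answerer's step):

 M = C*C' - eye(3)*r(3);
 v = cross(M(1,:)', M(2,:)');   % M*v = 0 whenever rows 1 and 2 span the row space
 v = v/norm(v);                 % agrees with null(M) up to sign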

So we have V, an eigenvector of C*C'. We can convince ourselves of what it buys us with the following call to svd:

 >> svd(C - V*(V'*C))
 ans =
        6.9595896535853
       1.05904552889497
   2.88098729108798e-16

So by subtracting off the component of C in the direction of V, we get a rank-2 matrix.

Similarly, we can transform V back into the original problem space and use it to transform B, our original matrix, with a rank-1 update.

 >> U = Q(:,1:3)*V;
 >> D = B - U*(U'*B);

What is the rank of D?

 >> svd(D)
 ans =
      6.95958965358531
      1.05904552889497
   2.62044567948618e-15
   3.18063391331806e-16
   2.16520878207897e-16
   1.56387805987859e-16
 >> rank(D)
 ans =
      2

As you can see, although I leaned on svd, qr, rank, etc. to check the math along the way, in the end the actual computation required was relatively trivial. I only needed:

  • 3 Householder transformations. (Save them for later use. Note that a Householder transformation is itself just a rank-1 update to your matrix.)
  • Calculation of the characteristic polynomial.
  • Calculation of its smallest root, using your favorite method for a cubic polynomial.
  • Recovery of the corresponding eigenvector. Gaussian elimination is enough here.
  • Transformation of that eigenvector back into the original space, using the Householder transformations we applied at the start.
  • A rank-1 update to the matrix.

All of these computational steps will be fast and efficient for a matrix of ANY size, as long as we know the true rank is 3. It is hardly even worth an article on the subject. The sketch below condenses the whole recipe.
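Here is the pipeline packaged as one function (my own packaging of the steps above: qr stands in for the three Householder transformations, and the function name is hypothetical):

 function D = bestRank2OfRank3(B)
 % Best rank-2 approximation of B, assuming B truly has rank 3.
 [Q,~] = qr(B,0);                             % stand-in for 3 Householder reflections
 C  = Q(:,1:3)'*B;                            % 3-by-m reduced matrix
 M  = C*C';
 c2 = trace(M);
 c1 = (trace(M)^2 - trace(M*M))/2;
 c0 = det(M);
 r  = sort(real(roots([-1, c2, -c1, c0])));   % eigenvalues of M, ascending
 V  = null(M - eye(3)*r(1));                  % eigenvector of the smallest eigenvalue
 U  = Q(:,1:3)*V;                             % back into the original column space
 D  = B - U*(U'*B);                           % rank-1 downdate kills the smallest sigma
 end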

EDIT:

Since the question has been revised so that the matrix itself is only 3x3, the computation is even more trivial. I will leave the first part of this post in place, however, since it describes a fully viable solution for a general matrix of any size with rank 3.

If your goal is to kill off the last singular value of a 3x3 matrix, the svd of a 3x3 already seems quite efficient. There will also be some loss of accuracy when you compute that smallest singular value by indirect means. For example, compare the smallest singular value computed by svd with the one obtained from the eigenvalues below. You can see a small error, since forming A'*A costs some precision. The extent of that loss probably depends on the condition number of A.

 >> A = rand(3);
 >> sqrt(eig(A'*A))
 ans =
     0.0138948003095069
      0.080275195586312
       1.50987693453097
 >> svd(A)
 ans =
       1.50987693453097
     0.0802751955863124
     0.0138948003095054

However, if you really want to do the work yourself, you can (a sketch follows the list):

  • Compute the characteristic polynomial directly from the 3x3 matrix A'*A.
  • Compute the roots of that cubic polynomial. Choose the smallest root.
  • Generate the corresponding eigenvector.
  • Apply a rank-1 update to A, killing off the part of A that lies in the direction of that eigenvector.
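Those four steps fit in a few lines, here using eig on A'*A to bundle the polynomial and eigenvector work (my own illustration, not the only way to do it):

 A = rand(3);
 [V,E] = eig(A'*A);       % eigenvalues of A'*A are the squared singular values
 [~,k] = min(diag(E));    % pick the smallest eigenvalue
 v = V(:,k);              % right singular vector for the smallest sigma
 F = A - (A*v)*v';        % rank-1 update: the best rank-2 approximation of A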

Is this simpler or computationally cheaper than a plain call to svd followed by a rank-1 update? I am not at all convinced it is worth the effort for a 3x3 matrix. The svd of a 3x3 computes very quickly.

+5

You have probably already thought of this, but if A is normal, the SVD can be computed from the eigenvalue decomposition. This reduces to solving the characteristic polynomial, and since that polynomial has degree 3 for a rank-3 matrix, the roots can be found with the well-known cubic formulas.
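For instance, with a symmetric (hence normal) A the construction looks like this; general normal matrices need the same idea with complex eigenvalues (a sketch of my own, not from the answer):

 A = randn(3); A = A + A';   % symmetric, so its eigendecomposition gives the SVD up to signs
 [Q,L] = eig(A);
 lam = diag(L);
 [~,k] = min(abs(lam));      % singular values of a normal matrix are abs(eigenvalues)
 lam(k) = 0;                 % drop the eigenvalue of smallest magnitude
 F = Q*diag(lam)*Q';         % best rank-2 approximation of A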

It also seems that the SVD should, in general, be reducible to solving a cubic for a rank-3 matrix, but I do not remember ever reading anything about it. A quick google turned up this piece of code, which claims to compute a rank-3 SVD that way. Unfortunately, there is no accompanying paper. If you use that code, trying it on a non-positive-definite matrix would be a good test case.

Edit: on a second reading, it seems the author there is also using the eigendecomposition. Most likely it is not robust for non-PD matrices, but I would love to be proven wrong here.

0

You can use power iteration to find the singular vectors corresponding to the largest singular value. After that, you can use a modified power iteration to find the vectors for the second-largest singular value: after each iteration, subtract off the projection of the partial solution vector onto the vectors already found for the largest singular value.

This would be a poor approach for finding all of the singular vectors, but it should work fine for the first two.
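A sketch of that scheme (my own illustration; a fixed iteration count stands in for a proper convergence test):

 A = rand(10,6);

 % Power iteration on A'*A for the top right singular vector.
 v1 = randn(6,1); v1 = v1/norm(v1);
 for it = 1:200
     v1 = A'*(A*v1);
     v1 = v1/norm(v1);
 end
 s1 = norm(A*v1);             % largest singular value

 % Same iteration, deflating against v1 each pass, for the second one.
 v2 = randn(6,1);
 for it = 1:200
     v2 = v2 - v1*(v1'*v2);   % subtract the projection onto v1
     v2 = A'*(A*v2);
     v2 = v2/norm(v2);
 end
 s2 = norm(A*v2);             % second-largest singular value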

0
