Estimating singular values from QR decomposition

  • Context: Graduate
SUMMARY

This discussion focuses on estimating the largest and smallest singular values of a matrix A using its QR decomposition, particularly in cases where A is ill-conditioned. The key finding is that the singular values of A are equal to those of R, the upper triangular factor of the QR decomposition, allowing a more computationally efficient approach. A related identity is ##\left| \prod_i r_{ii} \right| = \prod_i \sigma_i##, where ##r_{ii}## are the diagonal entries of R. The discussion also covers estimating the extreme eigenvalues of ##A^H A## with the power method, although forming ##A^H A## can be unreliable when A is ill-conditioned.

PREREQUISITES
  • Understanding of QR decomposition and its properties
  • Knowledge of singular value decomposition (SVD) and its applications
  • Familiarity with eigenvalues and eigenvectors
  • Experience with numerical stability in matrix computations
NEXT STEPS
  • Research the power method for eigenvalue estimation in ill-conditioned matrices
  • Study the implications of QR decomposition on singular values
  • Explore numerical stability techniques for matrix decompositions
  • Learn about the relationship between SVD and eigenvalues in detail
USEFUL FOR

Mathematicians, data scientists, and engineers working with matrix computations, particularly those dealing with ill-conditioned matrices and requiring efficient numerical methods for singular value estimation.

vibe3
I have a matrix for which I know its QR decomposition: ##A = QR##. I want to estimate the largest and smallest singular values of ##A## (##\sigma_1## and ##\sigma_n##); however, in my application it is too expensive to compute the full SVD of ##A##.

Is it possible to estimate the largest/smallest singular values from the QR decomposition? The only result I've been able to find so far is
$$\left| \prod_i r_{ii} \right| = \prod_i \sigma_i$$
where ##r_{ii}## are the diagonal entries of ##R##. I'm not sure if this implies that the singular values of ##R## are the same as the singular values of ##A##. If that's true, it might be possible and less expensive for my application to compute SVD(R) rather than SVD(A).
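As a quick numerical sanity check of that identity (my own NumPy sketch, not from the original post), one can compare the product of the diagonal of ##R## against the product of the singular values:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

Q, R = np.linalg.qr(A)
sv = np.linalg.svd(A, compute_uv=False)

lhs = abs(np.prod(np.diag(R)))  # |prod_i r_ii| = |det R| = |det A|
rhs = np.prod(sv)               # prod_i sigma_i  = |det A|
```

Both sides equal ##|\det A|##, which is why the identity holds; it constrains only the product of the singular values, not the individual ##\sigma_i##.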
 
The singular values of ##A## are the nonzero eigenvalues of ##A^H A## or ##A A^H##. The largest eigenvalue of a matrix can usually be computed using the power method with just a few iterations. The smallest eigenvalue of a full-rank matrix is the largest eigenvalue of its inverse, so again the power method can work; if you already have the QR decomposition, the inverse is cheap to compute. If ##A## is not full rank ... I'm not sure what clever methods there are.

jason
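A rough NumPy sketch of this power-method approach (my addition, not from the original posts; note that, as corrected later in the thread, the singular value is the square root of the corresponding eigenvalue of ##A^H A##):

```python
import numpy as np

def largest_singular_value(A, iters=500):
    """Estimate sigma_1(A) by power iteration on A^H A."""
    x = np.random.default_rng(1).standard_normal(A.shape[1])
    for _ in range(iters):
        # One step of power iteration: x <- A^H A x, applied as two
        # matrix-vector products so A^H A is never formed explicitly.
        x = A.conj().T @ (A @ x)
        x /= np.linalg.norm(x)
    # Rayleigh quotient approximates the largest eigenvalue of A^H A;
    # sigma_1 is its square root.
    return np.sqrt(x @ (A.conj().T @ (A @ x)))

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))

est = largest_singular_value(A)
exact = np.linalg.svd(A, compute_uv=False)[0]
```

For the smallest singular value of a full-rank ##A## one would instead iterate with ##A^{-1}##, which with the QR factors amounts to cheap triangular solves with ##R## (and products with ##Q^T##) rather than an explicit inverse.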
 
The problem is that my matrix ##A## is very ill-conditioned, so computing eigenvalues from ##A^T A## is unreliable (hence the reason I'm computing the QR decomposition instead). I'm looking for a numerically stable way to estimate the singular values from the QR decomposition for ill-conditioned matrices.

Suppose we have ##A = QR = U \Sigma V^T##, and suppose the SVD of ##R## is ##R = U_r \Sigma_r V_r^T##.

Because ##Q## is orthogonal, ##A = (Q U_r) \Sigma_r V_r^T##, which implies ##U = Q U_r## and ##V = V_r##. This would then mean that ##\Sigma = \Sigma_r##, and so the singular values of ##R## and ##A## are the same. Is this logic correct?
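This is easy to verify numerically (my own NumPy sketch, not from the original posts; for a rectangular ##A## the reduced QR gives a ##Q## with orthonormal columns, which is enough for the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

Q, R = np.linalg.qr(A)  # reduced QR: Q is 6x4 with orthonormal columns, R is 4x4

sv_A = np.linalg.svd(A, compute_uv=False)
sv_R = np.linalg.svd(R, compute_uv=False)
# sv_A and sv_R agree to machine precision
```

Since ##R## is much smaller than ##A## in the rectangular case (and triangular in any case), computing SVD(R) is indeed the cheaper route.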
 
jasonRF said:
The singular values of ##A## are the nonzero eigenvalues of ##A^H A## or ##A A^H##. The largest eigenvalue of a matrix can usually be computed using the power method with just a few iterations. The smallest eigenvalue of a full-rank matrix is the largest eigenvalue of its inverse, so again the power method can work; if you already have the QR decomposition, the inverse is cheap to compute. If ##A## is not full rank ... I'm not sure what clever methods there are.

jason
Did you mean to write that the singular values of ##A## are the square roots of the eigenvalues of ##A^H A##?
 
vibe3 said:
The problem is that my matrix ##A## is very ill-conditioned, so computing eigenvalues from ##A^T A## is unreliable (hence the reason I'm computing the QR decomposition instead). I'm looking for a numerically stable way to estimate the singular values from the QR decomposition for ill-conditioned matrices.

Suppose we have ##A = QR = U \Sigma V^T##, and suppose the SVD of ##R## is ##R = U_r \Sigma_r V_r^T##.

Because ##Q## is orthogonal, ##A = (Q U_r) \Sigma_r V_r^T##, which implies ##U = Q U_r## and ##V = V_r##. This would then mean that ##\Sigma = \Sigma_r##, and so the singular values of ##R## and ##A## are the same. Is this logic correct?

Yes, this is mathematically sound; ##A## and ##R## have the same singular values.
 
rs1n said:
Did you mean to write that the singular values of ##A## are the square roots of the eigenvalues of ##A^H A##?
oops. yep.
 
