Estimating singular values from QR decomposition

  • Context: Graduate
  • Thread starter: vibe3
  • Tags: Decomposition

Discussion Overview

The discussion revolves around estimating the largest and smallest singular values of a matrix A using its QR decomposition, particularly in the context of ill-conditioned matrices. Participants explore methods to achieve this estimation without computing the full singular value decomposition (SVD) of A, focusing on numerical stability and computational efficiency.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant suggests that the product of the diagonal entries of R from the QR decomposition relates to the singular values of A, but questions whether the singular values of R are the same as those of A.
  • Another participant explains that the singular values of A are the nonzero eigenvalues of A^H A or A A^H, and proposes using the power method to compute the largest and smallest eigenvalues, noting that this could be efficient if QR decomposition is available.
  • A participant expresses concern about the reliability of computing eigenvalues from A^T A due to A being ill-conditioned, seeking a stable method to estimate singular values from the QR decomposition instead.
  • There is a discussion on the relationship between the SVD of R and A, with one participant asserting that if A = QR and R has its own SVD, then the singular values of A and R are the same, which another participant confirms as mathematically sound.
  • Clarification is made regarding the definition of singular values in relation to eigenvalues, with a participant correcting an earlier statement: the singular values of A are the square roots of the eigenvalues of A^H A, not the eigenvalues themselves.

Areas of Agreement / Disagreement

Participants generally agree on the mathematical relationships discussed, particularly regarding the equivalence of singular values between A and R. However, there remains uncertainty about the best methods for estimating singular values in the context of ill-conditioned matrices, indicating a lack of consensus on practical approaches.

Contextual Notes

The discussion highlights the limitations of relying on eigenvalue computations for ill-conditioned matrices and the potential need for alternative methods to ensure numerical stability when estimating singular values.

vibe3
I have a matrix for which I know its QR decomposition, ##A = QR##. I want to estimate the largest and smallest singular values of ##A## (##\sigma_1## and ##\sigma_n##); however, in my application it is too expensive to compute the full SVD of ##A##.

Is it possible to estimate the largest/smallest singular values from the QR decomposition? The only result I've been able to find so far is
$$\left| \prod_i r_{ii} \right| = \prod_i \sigma_i$$
where ##r_{ii}## are the diagonal entries of ##R##. I'm not sure if this implies that the singular values of ##R## are the same as the singular values of ##A##. If that's true, it might be possible and less expensive for my application to compute the SVD of ##R## rather than the SVD of ##A##.
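For a quick numerical sanity check of this identity, here is a minimal NumPy sketch (illustrative only; the small random test matrix is just a stand-in):

```python
import numpy as np

# Check |prod(r_ii)| = prod(sigma_i) on a small random square matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

Q, R = np.linalg.qr(A)                      # A = Q R, Q orthogonal, R upper triangular
sigma = np.linalg.svd(A, compute_uv=False)  # singular values of A

lhs = abs(np.prod(np.diag(R)))              # |product of diagonal entries of R| = |det(A)|
rhs = np.prod(sigma)                        # product of singular values = |det(A)|
print(lhs, rhs)                             # the two agree up to rounding error
```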
 
The singular values of ##A## are the nonzero eigenvalues of ##A^H A## or ##A A^H##. The largest eigenvalue of a matrix can usually be computed using the power method with just a few iterations. The smallest eigenvalue of a full-rank matrix is the largest eigenvalue of the inverse, so again the power method can work; if you have the QR, the inverse is cheap to compute. If not full rank ... I'm not sure what clever methods there are.
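A minimal sketch of this approach, assuming ##A## is square and full rank and ##A = QR## is already in hand (illustrative code, not from the thread; the function name and iteration count are arbitrary):

```python
import numpy as np
from scipy.linalg import solve_triangular

def extreme_singular_values(R, iters=100, seed=0):
    """Estimate the largest/smallest singular values of A from A = QR.

    Power iteration on R^T R (= A^T A, since Q is orthogonal) gives the
    largest singular value; inverse iteration, done with two cheap
    triangular solves per step, gives the smallest. Assumes R is square
    and nonsingular.
    """
    n = R.shape[0]
    rng = np.random.default_rng(seed)

    # Largest singular value: power method on R^T R.
    x = rng.standard_normal(n)
    for _ in range(iters):
        x = R.T @ (R @ x)
        x /= np.linalg.norm(x)
    sigma_max = np.linalg.norm(R @ x)

    # Smallest singular value: power method on (R^T R)^{-1},
    # i.e. solve R^T z = y, then R w = z, each a triangular solve.
    y = rng.standard_normal(n)
    for _ in range(iters):
        z = solve_triangular(R.T, y, lower=True)  # R^T is lower triangular
        y = solve_triangular(R, z)                # R is upper triangular
        y /= np.linalg.norm(y)
    sigma_min = np.linalg.norm(R @ y)

    return sigma_max, sigma_min
```

Note that this never forms ##R^T R## explicitly; each iteration only applies ##R## and ##R^T##, or their triangular solves.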

jason
 
The problem is that my matrix ##A## is very ill-conditioned, so computing eigenvalues from ##A^T A## is unreliable (which is why I'm computing the QR decomposition instead). I'm looking for a numerically stable way to estimate the singular values from the QR decomposition for ill-conditioned matrices.

Suppose we have ##A = QR = U \Sigma V^T##, and suppose the SVD of ##R## is ##R = U_r \Sigma_r V_r^T##.

Because ##Q## is orthogonal, ##A = Q U_r \Sigma_r V_r^T##, which implies ##U = Q U_r## and ##V = V_r##. This would then mean that ##\Sigma = \Sigma_r##, and so the singular values of ##R## and ##A## are the same. Is this correct logic?
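As a quick numerical check of this argument, a small NumPy sketch (illustrative; the random square matrix is just a placeholder):

```python
import numpy as np

# Singular values of R from A = QR should match those of A,
# since multiplying by the orthogonal Q does not change them.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))

Q, R = np.linalg.qr(A)
sv_A = np.linalg.svd(A, compute_uv=False)
sv_R = np.linalg.svd(R, compute_uv=False)
print(np.allclose(sv_A, sv_R))  # True
```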
 
jasonRF said:
The singular values of ##A## are the nonzero eigenvalues of ##A^H A## or ##A A^H##. The largest eigenvalue of a matrix can usually be computed using the power method with just a few iterations. The smallest eigenvalue of a full-rank matrix is the largest eigenvalue of the inverse, so again the power method can work; if you have the QR, the inverse is cheap to compute. If not full rank ... I'm not sure what clever methods there are.

jason
Did you mean to write that the singular values of ##A## are the square roots of the eigenvalues of ##A^H A##?
 
vibe3 said:
The problem is that my matrix ##A## is very ill-conditioned, so computing eigenvalues from ##A^T A## is unreliable (which is why I'm computing the QR decomposition instead). I'm looking for a numerically stable way to estimate the singular values from the QR decomposition for ill-conditioned matrices.

Suppose we have ##A = QR = U \Sigma V^T##, and suppose the SVD of ##R## is ##R = U_r \Sigma_r V_r^T##.

Because ##Q## is orthogonal, ##A = Q U_r \Sigma_r V_r^T##, which implies ##U = Q U_r## and ##V = V_r##. This would then mean that ##\Sigma = \Sigma_r##, and so the singular values of ##R## and ##A## are the same. Is this correct logic?

Yes, this is mathematically sound: ##Q U_r## is itself orthogonal, so ##Q U_r \Sigma_r V_r^T## is a valid SVD of ##A##, and since singular values are unique, ##A## and ##R## have the same singular values.
 
rs1n said:
Did you mean to write that the singular values of ##A## are the square roots of the eigenvalues of ##A^H A##?
oops. yep.
 
