MHB Singular Values and Eigenvalues

Summary
To prove that a matrix $A$ has eigenvalues equal to its singular values if and only if it is symmetric positive definite, one must establish both directions. Positive definite matrices are typically assumed to be symmetric, and the eigenvalues of a symmetric positive definite matrix are real and positive. As stated, however, the "only if" direction fails: singular values can be zero, so a matrix whose eigenvalues equal its singular values need not have every eigenvalue positive, which positive definiteness requires. The forward direction does hold: when $A$ is symmetric positive definite, the eigenvalues of $A$ are the square roots of the eigenvalues of $A^TA = A^2$, so the singular values of $A$ equal its eigenvalues. The Spectral Theorem, which says every real symmetric matrix is diagonalizable, is the key tool here.
linearishard
Hi, one more question!

How do I prove that A has eigenvalues equal to its singular values iff it is symmetric positive definite? I think I have the positive definite down but I can't figure out the symmetric part. Thanks!
 
It will help if you post the work you already have.
 
All I have is that the singular values of $A$ are the square roots of the eigenvalues of $A^TA$, so if the eigenvalues of $A$ equal its singular values, then $A$ must be positive semidefinite, or else the sign will differ on at least one eigenvalue. I don't know where to start with symmetry, or whether my assumption is correct.
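The relationship you are using can be checked numerically. Here is a small NumPy sketch (the matrix is arbitrary, chosen only for illustration) showing that the singular values of $A$ are the nonnegative square roots of the eigenvalues of $A^TA$:

```python
import numpy as np

# An arbitrary (not symmetric) matrix, just for illustration
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Singular values of A, sorted ascending
svals = np.sort(np.linalg.svd(A, compute_uv=False))

# Square roots of the eigenvalues of A^T A (A^T A is symmetric PSD)
roots = np.sort(np.sqrt(np.linalg.eigvalsh(A.T @ A)))

print(np.allclose(svals, roots))  # True
```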
 
Are you familiar with the Spectral Theorem?
That is, that every real symmetric matrix is diagonalizable?

So if the matrix is symmetric with real entries, it is diagonalizable, which means it has a full set of real eigenvalues and corresponding eigenvectors that span the vector space.
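As a quick numerical illustration of the Spectral Theorem (a sketch, not part of the original argument; the matrix is chosen arbitrarily), a real symmetric matrix factors as $A = QDQ^T$ with $Q$ orthogonal and $D$ real diagonal:

```python
import numpy as np

# An arbitrary real symmetric matrix
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

# eigh is designed for symmetric matrices: it returns real eigenvalues w
# and an orthonormal set of eigenvectors in the columns of Q
w, Q = np.linalg.eigh(A)

print(np.allclose(Q @ np.diag(w) @ Q.T, A))  # True: A = Q D Q^T
print(np.allclose(Q.T @ Q, np.eye(3)))       # True: Q is orthogonal
```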
 
Hello again, linearishard,

Usually, positive definite matrices are assumed to be symmetric (or Hermitian for complex matrices). Does your teacher's definition of positive definite exclude the symmetry assumption? In any case, the problem statement is not true as given: since singular values of a matrix can be zero, having the eigenvalues of $A$ equal to the singular values of $A$ does not force every eigenvalue to be positive (which is what you need to claim positive definiteness).
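A concrete counterexample (a sketch in NumPy; the matrix is an arbitrary rank-one choice): the symmetric matrix below has eigenvalues $2$ and $0$, which equal its singular values, yet it is only positive semidefinite, not positive definite:

```python
import numpy as np

# Symmetric, positive SEMI-definite, with a zero eigenvalue
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

eigvals = np.sort(np.linalg.eigvalsh(A))
svals = np.sort(np.linalg.svd(A, compute_uv=False))

print(np.allclose(eigvals, svals))  # True: eigenvalues equal singular values
print(bool(np.all(eigvals > 0)))    # False: A is not positive definite
```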

The forward conditional is true, however: if $A$ is symmetric positive definite, the eigenvalues of $A$ are positive. The eigenvalues of $A$ are then the square roots of the eigenvalues of $A^2 = A^TA$, so the singular values of $A$ are exactly the eigenvalues of $A$.
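The forward direction can be checked numerically as well. This sketch builds a symmetric positive definite matrix (the construction $MM^T + cI$ is one standard way to get one, used here only for illustration) and compares its eigenvalues with its singular values:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))

# M M^T is symmetric PSD; adding a multiple of I makes it positive definite
A = M @ M.T + 4.0 * np.eye(4)

eigvals = np.sort(np.linalg.eigvalsh(A))
svals = np.sort(np.linalg.svd(A, compute_uv=False))

print(np.allclose(eigvals, svals))  # True for symmetric positive definite A
```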
 