Proving the Smallest Eigenvalue Property of Symmetric Positive Definite Matrices

SUMMARY

The discussion focuses on proving that for any n×n symmetric positive definite matrix A, the minimum of x^TAx/x^Tx over nonzero vectors x in R^n equals the smallest eigenvalue of A. Participants suggest first establishing the property for diagonal matrices and emphasize expanding x in an orthogonal basis of eigenvectors. The diagonalization theorem for symmetric matrices is highlighted as the crucial step in extending the result to the general case.

PREREQUISITES
  • Understanding of symmetric positive definite matrices
  • Familiarity with eigenvalues and eigenvectors
  • Knowledge of matrix diagonalization
  • Basic concepts of quadratic forms
NEXT STEPS
  • Study the diagonalization property of symmetric matrices
  • Learn about eigenvalue expansion techniques
  • Explore the implications of Rayleigh's quotient
  • Investigate the properties of quadratic forms in linear algebra
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, matrix theory, and optimization, will benefit from this discussion.

Wildcat

Homework Statement



Let A be any n×n symmetric positive definite matrix. Show that (x ≠ 0, x ∈ R^n)
x^TAx/x^Tx = the smallest eigenvalue of A.



Homework Equations





The Attempt at a Solution



Our hint was to first prove this for a diagonal matrix.
For x^TAx/x^Tx I get (L1x1² + L2x2² + ... + Lnxn²)/(x1² + x2² + ... + xn²) (I'm using L for lambda, the diagonal entries)
I know this is ≥ 1 since (x1² + x2² + ... + xn²)/(x1² + x2² + ... + xn²) = 1 ≤ (L1x1² + L2x2² + ... + Lnxn²)/(x1² + x2² + ... + xn²)
For the eigenvalues of A, if I choose x1, x2, ... to be 1 then xn = (L1 + L2 + ... + L(n-1))/(-Ln)
I'm stuck here, help!
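As a quick numerical sanity check on the diagonal case (using a made-up 3×3 example, not from the thread), the quotient for a diagonal matrix is a weighted average of the diagonal entries, so it always lands between the smallest and largest eigenvalue:

```python
import numpy as np

# Hypothetical positive definite diagonal matrix with eigenvalues 2, 5, 7.
D = np.diag([2.0, 5.0, 7.0])
x = np.array([1.0, -2.0, 3.0])  # an arbitrary nonzero vector

# Rayleigh quotient (x^T D x) / (x^T x)
rq = (x @ D @ x) / (x @ x)

# The quotient is a weighted average of the diagonal entries,
# so it must lie between the smallest and largest eigenvalues.
assert 2.0 <= rq <= 7.0
print(rq)  # ≈ 6.071
```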
 
How about noting that a symmetric matrix has an orthogonal set of eigenvectors? Have you tried expanding x in terms of the eigenvector basis?

Also, since it's positive definite, you know all its eigenvalues are > 0.
 
lanedance said:
How about noting that a symmetric matrix has an orthogonal set of eigenvectors? Have you tried expanding x in terms of the eigenvector basis?

Also, since it's positive definite, you know all its eigenvalues are > 0.

I'm not sure how to expand x in terms of the eigenvector basis. Would that mean that each of the xn² terms equals 1?
 
So if the u_i are the eigenvectors, you should be able to write the vector x as
x = a_1 u_1 + ... + a_i u_i + ... + a_n u_n

where the a_i are scalars. To find them, examine the dot product of x (defined as above) with a single eigenvector, noting that the eigenvectors are orthogonal.
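A small NumPy sketch of this expansion (with a hypothetical 2×2 symmetric matrix): `eigh` returns orthonormal eigenvectors as the columns of U, so each coefficient a_i is just the dot product u_i · x, and summing a_i u_i recovers x:

```python
import numpy as np

# Hypothetical symmetric positive definite matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
eigvals, U = np.linalg.eigh(A)   # columns of U are orthonormal eigenvectors

x = np.array([2.0, -1.0])
a = U.T @ x                      # a_i = u_i . x, by orthonormality
x_rebuilt = U @ a                # sum_i a_i u_i

assert np.allclose(x_rebuilt, x)
```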
 
Something is wrong with your statement:

x^TAx/x^Tx = the smallest eigenvalue of A.

If x is an eigenvector to the eigenvalue a then x^TAx/x^Tx=a. So, you probably want to show that:

x^TAx/x^Tx ≥ a_min
 
Wildcat said:

Homework Statement



Let A be any n×n symmetric positive definite matrix. Show that (x ≠ 0, x ∈ R^n)
x^TAx/x^Tx = the smallest eigenvalue of A.

However, I'm not quite convinced this is true... take x = (1,0,0,...) and A a diagonal matrix with diagonal entries (2,1,1,...); then
x^TAx/x^Tx = 2 while the smallest eigenvalue is 1?
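This counterexample is easy to check numerically (a 3×3 instance of the diagonal matrix described above): with the "min" omitted, the claim fails.

```python
import numpy as np

# lanedance's counterexample: x = (1, 0, 0) with A = diag(2, 1, 1).
A = np.diag([2.0, 1.0, 1.0])
x = np.array([1.0, 0.0, 0.0])

rq = (x @ A @ x) / (x @ x)   # = 2, but the smallest eigenvalue is 1
assert rq == 2.0
assert np.linalg.eigvalsh(A)[0] == 1.0
```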
 
lanedance said:
However, I'm not quite convinced this is true... take x = (1,0,0,...) and A a diagonal matrix with diagonal entries (2,1,1,...); then
x^TAx/x^Tx = 2 while the smallest eigenvalue is 1?

I left an important piece of info out of my original problem. It should say:
Show that (x ≠ 0, x ∈ R^n)
min x^TAx/x^Tx = the smallest eigenvalue of A.
The min is in front of x^TAx/x^Tx, with (x ≠ 0, x ∈ R^n) under it.
 
OK, that makes more sense. Try the eigenvector expansion of a generic vector to show arkajad's inequality and minimise, or use the eigenvector with the smallest eigenvalue to demonstrate equality.
 
Or use the diagonalization (http://en.wikipedia.org/wiki/Symmetric_matrices) of symmetric matrices and check that your expression has the same value for the diagonal form of your matrix. The rest will then be very easy.
 
arkajad said:
Or use the diagonalization (http://en.wikipedia.org/wiki/Symmetric_matrices) of symmetric matrices and check that your expression has the same value for the diagonal form of your matrix. The rest will then be very easy.

I like very easy, but I'm afraid I'm not on your level :( We have a theorem to use after we show that the original statement is true for a diagonal matrix. The theorem states that since A is symmetric, we can find an orthogonal matrix P and a diagonal matrix D such that P^TAP = D. Is this the diagonalization property? I think once I show the original statement is true for a diagonal matrix, I can apply the theorem to show it's true for any n×n symmetric matrix.

I'm going to ask this because I'm just not sure what the min in front of the expression means. I've looked through my notes and can't find anything; it's probably something I should already know, which is why it's not in my notes. This is part of my problem. I'm still not sure about the eigenvalue expansion of a generic vector??
 
Yeah, so knowing that P^T = P^(-1), you can use P P^T = I:

(x^TAx)/(x^Tx) = (x^T I A I x)/(x^T I x) = (x^T P P^T A P P^T x)/(x^T P P^T x)

Now consider the vector u = P^T x; then with P^T A P = D you get
(x^TAx)/(x^Tx) = (u^TDu)/(u^Tu)
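This change of variables can be verified numerically (with a hypothetical 3×3 symmetric positive definite matrix and a random test vector): the Rayleigh quotient is unchanged when x^TAx/x^Tx is rewritten as u^TDu/u^Tu with u = P^T x.

```python
import numpy as np

# Hypothetical symmetric positive definite matrix.
A = np.array([[5.0, 2.0, 0.0],
              [2.0, 6.0, 1.0],
              [0.0, 1.0, 4.0]])
eigvals, P = np.linalg.eigh(A)   # P orthogonal, with P^T A P = D
D = np.diag(eigvals)

rng = np.random.default_rng(0)
x = rng.standard_normal(3)       # a random nonzero vector
u = P.T @ x                      # the change of variables u = P^T x

lhs = (x @ A @ x) / (x @ x)
rhs = (u @ D @ u) / (u @ u)
assert np.isclose(lhs, rhs)      # the quotient is invariant
```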
 
And then

u^TDu = λ₁u₁² + ... + λₙuₙ² ≥ λ_min(u₁² + ... + uₙ²) = λ_min u^Tu

so (u^TDu)/(u^Tu) ≥ λ_min, with equality when u is the eigenvector belonging to λ_min.
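Putting the pieces together, a NumPy sketch (with a hypothetical 2×2 example) checks both halves of the argument: random Rayleigh quotients never fall below the smallest eigenvalue, and the corresponding eigenvector attains it, so the minimum is exactly λ_min.

```python
import numpy as np

# Hypothetical symmetric positive definite matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
eigvals, U = np.linalg.eigh(A)
lam_min = eigvals[0]             # eigh sorts eigenvalues in ascending order
u_min = U[:, 0]                  # eigenvector for the smallest eigenvalue

# The inequality: no nonzero x gives a quotient below lam_min.
rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.standard_normal(2)
    assert (x @ A @ x) / (x @ x) >= lam_min - 1e-12

# Equality: the quotient at u_min is exactly lam_min.
assert np.isclose((u_min @ A @ u_min) / (u_min @ u_min), lam_min)
```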
 
