Rayleigh Quotient: Finding 2nd Eigenvalue & Vector

Summary
The discussion revolves around using the Rayleigh quotient to find the second eigenvalue and its corresponding eigenvector of a symmetric matrix. Participants clarify that the constraint $x \cdot \xi_1 = 0$ reduces the problem to minimizing the Rayleigh quotient over linear combinations of the remaining eigenvectors. The conversation highlights that Lagrange multipliers are not necessary, as the problem simplifies to minimizing a function involving the second through $n$th eigenvalues. The approach substitutes the expanded vector $x$ and minimizes over the coefficients of the remaining eigenvectors. Ultimately, the goal is to show that the second eigenvalue is indeed the minimum under the given constraints.
dirk_mec1

Homework Statement


Let $A$ be a symmetric $n \times n$ matrix with eigenvalues and orthonormal eigenvectors $(\lambda_k, \xi_k)$, and assume the ordering $\lambda_1 \leq \dots \leq \lambda_n$.

We define the Rayleigh quotient as
$$R(x) = \frac{(Ax)^T x}{x^T x}.$$
Show that the following constrained problem produces the second eigenvalue and its eigenvector:
$$\min \left\{\, R(x) \;\middle|\; x \neq 0,\ x \cdot \xi_1 = 0 \,\right\}.$$

The Attempt at a Solution



In the first part of the exercise I was asked to prove that (without the inner product constraint) the minimization produces the first eigenvalue. The idea was to use Lagrange multipliers, but I don't know how to use them here.

Do I need to use Lagrange multipliers?
 
Not really. The dot product condition tells you that $x$ ranges over linear combinations $\sum_i c_i \xi_i$ with $c_1 = 0$. It's just the same as the first problem with the first eigenvector thrown out.
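A minimal sketch of why the constraint forces $c_1 = 0$, using only the orthonormality assumed in the problem:
$$x \cdot \xi_1 = \Big(\sum_{i=1}^{n} c_i \xi_i\Big) \cdot \xi_1 = \sum_{i=1}^{n} c_i\, (\xi_i \cdot \xi_1) = c_1,$$
so $x \cdot \xi_1 = 0$ is exactly the statement $c_1 = 0$.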
 
Dick said:
Not really. The dot product condition tells you that $x$ ranges over linear combinations $\sum_i c_i \xi_i$ with $c_1 = 0$.
So if I understand correctly, the eigenvectors are orthogonal to each other, right?

and so:

$$x = c_2 \xi_2 + \dots + c_n \xi_n$$
It's just the same as the first problem with the first eigenvector thrown out.
So I just substitute the above expanded x?
 
Yes, and minimize over c2,...,cn.
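Spelled out (a sketch; the orthonormality $\xi_i \cdot \xi_j = \delta_{ij}$ assumed in the problem does all the work), substituting $x = \sum_{i=2}^{n} c_i \xi_i$ gives
$$Ax = \sum_{i=2}^{n} c_i \lambda_i \xi_i, \qquad (Ax)^T x = \sum_{i=2}^{n} \lambda_i c_i^2, \qquad x^T x = \sum_{i=2}^{n} c_i^2.$$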
 
Dick said:
Yes, and minimize over c2,...,cn.
I get this:


$$R(x) = \frac{\sum_{i=2}^{n} c_i^2 \lambda_i}{\sum_{i=2}^{n} c_i^2}$$

but how do I prove that $\lambda_2$ is the minimum? I've tried setting the partial derivatives to zero and failed.
 
"In the first part of the exercise I was asked to prove that (without the inner product constraint) the minimization produces the first eigenvalue. The idea was to use Lagrange multipliers, but I don't know how to use them here." I thought that meant that you proved the first part using Lagrange multipliers. Did you skip that part? Because what you have now looks almost exactly like the first part. If you want to spell out a repetition of the proof of the first part, yes, use Lagrange multipliers.
 
Dick said:
I thought that meant that you proved the first part using Lagrange multipliers. Did you skip that part?
No, I didn't skip it, but there I showed that the minimizer should be an (orthogonal) eigenvector, and upon substitution I get $\min_i \lambda_i$, from which the first eigenvalue results.

Because what you have now looks almost exactly like the first part. If you want to spell out a repetition of the proof of the first part, yes, use Lagrange multipliers.
With the length of the vector $c$ equal to one? So the problem is to minimize
$$\frac{c^T \Lambda c}{c^T c}, \qquad \Lambda = \operatorname{diag}(\lambda_2, \dots, \lambda_n),$$
where $c = (c_2, \dots, c_n)$.
 
Yes, it's the same as the first problem. Just one dimension lower.
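For completeness, a sketch of the bound that answers the earlier question directly, using only the ordering $\lambda_2 \leq \dots \leq \lambda_n$ (no Lagrange multipliers needed):
$$\frac{\sum_{i=2}^{n} c_i^2 \lambda_i}{\sum_{i=2}^{n} c_i^2} \;\geq\; \frac{\lambda_2 \sum_{i=2}^{n} c_i^2}{\sum_{i=2}^{n} c_i^2} \;=\; \lambda_2,$$
with equality precisely when $c_i = 0$ for every $i$ with $\lambda_i > \lambda_2$. In particular, the choice $x = \xi_2$ attains the bound, so the constrained minimum is $\lambda_2$ with minimizer $\xi_2$.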
 
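As a numerical sanity check (a sketch in Python; the random symmetric matrix and the sampling loop are illustrative assumptions, not part of the original problem), one can verify that the Rayleigh quotient restricted to $x \perp \xi_1$ never drops below $\lambda_2$:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: a random 5x5 symmetric test matrix.
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

# np.linalg.eigh returns eigenvalues in ascending order and
# orthonormal eigenvectors as columns.
eigvals, eigvecs = np.linalg.eigh(A)
xi1 = eigvecs[:, 0]

def rayleigh(x):
    # R(x) = (Ax)^T x / x^T x
    return (A @ x) @ x / (x @ x)

# Sample random vectors, project out the xi_1 component so that
# x . xi_1 = 0, and track the smallest quotient seen.
best = np.inf
for _ in range(100_000):
    x = rng.standard_normal(5)
    x = x - (x @ xi1) * xi1  # enforce the constraint
    best = min(best, rayleigh(x))

# best stays >= lambda_2 and approaches it from above.
print(best, eigvals[1])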
