# Rayleigh quotient

#### dirk_mec1

1. The problem statement, all variables and given/known data
Let A be a symmetric n x n matrix with eigenvalues and orthonormal eigenvectors $$(\lambda_k, \xi_k)$$. Assume the ordering $$\lambda_1 \leq \ldots \leq \lambda_n$$.

We define the Rayleigh quotient as:

$$R(x) = \frac{(Ax)^T x}{x^T x}$$

Show that the following constrained problem produces the second eigenvalue and its eigenvector:

$$\min \left\{ R(x) \;\middle|\; x \neq 0,\ x \cdot \xi_1 = 0 \right\}$$

3. The attempt at a solution

In the first part of the exercise I was asked to prove that (without that inner product being zero) the minimization produces the first eigenvalue. The idea was to use Lagrange multipliers, but I don't know how to use it here.

Do I need to use Lagrange multipliers?


#### Dick

Homework Helper
Not really. The dot product condition tells you that x is ranging over linear combinations c_i*xi_i with c_1=0. It's just the same as the first problem with the first eigenvector thrown out.
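As a numerical sanity check of this point (a sketch with NumPy, not part of the original exercise; the random matrix and sampling loop are illustrative assumptions), one can project random vectors onto the orthogonal complement of $$\xi_1$$ and observe that the quotient never drops below $$\lambda_2$$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                    # symmetric test matrix

lam, xi = np.linalg.eigh(A)          # eigenvalues ascending, orthonormal columns

def R(x):
    """Rayleigh quotient (Ax)^T x / (x^T x)."""
    return (A @ x) @ x / (x @ x)

# Sample many x satisfying the constraint x . xi_1 = 0
# by projecting out the first eigenvector.
best = np.inf
for _ in range(10000):
    x = rng.standard_normal(n)
    x -= (x @ xi[:, 0]) * xi[:, 0]   # enforce the constraint
    best = min(best, R(x))

# xi_2 attains lambda_2, and no constrained sample does better.
assert abs(R(xi[:, 1]) - lam[1]) < 1e-10
assert best >= lam[1] - 1e-10
```

The projection step is exactly the "first eigenvector thrown out" remark: every constrained x lies in the span of $$\xi_2, \ldots, \xi_n$$.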

#### dirk_mec1

> Not really. The dot product condition tells you that x is ranging over linear combinations c_i*xi_i with c_1=0.
So if I understand correctly the eigenvectors are orthogonal to each other, right?

and so:

$$x= c_2 \cdot \xi_2+...+c_n \cdot \xi_n$$

> It's just the same as the first problem with the first eigenvector thrown out.
So I just substitute the above expanded x?


#### Dick

Homework Helper
Yes, and minimize over c2,...,cn.

#### dirk_mec1

> Yes, and minimize over c2,...,cn.
I get this:

$$\frac{\sum_{i=2}^n c_i^2\lambda_i}{\sum_{i=2}^n c_i^2}$$

but how do I prove that $$\lambda_2$$ is the minimum? I've tried setting the partial derivatives to zero and failed.

#### Dick

Homework Helper
> In the first part of the exercise I was asked to prove that (without that inner product being zero) the minimization produces the first eigenvalue. The idea was to use Lagrange multipliers but I don't know how to use it here.
I thought that meant that you proved the first part using Lagrange multipliers. Did you skip that part? Because what you have now looks almost exactly like the first part. If you want to spell out a repetition of the proof of the first part, then yes, use Lagrange multipliers.
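For reference, the direct bound being pointed at here can be sketched without Lagrange multipliers: since $$\lambda_i \geq \lambda_2$$ for every $$i \geq 2$$,

$$R(x) = \frac{\sum_{i=2}^n \lambda_i c_i^2}{\sum_{i=2}^n c_i^2} \geq \frac{\lambda_2 \sum_{i=2}^n c_i^2}{\sum_{i=2}^n c_i^2} = \lambda_2,$$

with equality when $$c_2 = 1$$ and $$c_3 = \ldots = c_n = 0$$, i.e. $$x = \xi_2$$.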

#### dirk_mec1

> I thought that meant that you proved the first part using Lagrange multipliers. Did you skip that part?
No, I didn't skip it, but there I showed that the minimizer must be an (orthogonal) eigenvector, and upon substitution I get $$\min_i \lambda_i$$, from which the first eigenvalue results.

> Because what you have now looks almost exactly like the first part. If you want to spell out a repetition of the proof of the first part, then yes, use Lagrange multipliers.
With the constraint that the vector $$c$$ has length one? So the problem is to minimize:

$$\frac{\sum_{i=2}^n \lambda_i c_i^2}{c^T c}$$ over the components of the vector $$c$$.

#### Dick

Homework Helper
Yes, it's the same as the first problem. Just one dimension lower.
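A quick numerical illustration of this dimension drop (a sketch with NumPy; the random matrix and coordinates are illustrative assumptions, not from the thread): in the coordinates $$x = \sum_{i \geq 2} c_i \xi_i$$, the Rayleigh quotient of x equals the reduced quotient in c.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                    # symmetric test matrix
lam, xi = np.linalg.eigh(A)          # eigenvalues ascending

c = rng.standard_normal(n - 1)       # coordinates w.r.t. xi_2, ..., xi_n
x = xi[:, 1:] @ c                    # reconstruct x, automatically x . xi_1 = 0

R_x = (A @ x) @ x / (x @ x)                    # quotient of x (dimension n)
R_c = (lam[1:] * c**2).sum() / (c**2).sum()    # reduced quotient (dimension n-1)

assert abs(R_x - R_c) < 1e-10        # the two quotients agree
assert R_c >= lam[1] - 1e-12         # and are bounded below by lambda_2
```

By orthonormality of the $$\xi_i$$, the numerator collapses to $$\sum_i \lambda_i c_i^2$$ and the denominator to $$\sum_i c_i^2$$, which is the first problem one dimension lower.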
