# Homework Help: Matrix Analysis (Functional Analysis) Question

1. Jun 1, 2012

### Combinatorics

1. The problem statement, all variables and given/known data
Let $\lambda_1, \dots, \lambda_n$ be the eigenvalues of an $n \times n$ self-adjoint matrix $A$, written in increasing order.
Show that for any $m \leq n$ one has:
$\sum_{r=1}^{m} \lambda_r = \min \{ \operatorname{tr}(L) : \dim(L) = m \}$, where $L$ denotes any linear subspace of $\mathbb{C}^n$, and $\operatorname{tr}(L) := \sum_{r=1}^{m} Q(\Phi_r)$ for some orthonormal basis $\{ \Phi_r \}$ of $L$.

(Q is the quadratic form associated with the inner product).

2. Relevant equations
3. The attempt at a solution
I really have no idea how to start this.
On the one hand, I think the trace will always be equal to $m$ (since $\langle \Phi_r, \Phi_r \rangle = 1$ for each basis vector), which means I'm probably misreading the definition of $Q$...

Hope you'll be able to help me

2. Jun 1, 2012

### algebrat

I know perhaps next to nothing about functional analysis, but I'd like to try helping.

I think $\langle \Phi, \Phi \rangle = 1$ for a unit vector, while $Q(\Phi) = \langle A\Phi, \Phi \rangle$, which equals $\lambda$ when $\Phi$ is an eigenvector with eigenvalue $\lambda$.

This is just a guess at the idea, I'm not sure where it would go from there.

3. Jun 1, 2012

### algebrat

Then maybe, since the lambdas are written in increasing order, the minimum trace over subspaces will somehow find you the sum of lower m lambdas.

4. Jun 1, 2012

### micromass

I feel the difficulty with this question is mostly notational. Perhaps it would be a good idea to first try to prove it for $m=1$. The general idea is very similar.

So, we want to prove that

$$\sum_{r=1}^m \lambda_r = \min\{\operatorname{tr}(L)~\vert~ \dim(L)=m\}$$

A good first step would be to show the existence of a subspace L with $\dim(L)=m$ such that

$$\sum_{r=1}^m \lambda_r = \operatorname{tr}(L)$$

We know that a Hermitian matrix $A$ always has an orthonormal basis of eigenvectors. Use this basis of eigenvectors to find a suitable $L$.
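(As a numerical sanity check, not part of the proof: this hint can be illustrated with NumPy, whose `eigh` routine is designed for Hermitian matrices and returns the eigenvalues in increasing order. The matrix below is an arbitrary made-up example.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random self-adjoint (Hermitian) matrix A = B + B^*.
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T

# eigh is meant for Hermitian matrices: the eigenvalues come back
# in increasing order, and the columns of V form an orthonormal
# basis of eigenvectors.
lam, V = np.linalg.eigh(A)

# Check orthonormality: V^* V = I.
assert np.allclose(V.conj().T @ V, np.eye(n))

# Check A v_i = lambda_i v_i for each column.
assert np.allclose(A @ V, V @ np.diag(lam))
```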

5. Jun 2, 2012

### Combinatorics

Thanks a lot to you both!

Here is my attempt:
We have a Hermitian matrix $A$, which implies that there is an orthonormal basis of eigenvectors for the entire space. Since we have $n$ eigenvalues, we must have $n$ eigenvectors, and by assumption we can choose them so that they form an orthonormal basis $\{ v_1, \dots, v_n \}$ with $Av_i = \lambda_i v_i$.
Now my guess is that the space $L$ we need to choose is $\operatorname{span}\{v_1, \dots, v_m\}$.

If the quadratic form is assumed to be the one defined by "algebrat" , then we'll indeed get the needed equality.
But how can I prove this is the minimum?

Thanks a lot again ! (and hope you'll be able to help me finish this)
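(A quick numerical check of this guess, assuming algebrat's form $Q(\Phi) = \langle A\Phi, \Phi \rangle$; the matrix and the dimensions $n, m$ below are arbitrary examples.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary self-adjoint matrix and subspace dimension.
n, m = 5, 3
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T

lam, V = np.linalg.eigh(A)   # eigenvalues in increasing order

# Take L = span{v_1, ..., v_m}: its first m eigenvectors are an
# orthonormal basis of L, so
#   tr(L) = sum_r Q(v_r) = sum_r <A v_r, v_r> = sum_r lambda_r.
tr_L = sum((V[:, r].conj() @ A @ V[:, r]).real for r in range(m))

assert np.allclose(tr_L, lam[:m].sum())
```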

6. Jun 2, 2012

### micromass

For the minimum case, perhaps it's better to first prove a special case to see what happens. So take m=1 and n=2.

So take a subspace L of dimension 1. It is generated by one element of norm 1; call this element v.
Since there is an orthonormal basis of the entire space (call it {x,y}), we can write $v=ax+by$ for scalars a and b.

Now, what is $\langle Av,v\rangle$? Use that $v=ax+by$.
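(For a concrete instance of this $m=1$, $n=2$ computation, here is a small numerical sketch; the eigenvalues and the coefficients $a, b$ are made-up values chosen with $|a|^2 + |b|^2 = 1$.)

```python
import numpy as np

# Toy 2x2 self-adjoint example: eigenvalues 1 and 3, with the
# standard basis vectors as eigenvectors x and y.
lam1, lam2 = 1.0, 3.0
A = np.diag([lam1, lam2]).astype(complex)
x = np.array([1.0, 0.0], dtype=complex)
y = np.array([0.0, 1.0], dtype=complex)

a, b = 0.6, 0.8j          # |a|^2 + |b|^2 = 1, so v has norm 1
v = a * x + b * y

# By orthonormality of {x, y}, <Av, v> expands to
# |a|^2 lam1 + |b|^2 lam2.
q = (v.conj() @ A @ v).real
assert np.allclose(q, abs(a)**2 * lam1 + abs(b)**2 * lam2)
```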

7. Jun 2, 2012

### Combinatorics

OK. Let's see: we have eigenvalues $\lambda_1, \lambda_2$ with corresponding eigenvectors $x, y$.
We get $\langle Av,v\rangle = |a|^2 \lambda_1 + |b|^2 \lambda_2$. What you mean is obviously that this expression is minimal when $a=1, b=0$, since $\lambda_1 \leq \lambda_2$.
But what if we take $b=0$ and a smaller $a$?

Thanks a lot again !

8. Jun 2, 2012

### micromass

Notice that v must have norm 1. So this places conditions on a and b.

9. Jun 2, 2012

### Combinatorics

Great ! It implies that a+b=1 !

Then we're done!

Thanks a lot !

I'll try to generalize this idea. If I don't succeed, I'll reply here again.

Thanks again !

10. Jun 2, 2012

### micromass

That $|a|^2+|b|^2=1$.

Anyway, the general case is quite similar.
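(One way to see the general case numerically, as a sanity check rather than a proof: sample random $m$-dimensional subspaces via a QR factorization and verify that their traces never drop below $\lambda_1 + \dots + \lambda_m$. The matrix, dimensions, and sample count below are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(2)

n, m = 6, 2
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T
lam = np.linalg.eigvalsh(A)        # increasing order
target = lam[:m].sum()

# Sample random m-dimensional subspaces: the QR factorization of
# a random n x m matrix gives an orthonormal basis (columns of Q).
for _ in range(200):
    Q, _ = np.linalg.qr(rng.standard_normal((n, m))
                        + 1j * rng.standard_normal((n, m)))
    tr_L = np.trace(Q.conj().T @ A @ Q).real
    # Every subspace trace should dominate the sum of the m
    # smallest eigenvalues (up to floating-point slack).
    assert tr_L >= target - 1e-9
```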