Matrix Analysis (Functional Analysis) Question

In summary, the minimum of tr(L) over m-dimensional subspaces L, for a Hermitian matrix A, equals the sum of the m smallest eigenvalues.
  • #1
Combinatorics

Homework Statement


Let [itex] \lambda_1 , \dots, \lambda_n [/itex] be the eigenvalues of an [itex] n \times n [/itex] self-adjoint matrix A, written in increasing order.
Show that for any [itex] m \leq n [/itex] one has:
[itex] \sum_{r=1}^{m} \lambda_r = \min \{ tr(L) : \dim(L) = m \} [/itex] where [itex] L [/itex] denotes any linear subspace of [itex] \mathbb{C}^n [/itex], and [itex] tr(L) := \sum_{r=1}^{m} Q( \Phi_r ) [/itex] for some orthonormal basis [itex] \{ \Phi_r \} [/itex] of [itex] L [/itex].

(Q is the quadratic form associated with the inner product).

Homework Equations


The Attempt at a Solution


I really have no idea on how to start this.
My first thought is that the trace will always be equal to m, which means I'm probably getting it wrong...

Hope you'll be able to help me

Thanks in advance
 
  • #2
I know perhaps next to nothing about functional analysis, but I'd like to try helping.

I think [itex]\langle \phi, \phi \rangle = 1[/itex], while [itex]Q(\phi) = \langle A\phi, \phi \rangle = \lambda[/itex] when [itex]\phi[/itex] is a unit eigenvector with eigenvalue [itex]\lambda[/itex].

This is just a guess at the idea, I'm not sure where it would go from there.
 
  • #3
Then maybe, since the lambdas are written in increasing order, the minimum trace over subspaces will somehow find you the sum of lower m lambdas.
 
  • #4
I feel the difficulty with this question is mostly notational. Perhaps it would be a good idea to first try to prove it for m=1. The general idea is very similar.

So, we want to prove that

[tex]\sum_{r=1}^m \lambda_r = \min\{tr(L)~\vert~ \dim(L)=m\}[/tex]

A good first step would be to show the existence of a subspace L with [itex]\dim(L)=m[/itex] such that

[tex]\sum_{r=1}^m \lambda_r = tr(L)[/tex]

We know that a Hermitian matrix A always has an orthonormal basis of eigenvectors. Use this basis of eigenvectors to find a suitable L.
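To see this hint in action, here is a hedged numerical check on a hypothetical 3x3 real symmetric matrix I chose for illustration (it is not from the problem itself): its eigenvalues are 1, 3, 5, and summing Q over the eigenvectors for the two smallest eigenvalues gives tr(L) = 1 + 3 = 4.

```python
import math

# Hypothetical 3x3 real symmetric matrix, chosen for this illustration.
# Eigenvalues: 1, 3, 5 with orthonormal eigenvectors
# (1,-1,0)/sqrt(2), (1,1,0)/sqrt(2), (0,0,1).
A = [[2, 1, 0],
     [1, 2, 0],
     [0, 0, 5]]

def Q(v):
    """Quadratic form Q(v) = <A v, v> for a real vector v."""
    Av = [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]
    return sum(Av[i] * v[i] for i in range(3))

s = 1 / math.sqrt(2)
v1 = [s, -s, 0.0]  # eigenvector for lambda_1 = 1
v2 = [s,  s, 0.0]  # eigenvector for lambda_2 = 3

# For L = span{v1, v2} (so m = 2), tr(L) = Q(v1) + Q(v2)
# should equal lambda_1 + lambda_2 = 4.
print(Q(v1) + Q(v2))  # -> 4.0 (up to floating-point rounding)
```

This only verifies the "existence" half for one example; the minimality half is what the rest of the thread works toward.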
 
  • #5
micromass said:
I feel the difficulty with this question is mostly notational. Perhaps it would be a good idea to first try to prove it for m=1. The general idea is very similar.

So, we want to prove that

[tex]\sum_{r=1}^m \lambda_r = \min\{tr(L)~\vert~ \dim(L)=m\}[/tex]

A good first step would be to show the existence of a subspace L with [itex]\dim(L)=m[/itex] such that

[tex]\sum_{r=1}^m \lambda_r = tr(L)[/tex]

We know that a Hermitian matrix A always has an orthonormal basis of eigenvectors. Use this basis of eigenvectors to find a suitable L.

Thanks a lot to you both!

Here is my attempt:
We have a Hermitian matrix A, which implies that there is an orthonormal basis of eigenvectors for the entire space. Since we have [itex]n[/itex] eigenvalues, we have [itex]n[/itex] eigenvectors, and by assumption we can choose them so that they form an orthonormal basis [itex] \{ v_1 ,..., v_n \}[/itex] with [itex]Av_i = \lambda_i v_i [/itex].
Now my guess was that the space L we need to choose is [itex]span\{v_1,...,v_m \}[/itex].

If the quadratic form is the one defined by "algebrat", then we indeed get the needed equality.
But how can I prove this is the minimum? Thanks a lot again! (and I hope you'll be able to help me finish this)
 
  • #6
For the minimum case, perhaps it's better to first prove a special case to see what happens. So take m=1 and n=2.

So take a subspace L of dimension 1. This is generated by one element of norm 1. Call this element v.
Since there is an orthonormal basis of the entire space (call it {x,y}), we can write [itex]v=ax+by[/itex] for scalars a and b.

Now, what is [itex]<Av,v>[/itex]? Use that [itex]v=ax+by[/itex].
 
  • #7
micromass said:
For the minimum case, perhaps it's better to first prove a special case to see what happens. So take m=1 and n=2.

So take a subspace L of dimension 1. This is generated by one element of norm 1. Call this element v.
Since there is an orthonormal basis of the entire space (call it {x,y}), we can write [itex]v=ax+by[/itex] for scalars a and b.

Now, what is [itex]<Av,v>[/itex]? Use that [itex]v=ax+by[/itex].

OK. Let's see: We have eigenvalues [itex] \lambda_1 , \lambda_2 [/itex], with corresponding eigenvectors [itex]x,y [/itex].
We'll get: [itex] <Av,v> = |a|^2 \lambda_1 + |b|^2 \lambda_2 [/itex]. What you mean is obviously that this expression is minimal when [itex] a=1, b=0 [/itex], since [itex] \lambda_1 \leq \lambda_2 [/itex].
But what if we take [itex] b=0 [/itex] and a smaller [itex] a [/itex]?


Thanks a lot again !
 
  • #8
Combinatorics said:
OK. Let's see: We have eigenvalues [itex] \lambda_1 , \lambda_2 [/itex], with corresponding eigenvectors [itex]x,y [/itex].
We'll get: [itex] <Av,v> = |a|^2 \lambda_1 + |b|^2 \lambda_2 [/itex]. What you mean is obviously that this expression is minimal when [itex] a=1, b=0 [/itex], since [itex] \lambda_1 \leq \lambda_2 [/itex].
But what if we take [itex] b=0 [/itex] and a smaller [itex] a [/itex]?


Thanks a lot again !

Notice that v must have norm 1. So this places conditions on a and b.
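To illustrate this hint numerically, here is a hedged sketch of the m=1, n=2 case with hypothetical eigenvalues 1 and 3 (chosen for illustration): parametrizing the unit vector as a = cos(t), b = sin(t) enforces the norm-1 condition automatically, and scanning t shows the minimum of [itex]<Av,v>[/itex] is the smaller eigenvalue.

```python
import math

# Hypothetical eigenvalues for illustration, with lam1 <= lam2.
lam1, lam2 = 1.0, 3.0

# For a unit vector v = a*x + b*y with a = cos(t), b = sin(t),
# <Av, v> = a^2*lam1 + b^2*lam2, and a^2 + b^2 = 1 holds automatically.
values = []
for k in range(1000):
    t = math.pi * k / 999
    a, b = math.cos(t), math.sin(t)
    values.append(a * a * lam1 + b * b * lam2)

# The minimum over all unit vectors is lam1, attained at a = +/-1, b = 0.
print(min(values))  # -> 1.0
```

This is why "a smaller a" is not allowed: shrinking a forces |b| > 0 to keep v a unit vector, which mixes in the larger eigenvalue.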
 
  • #9
Great! It implies that a+b=1!

Then we're done! Thanks a lot!

I'll try to generalize this idea. If I don't succeed, I'll reply here again. Thanks again!
 
  • #10
Combinatorics said:
Great! It implies that a+b=1!

That [itex]|a|^2+|b|^2=1[/itex] (the scalars may be complex).

Anyway, the general case is quite similar.
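For completeness, here is a hedged sketch of that general case (my own filling-in of "quite similar"; it is not spelled out in the thread). Expand each basis vector of [itex]L[/itex] in the eigenbasis [itex]\{v_1,\dots,v_n\}[/itex], say [itex]\Phi_r = \sum_i c_{ri} v_i[/itex]. Then

[tex]tr(L) = \sum_{r=1}^m \langle A\Phi_r, \Phi_r \rangle = \sum_{i=1}^n \lambda_i \mu_i, \qquad \mu_i := \sum_{r=1}^m |c_{ri}|^2.[/tex]

Orthonormality of [itex]\{\Phi_r\}[/itex] gives [itex]\sum_i \mu_i = m[/itex], and [itex]0 \leq \mu_i \leq 1[/itex] because [itex]\mu_i[/itex] is the squared norm of the orthogonal projection of [itex]v_i[/itex] onto [itex]L[/itex]. Minimizing [itex]\sum_i \lambda_i \mu_i[/itex] under these constraints puts all the weight on the [itex]m[/itex] smallest eigenvalues, so [itex]tr(L) \geq \sum_{r=1}^m \lambda_r[/itex], with equality for [itex]L = span\{v_1,\dots,v_m\}[/itex].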
 

What is Matrix Analysis?

Matrix analysis, a finite-dimensional counterpart of functional analysis, is a branch of mathematics that studies linear transformations on vector spaces and their corresponding matrices. It uses techniques from linear algebra, calculus, and topology to analyze the properties and behavior of these transformations.

What are some applications of Matrix Analysis?

Matrix analysis has a wide range of applications in many fields, including physics, engineering, economics, and computer science. It is used to solve systems of linear equations, model dynamic systems, analyze networks and circuits, and perform data analysis and signal processing, among others.

What are the main concepts in Matrix Analysis?

The main concepts in matrix analysis include vector spaces, linear transformations, eigenvalues and eigenvectors, inner products, normed spaces, and projections. These concepts are used to define and study important properties of matrices and their operations, such as rank, determinant, trace, and diagonalization.

What are the differences between Matrix Analysis and Matrix Algebra?

Matrix analysis and matrix algebra are closely related, but they have some key differences. Matrix algebra is concerned with the manipulation, simplification, and computation of matrices, while matrix analysis focuses on the theoretical aspects and applications of matrices, such as studying their properties, structures, and behaviors.

What are some useful resources for learning Matrix Analysis?

There are many resources available for learning matrix analysis, including textbooks, online courses, lecture notes, and video tutorials. Some popular books on the subject include "Linear Algebra Done Right" by Sheldon Axler, "Functional Analysis" by Walter Rudin, and "Matrix Analysis" by Roger Horn and Charles Johnson. Additionally, many universities offer courses on matrix analysis as part of their mathematics or engineering programs.
