State vectors, projection matrices

In summary, the thread works through a homework problem: given two vectors u1 and u2 in C^2 whose associated projectors sum to the identity, show that u1 and u2 are orthogonal and normalized. The discussion tries inserting the identity operator into inner products and multiplying the projector equation by the vectors, and ends with the suggestion that proving the linear independence of u1 and u2 is the key step.
  • #1
cscott

Homework Statement

How do I prove that if,

[tex]|\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I[/tex],

where 'I' is the identity matrix, then [itex]u_1[/itex] and [itex]u_2[/itex] are orthogonal and normalized?

Can anybody get me started?
 
  • #2
You can't, with the information you've stated.
 
  • #3
Here's the exact question from the problem set, just in case:

Show that, given two arbitrary vectors u1, u2 ∈ C^2 such that the associated projectors satisfy Pu1 + Pu2 = 1, the vectors u1, u2 are orthogonal and normalized.

Is it really impossible?
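
For concreteness, and consistent with the equation in post #1, the projector associated with each vector is taken here to be the outer product (this convention is an assumption, since the quoted problem statement does not spell it out):

[tex]P_{u_i} = |\vec{u_i}\rangle\langle\vec{u_i}|, \qquad i = 1, 2.[/tex]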
 
  • #4
Oh bah. I read the problem backwards. :frown:

Well, since you're given an equation with I in it, it will probably be useful to throw I into every relevant equation you can think of.
 
  • #5
That's the thing, I can't think of any relevant equations besides the two inner product equations for a normalized vector and two orthogonal vectors.
 
  • #6
What if I multiplied each term on the right by |u2>? Would I be correct in saying that, in order for this new equation to hold, u1 and u2 have to be orthogonal and u2 has to be normalized? Then I can do it again with u1.
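
Writing out what this proposed multiplication gives (a sketch, assuming the standard inner product on C^2 and the projector convention above):

[tex]\left(|\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}|\right)|\vec{u_2}\rangle = I|\vec{u_2}\rangle \;\Longrightarrow\; \langle\vec{u_1}|\vec{u_2}\rangle\,|\vec{u_1}\rangle + \langle\vec{u_2}|\vec{u_2}\rangle\,|\vec{u_2}\rangle = |\vec{u_2}\rangle.[/tex]

Reading orthogonality and normalization directly off this equation needs one extra ingredient, which is the point raised later in post #14.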
 
  • #7
cscott said:
That's the thing, I can't think of any relevant equations besides the two inner product equations for a normalized vector and two orthogonal vectors.
Well, then throw an I in them! What do you get?
 
  • #8
I think I would get:

<u1|u1>I = I
<u2|u2>I = I
<u1|u2>I = 0
<u2|u1>I = 0

Does this make sense? Is it incorrect to do it like I said in post #6?

If I substituted the expression in my OP for 'I' in these equations, and they still hold true, does it prove what I need?
 
  • #9
cscott said:
I think I would get:

<u1|u1>I = I
<u2|u2>I = I
<u1|u2>I = 0
<u2|u1>I = 0

Does this make sense? Is it incorrect to do it like I said in post #6?

If I substituted the expression in my OP for 'I' in these equations, and they still hold true, does it prove what I need?
You put the I in the wrong place. It's an operator; it should operate on things!

I suppose that you can put an I there, intending there to be scalar multiplication, but that doesn't do anything useful here.
 
  • #10
Ooh

<u1|I|u1> = I
<u2|I|u2> = I
<u1|I|u2> = 0
<u2|I|u1> = 0

Like that then?
 
  • #11
Yes -- those are the statements you want to prove!

Incidentally, this is a trick you want to take to heart; it's a very useful thing. (And not just in linear algebra; in analysis, you might use a partition of unity in similar ways)
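
As an illustration of the trick (a sketch, not a quote from the thread): inserting the given decomposition of I expands any vector in terms of u1 and u2,

[tex]|\vec{v}\rangle = I|\vec{v}\rangle = |\vec{u_1}\rangle\langle\vec{u_1}|\vec{v}\rangle + |\vec{u_2}\rangle\langle\vec{u_2}|\vec{v}\rangle \qquad \text{for any } |\vec{v}\rangle \in \mathbb{C}^2.[/tex]

(Strictly speaking, the right-hand sides of the statements in post #10 should be the scalars 1 and 0 rather than the operator I, since an expression like <u1|I|u1> is a number; this wrinkle surfaces again in post #16.)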
 
  • #12
I'll try to work out the algebra.

...I think I'm slllllowly getting my head around this quantum stuff.

Thanks for your help.
 
  • #13
Hrm...

This is surely what you want to do, but finishing off the proof is less straightforward than I thought it was.
 
  • #14
cscott said:
What if I multiplied each term on the right by |u2>? Would I be correct in saying that, in order for this new equation to hold, u1 and u2 have to be orthogonal and u2 has to be normalized? Then I can do it again with u1.
Shame on me for not fully thinking this through! :redface: You can do this... if you can first prove that |u1> and |u2> are linearly independent. (do you see why?)
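
To spell out why linear independence is needed (a sketch): the equation from post #6 can be rearranged as

[tex]\langle\vec{u_1}|\vec{u_2}\rangle\,|\vec{u_1}\rangle + \left(\langle\vec{u_2}|\vec{u_2}\rangle - 1\right)|\vec{u_2}\rangle = 0.[/tex]

If |u1> and |u2> are linearly independent, both coefficients must vanish, giving <u1|u2> = 0 and <u2|u2> = 1; if they are dependent, the two terms could cancel each other without either coefficient being zero.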
 
  • #15
Hurkyl said:
Shame on me for not fully thinking this through! :redface: You can do this... if you can first prove that |u1> and |u2> are linearly independent. (do you see why?)

I see why |u1> and |u2> would have to be linearly independent. I'm not sure how I'd prove that, though. Wouldn't I have to do it based solely on the fact that adding the projection matrices gives the identity matrix?
 
  • #16
When I substitute [tex]|\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I[/tex]

into the LHS of
<u1|I|u1> = I
<u1|I|u2> = 0
I get
1 = I
<u1|u2> + <u1|u2> = 0
respectively

This doesn't seem right...? It does imply that <u1|u2> = 0, though.
 
  • #17
cscott said:
I see why |u1> and |u2> would have to be linearly independent. I'm not sure how I'd prove that, though. Wouldn't I have to do it based solely on the fact that adding the projection matrices gives the identity matrix?
Assume they are dependent, see what happens. Rank might be involved.


cscott said:
When I substitute [tex]|\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I[/tex]

into the LHS of
<u1|I|u1> = I
<u1|I|u2> = 0
I get
1 = I
<u1|u2> + <u1|u2> = 0
respectively

This doesn't seem right...? It does imply that <u1|u2> = 0, though.
I don't see how you got that. And remember that <u1|I|u2> = 0 is what you're trying to prove.
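
One way to carry out the suggested rank argument (a sketch, under the same conventions as above): suppose the vectors were linearly dependent, say |u2> = c|u1> for some scalar c. Then

[tex]|\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = \left(1 + |c|^2\right)|\vec{u_1}\rangle\langle\vec{u_1}|,[/tex]

which has rank at most 1, while the identity on C^2 has rank 2, a contradiction. (The case where one of the vectors is zero fails for the same reason.) So the vectors are linearly independent, and the argument sketched after post #6 then completes the proof.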
 

Related to State vectors, projection matrices

1. What is a state vector?

A state vector is a mathematical representation of the state of a system at a specific point in time; it collects all the variables needed to describe that state. In quantum mechanics it is a vector in a complex vector space, such as C^2 in this thread, that encodes everything that can be predicted about the system.

2. How is a state vector used in science?

A state vector is used in science to model and analyze the behavior of systems in fields such as physics, engineering, and economics. It allows scientists to make predictions and understand the dynamics of a system.

3. What is a projection matrix?

A projection matrix is a square matrix P satisfying P^2 = P. Applying it maps any vector onto a fixed subspace (the image of P) and leaves vectors already in that subspace unchanged. In data analysis, projection matrices are used to project high-dimensional data onto a lower-dimensional subspace for easier analysis and visualization.
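
As a simple concrete example (not from the thread), the matrix

[tex]P = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}[/tex]

projects any vector in the plane onto the x-axis. It satisfies P^2 = P, the defining property of a projection, and in the notation of this thread it is the outer product |e1><e1| for the basis vector e1 = (1, 0).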

4. How is a projection matrix applied in real-world scenarios?

A projection matrix is applied in various real-world scenarios, such as computer graphics, data compression, and machine learning. It is used to reduce the dimensionality of data and make it easier to process and analyze.

5. What is the relationship between state vectors and projection matrices?

State vectors and projection matrices are related in that a projection matrix maps a state vector onto a subspace, isolating the component of the state relevant to a particular question. The identity decomposition discussed in this thread, Pu1 + Pu2 = I, is an example: the two projectors split any state vector in C^2 into its components along u1 and u2.
