Are \( \vec{u_1} \) and \( \vec{u_2} \) Orthogonal and Normalized?

cscott

Homework Statement



How do I prove that if,

\( |\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I, \)

where \( I \) is the identity matrix, then \( \vec{u_1} \) and \( \vec{u_2} \) are orthogonal and normalized?

Can anybody get me started?
 
You can't, with the information you've stated.
 
Here's the exact question from the problem set, just in case:

Show that, given two arbitrary vectors \( u_1, u_2 \in \mathbb{C}^2 \) such that the associated projectors satisfy \( P_{u_1} + P_{u_2} = 1 \), the vectors \( u_1, u_2 \) are orthogonal and normalized.
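
In bra-ket notation, writing \( P_{u_i} = |u_i\rangle\langle u_i| \), the claim as I read it is:

\[
|u_1\rangle\langle u_1| + |u_2\rangle\langle u_2| = I
\quad\Longrightarrow\quad
\langle u_i|u_j\rangle = \delta_{ij}, \qquad i, j \in \{1, 2\}.
\]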

Is it really impossible?
 
Oh bah. I read the problem backwards. :frown:

Well, since you're given an equation with I in it, it will probably be useful to throw I into every relevant equation you can think of.
 
That's the thing, I can't think of any relevant equations besides the two inner product equations for a normalized vector and two orthogonal vectors.
 
What if I multiplied each term by |u2> on the right? Would I be correct in saying that, in order for the new equation to hold, u1 and u2 have to be orthogonal and u2 has to be normalized? Then I could do it again with u1.
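
For what it's worth, here is a sketch of where that leads, assuming for the moment that |u1> and |u2> are linearly independent (an assumption that turns out to matter):

\[
\bigl(|u_1\rangle\langle u_1| + |u_2\rangle\langle u_2|\bigr)\,|u_2\rangle = I\,|u_2\rangle
\quad\Longrightarrow\quad
\langle u_1|u_2\rangle\,|u_1\rangle + \langle u_2|u_2\rangle\,|u_2\rangle = |u_2\rangle .
\]

If the two vectors are linearly independent, comparing coefficients gives \( \langle u_1|u_2\rangle = 0 \) and \( \langle u_2|u_2\rangle = 1 \); repeating with \( |u_1\rangle \) on the right gives \( \langle u_2|u_1\rangle = 0 \) and \( \langle u_1|u_1\rangle = 1 \).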
 
cscott said:
That's the thing, I can't think of any relevant equations besides the two inner product equations for a normalized vector and two orthogonal vectors.
Well, then throw an I in them! What do you get?
 
I think I would get:

<u1|u1>I = I
<u2|u2>I = I
<u1|u2>I = 0
<u2|u1>I = 0

Does this make sense? Is it incorrect to do it like I said in post #6?

If I substituted the expression from my OP for 'I' into these equations and they still held true, would that prove what I need?
 
cscott said:
I think I would get:

<u1|u1>I = I
<u2|u2>I = I
<u1|u2>I = 0
<u2|u1>I = 0

Does this make sense? Is it incorrect to do it like I said in post #6?

If I substituted the expression from my OP for 'I' into these equations and they still held true, would that prove what I need?
You put the I in the wrong place. It's an operator; it should operate on things!

I suppose you could put an I there and read it as scalar multiplication, but that doesn't do anything useful here.
 
  • #10
Ooh

<u1|I|u1> = I
<u2|I|u2> = I
<u1|I|u2> = 0
<u2|I|u1> = 0

Like that then?
 
  • #11
Yes -- those are the statements you want to prove!

Incidentally, this is a trick you want to take to heart; it's a very useful thing. (And not just in linear algebra; in analysis, you might use a partition of unity in similar ways.)
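
Two asides that may help here. First, since a bracket like \( \langle u_1|I|u_1\rangle \) is a number, the right-hand side of the first two statements should be the scalar 1 rather than the operator I. Second, as a quick numerical sanity check (not a proof; the vectors below are just example choices), the projector identity can be tested in NumPy:

Code:
import numpy as np

def projector(u):
    """Return the outer product |u><u| for a 1-D complex vector u."""
    u = u.reshape(-1, 1)          # column vector
    return u @ u.conj().T         # |u><u|

# An orthonormal pair in C^2: the projectors sum to the identity.
u1 = np.array([1, 1j]) / np.sqrt(2)
u2 = np.array([1, -1j]) / np.sqrt(2)
print(np.allclose(projector(u1) + projector(u2), np.eye(2)))  # True

# A non-orthonormal pair: the sum of projectors is not the identity.
v1 = np.array([1, 0])
v2 = np.array([1, 1]) / np.sqrt(2)
print(np.allclose(projector(v1) + projector(v2), np.eye(2)))  # False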
 
  • #12
I'll try to work out the algebra.

...I think I'm slllllowly getting my head around this quantum stuff.

Thanks for your help.
 
  • #13
Hrm...

This is surely what you want to do, but finishing off the proof is less straightforward than I thought it was.
 
  • #14
cscott said:
What if I multiplied each term by |u2> on the right? Would I be correct in saying that, in order for the new equation to hold, u1 and u2 have to be orthogonal and u2 has to be normalized? Then I could do it again with u1.
Shame on me for not fully thinking this through! :redface: You can do this... if you can first prove that |u1> and |u2> are linearly independent. (do you see why?)
 
  • #15
Hurkyl said:
Shame on me for not fully thinking this through! :redface: You can do this... if you can first prove that |u1> and |u2> are linearly independent. (do you see why?)

I see why |u1> and |u2> would have to be linearly independent. I'm not sure how I'd prove that, though. Wouldn't I have to do it based solely on the fact that adding the projection matrices gives the identity matrix?
 
  • #16
When I substitute \( |\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I \)

into the LHS of
<u1|I|u1> = I
<u1|I|u2> = 0
I get
1 = I
<u1|u2> + <u1|u2> = 0
respectively

This doesn't seem right...? It does imply that <u1|u2> = 0, though.
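
For the record, carrying the substitution through term by term (using only the given identity) gives

\[
\langle u_1|I|u_1\rangle = \langle u_1|u_1\rangle^2 + \langle u_1|u_2\rangle\langle u_2|u_1\rangle
\qquad\text{and}\qquad
\langle u_1|I|u_2\rangle = \langle u_1|u_1\rangle\,\langle u_1|u_2\rangle + \langle u_1|u_2\rangle\,\langle u_2|u_2\rangle ,
\]

and since the left-hand sides equal \( \langle u_1|u_1\rangle \) and \( \langle u_1|u_2\rangle \) respectively, neither equation by itself forces \( \langle u_1|u_2\rangle = 0 \) without more input (such as the linear-independence argument below).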
 
  • #17
cscott said:
I see why |u1> and |u2> would have to be linearly independent. I'm not sure how I'd prove that, though. Wouldn't I have to do it based solely on the fact that adding the projection matrices gives the identity matrix?
Assume they are dependent and see what happens. Rank might be involved.


cscott said:
When I substitute \( |\vec{u_1}\rangle\langle\vec{u_1}| + |\vec{u_2}\rangle\langle\vec{u_2}| = I \)

into the LHS of
<u1|I|u1> = I
<u1|I|u2> = 0
I get
1 = I
<u1|u2> + <u1|u2> = 0
respectively

This doesn't seem right...? It does imply that <u1|u2> = 0, though.
I don't see how you got that. And remember that <u1|I|u2> = 0 is what you're trying to prove.
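
A sketch of the rank argument hinted at here, using only the given identity: each projector \( |u\rangle\langle u| \) has rank at most 1, so if \( |u_1\rangle \) and \( |u_2\rangle \) were linearly dependent, the range of the sum would lie in a single one-dimensional subspace,

\[
\operatorname{rank}\bigl(|u_1\rangle\langle u_1| + |u_2\rangle\langle u_2|\bigr)
\le \dim\operatorname{span}\{|u_1\rangle, |u_2\rangle\} \le 1 ,
\]

which contradicts \( \operatorname{rank}(I) = 2 \) on \( \mathbb{C}^2 \). So the vectors are linearly independent, and the coefficient-comparison idea from post #6 goes through.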
 