How Can We Prove a Vector Lies in the Orthogonal Complement of a Subset?

rhobymic

Homework Statement



Let ##V## be a complex inner product space and let ##S## be a subset of ##V##. Suppose that ##v \in V## is a vector for which
$$\langle s,v\rangle + \langle v,s\rangle \le \langle s,s\rangle.$$
Prove that ##v## lies in the orthogonal complement ##S^\bot##.

Homework Equations



We have the three inner product relations:
1) conjugate symmetry
##\langle x,y\rangle = \overline{\langle y,x\rangle}##

2) linearity
##\langle x+y,z\rangle = \langle x,z\rangle + \langle y,z\rangle##

3) definition of the norm
##\|x\| = \sqrt{\langle x,x\rangle}##

There may be more that apply, such as the triangle inequality or the Cauchy–Schwarz inequality, but I am not sure.
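
One consequence of 1) that seems relevant: the left-hand side of the given inequality is always real, since
$$\langle s,v\rangle + \langle v,s\rangle = \langle s,v\rangle + \overline{\langle s,v\rangle} = 2\operatorname{Re}\langle s,v\rangle.$$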

The Attempt at a Solution



I know that if ##v## is in the set ##S^\bot##
then every ##s \in S## is orthogonal to ##v##, so ##\langle s,v\rangle = \langle v,s\rangle = 0##.
Therefore I am guessing that, using these relations and the given inequality, it can be shown that ##\langle s,v\rangle + \langle v,s\rangle## must be both ##\ge 0## and ##\le 0##, and therefore zero.
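
For instance (only a guess, and it assumes the inequality is also available with ##s## replaced by real scalar multiples ##ts## — say if ##S## were a subspace, which the problem does not state), scaling would give, for real ##t##,
$$\langle ts,v\rangle + \langle v,ts\rangle \le \langle ts,ts\rangle \implies 2t\operatorname{Re}\langle s,v\rangle \le t^2\langle s,s\rangle,$$
and dividing by ##t## and then letting ##t \to 0^+## and ##t \to 0^-## would squeeze ##\operatorname{Re}\langle s,v\rangle## between ##0## and ##0##; repeating with ##is## in place of ##s## would handle the imaginary part.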

Am I thinking of the way forward correctly?
Any help towards a solution would be great!

Thanks
 
rhobymic said:

Homework Statement



Let ##V## be a complex inner product space and let ##S## be a subset of ##V##. Suppose that ##v \in V## is a vector for which
$$\langle s,v\rangle + \langle v,s\rangle \le \langle s,s\rangle.$$
For all ##s## in ##S##? In that case, your conclusion is not true.
If ##v## is orthogonal to ##S##, this is saying that ##\langle s, s\rangle = 0## for all ##s## in ##S##.

 
HallsofIvy said:
For all ##s## in ##S##? In that case, your conclusion is not true.
If ##v## is orthogonal to ##S##, this is saying that ##\langle s, s\rangle = 0## for all ##s## in ##S##.

I did forget to state that this was for all ##s## in ##S##.

Could you explain a little more why ##\langle s,s\rangle = 0##?

Would you be referring to the fact that we may be able to pick a case where ##v = s##, and therefore
$$2\langle s,s\rangle \le \langle s,s\rangle,$$
and this would only happen when ##\langle s,s\rangle = 0## because ##\langle x,x\rangle \ge 0##?
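
Spelling out that last step: subtracting ##\langle s,s\rangle## from both sides,
$$2\langle s,s\rangle \le \langle s,s\rangle \implies \langle s,s\rangle \le 0,$$
which together with ##\langle s,s\rangle \ge 0## forces ##\langle s,s\rangle = 0##.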
 
If ##v## is in the orthogonal complement of ##S##, then ##\langle v, s\rangle = \langle s, v\rangle = 0##, so ##\langle s, s\rangle = \langle v, s\rangle + \langle s, v\rangle = 0 + 0 = 0##.

(No, I was not referring to the case where ##v = s##. If ##s## is in ##S## and ##v## is in the orthogonal complement of ##S##, then ##v = s## only if ##v = s = 0##, since ##v = s## would give ##\langle v,v\rangle = \langle v,s\rangle = 0##.)
 
HallsofIvy said:
If ##v## is in the orthogonal complement of ##S##, then ##\langle v, s\rangle = \langle s, v\rangle = 0##, so ##\langle s, s\rangle = \langle v, s\rangle + \langle s, v\rangle = 0 + 0 = 0##.

(No, I was not referring to the case where ##v = s##. If ##s## is in ##S## and ##v## is in the orthogonal complement of ##S##, then ##v = s## only if ##v = s = 0##, since ##v = s## would give ##\langle v,v\rangle = \langle v,s\rangle = 0##.)

We are given that ##\langle s, s\rangle \ge \langle v, s\rangle + \langle s, v\rangle##, not that they are equal. So saying that ##\langle v, s\rangle + \langle s, v\rangle = 0## (something we know will be true, seeing as we are proving ##v## is in the orthogonal complement of ##S##) only shows that ##\langle s,s\rangle## can be ##0## or anything greater than zero, not definitely ##0##.
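
As a concrete check on my reading: in ##V = \mathbb{C}## with ##s = 1## and ##v = 0##, we get ##\langle v,s\rangle + \langle s,v\rangle = 0## while ##\langle s,s\rangle = 1##, so the inequality can be strict and ##\langle s,s\rangle = 0## does not follow from it.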
 