Prove V=S⊕S^(⊥) 
#1
Jan 30, 2013, 12:14 PM

P: 187

1. The problem statement, all variables and given/known data
Let ##S## be a subspace of an inner product space ##V##. Prove that ##V=S\oplus S^{\bot}##.

2. Relevant equations
The circled plus denotes the orthogonal sum of two sets. From an earlier exercise, I've shown that ##S^{\bot}## is a subspace of ##V##, and that ##S\cap T = \{ 0\}## (where ##S\bot T##). I don't know if these results will be helpful to this proof, but I'll leave them up in case they are.

3. The attempt at a solution
I don't know where to begin. I've resorted to looking online for established proofs, but even those don't make sense to me. Could someone help me along with this proof? Thanks.


#2
Jan 30, 2013, 12:16 PM

P: 187

In particular, I was referring to this pdf: https://docs.google.com/viewer?a=v&q...eIWdTVz2Wt9o6g
The theorem in question starts at the top of the second page. 


#3
Jan 30, 2013, 04:06 PM

Homework
Sci Advisor
HW Helper
Thanks ∞
P: 9,644

That proof looks fairly straightforward. Which step gives you doubts?



#4
Jan 30, 2013, 04:27 PM

P: 187

Alright, I understand the first paragraph (since I proved that earlier myself), and I get the substitution of ##W## for ##S\oplus S^{\bot}##.
I get the last two sentences, so it looks like I just don't understand the introduction of ##e## into the proof, and how we know what we know about ##e##. Thanks for the reply. 


#5
Jan 30, 2013, 05:14 PM

Sci Advisor
HW Helper
Thanks
P: 25,251

I don't know that the proof you are looking at is taking the most economical route. If you know that an orthonormal basis of a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis of the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? That is, pick an orthonormal basis for ##V## such that ##\{x_1, \ldots, x_k\}## is a basis for ##S## and ##\{x_1, \ldots, x_k, x_{k+1}, \ldots, x_n\}## is a basis for ##V##. What could you do with that?
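For what it's worth, the suggestion above can be written out as a short sketch (assuming the extended basis ##\{x_1,\ldots,x_n\}## is orthonormal, so the expansion coefficients are inner products):

```latex
% Sketch: any v in V expands in the extended orthonormal basis,
% and the expansion splits into an S-part and an S-perp part.
\[
v = \sum_{i=1}^{n}\langle v, x_i\rangle\, x_i
  = \underbrace{\sum_{i=1}^{k}\langle v, x_i\rangle\, x_i}_{\in\, S}
  \;+\; \underbrace{\sum_{i=k+1}^{n}\langle v, x_i\rangle\, x_i}_{\in\, S^{\bot}}
\]
% The second sum lies in S-perp because each x_j with j > k is
% orthogonal to x_1, ..., x_k, hence to every vector of S.
```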



#6
Jan 30, 2013, 05:38 PM

P: 187




#7
Jan 30, 2013, 06:03 PM

Sci Advisor
HW Helper
Thanks
P: 25,251




#8
Jan 30, 2013, 06:13 PM

Homework
Sci Advisor
HW Helper
Thanks ∞
P: 9,644




#9
Jan 30, 2013, 07:36 PM

P: 187

Ooh, okay, it's starting to make a lot more sense now. Thanks to both of you for the help!
I've started writing up the proof for myself using the point Dick made earlier: Let ##\{ x_1, \ldots, x_n\}## be an orthonormal basis of ##S##. Assume ##V \not= S \oplus S^\bot##. Then ##\exists a\in V## such that ##a \not\in \mbox{span}\{ x_1, \ldots, x_n\}## and ##a\bot x_i \; \forall \; i=1,2,\ldots,n##. Since ##a\bot x_i##, we have that ##a\in S^\bot##.

Where do I go from here? How do I use this last fact to show the contradiction? I would think it would involve showing that ##a## is also an element of ##S##, but do I necessarily know that ##a## is orthogonal to the basis of ##S^\bot##?


#10
Jan 30, 2013, 08:05 PM

Sci Advisor
HW Helper
Thanks
P: 25,251




#11
Jan 30, 2013, 09:07 PM

P: 187

So I've got ##x_{n+1}\in S^\bot##, since ##x_{n+1}\bot x_i \; \forall \; i = 1,2,\ldots,n##. And, like you said, every ##v\in V## can be written as a linear combination ##v = a_1 x_1 + \cdots + a_n x_n + \cdots + a_m x_m##, the first ##n## terms of which give some ##x\in S##, and the remaining terms some ##y\in S^\bot##. In other words, ##v = x + y##. Thank you very much! Much simpler than I thought it would be. I was under the impression that I had to use the inner product at some point.

Now, the definition I have of a direct sum says that ##v## can be written uniquely as a sum ##x+y##. I still have to show that too, right? If so, I can handle it. Thanks again!
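As a quick numerical sanity check of the decomposition above (my own illustration, not part of the original thread): in ##\mathbb{R}^3##, take ##S## spanned by two orthonormal vectors, project an arbitrary ##v## onto ##S##, and verify that ##v## splits into a piece in ##S## plus a piece orthogonal to ##S##.

```python
import numpy as np

# Columns of B are an orthonormal basis of the subspace S in R^3.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

v = np.array([3.0, -2.0, 5.0])

# x = orthogonal projection of v onto S (coefficients are inner products);
# y = v - x is the component in S-perp.
x = B @ (B.T @ v)
y = v - x

print(x)        # component of v in S
print(y)        # component of v in S-perp
print(B.T @ y)  # inner products with the basis of S: numerically zero
```

Here ``x + y`` recovers ``v`` exactly, and ``y`` is orthogonal to every basis vector of ``S``, mirroring the ##v = x + y## split in the proof.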


#12
Jan 30, 2013, 09:24 PM

Sci Advisor
HW Helper
Thanks
P: 25,251




#13
Jan 30, 2013, 09:42 PM

P: 187

##v = v_1 + \cdots + v_n + \cdots + v_m = t_1 + \cdots + t_n + \cdots + t_m##, where ##v_i, t_i## are abbreviations for the terms ##a_i x_i## and so on. Then I get ##(v_1 - t_1) + \cdots + (v_n - t_n) + \cdots + (v_m - t_m) = 0##. Here, the first ##n## terms are in ##S## and the remaining terms are in ##S^\bot##. Then, since each ##v_i - t_i## is a scalar multiple of the basis vector ##x_i## and the ##x_i## are linearly independent, each difference must vanish: ##(v_1 - t_1)=0 \Rightarrow v_1=t_1##, and the same goes for the other terms.
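An alternative way to get uniqueness, as a sketch, uses the fact quoted in the first post that the intersection of ##S## and ##S^{\bot}## is ##\{0\}##:

```latex
% Uniqueness sketch: suppose v = x + y = x' + y'
% with x, x' in S and y, y' in S-perp. Then
\[
x + y = x' + y' \;\Longrightarrow\; x - x' = y' - y .
\]
% The left side lies in S and the right side lies in S-perp,
% so both lie in S \cap S^{\bot} = \{0\}; hence x = x' and y = y'.
```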


#14
Jan 30, 2013, 10:50 PM

Sci Advisor
HW Helper
Thanks
P: 25,251




#15
Jan 30, 2013, 10:57 PM

P: 187



