Prove V=S⊕S^(⊥)


by SithsNGiggles
SithsNGiggles
#1
Jan30-13, 12:14 PM
1. The problem statement, all variables and given/known data

Let ##S## be a subspace of an inner product space ##V##. Prove that ##V=S\oplus S^{\bot}##.

2. Relevant equations

The circled plus is meant to indicate the orthogonal sum of two sets.
From an earlier exercise, I've shown that ##S^{\bot}## is a subspace of ##V##, and that ##S\cap T = \{ 0\}## (where ##S\bot T##). I don't know whether they'll be helpful in this proof, but I'll leave these results here in case they are.

3. The attempt at a solution

I don't know where to begin. I've resorted to looking online for established proofs, but even those don't make sense to me. Could someone help me along with this proof? Thanks.
SithsNGiggles
#2
Jan30-13, 12:16 PM
In particular, I was referring to this pdf: https://docs.google.com/viewer?a=v&q...eIWdTVz2Wt9o6g

The theorem in question starts at the top of the second page.
haruspex
#3
Jan30-13, 04:06 PM
That proof looks fairly straightforward. Which step gives you doubts?

SithsNGiggles
#4
Jan30-13, 04:27 PM

Alright, I understand the first paragraph (since I proved that earlier myself), and I get the substitution of ##W## for ##S\oplus S^{\bot}##.
  • My first doubt arises from the next sentence. What's being done when you "choose" an orthonormal basis? Is that the same as saying, for instance, "Let ##\{ x_1, \ldots, x_n \}## be an orthonormal basis of S"?
  • Alongside this concern, what does it mean to "extend the basis to V"? Does it have anything to do with assuming the basis of S is not a basis of V (unless that's true in general)?
  • How do we know that ##e## is orthogonal to ##W##? Would it have something to do with linear independence?

I get the last two sentences, so it looks like I just don't understand the introduction of ##e## into the proof, and how we know what we know about ##e##. Thanks for the reply.
Dick
#5
Jan30-13, 05:14 PM
I don't know that the proof you're looking at takes the most economical route. If you know that an orthonormal basis of a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis of the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? That is, you can pick an orthonormal basis for ##V## such that ##\{x_1,\ldots,x_k\}## is a basis for ##S## and ##\{x_1,\ldots,x_k,x_{k+1},\ldots,x_n\}## is a basis for ##V##. What could you do with that?
SithsNGiggles
#6
Jan30-13, 05:38 PM
Quote by Dick:
I don't know that the proof you're looking at takes the most economical route. If you know that an orthonormal basis of a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis of the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? That is, you can pick an orthonormal basis for ##V## such that ##\{x_1,\ldots,x_k\}## is a basis for ##S## and ##\{x_1,\ldots,x_k,x_{k+1},\ldots,x_n\}## is a basis for ##V##. What could you do with that?
I think I see what you're getting at, but I still have a problem with "extending" a basis to a larger (or at least more inclusive) space. Sorry if it sounds dumb, but I just don't know what that means.
Dick
#7
Jan30-13, 06:03 PM
Quote by SithsNGiggles:
I think I see what you're getting at, but I still have a problem with "extending" a basis to a larger (or at least more inclusive) space. Sorry if it sounds dumb, but I just don't know what that means.
Extending a basis just means that if you have a basis ##\{x_1,\ldots,x_k\}## for a subspace, you can add more vectors to get a basis ##\{v_1,\ldots,v_n\}## for the whole space, where ##v_i=x_i## for ##1\le i\le k##.
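As a concrete illustration, here is a minimal numerical sketch (assuming numpy, with a made-up two-dimensional subspace of ##\mathbb{R}^3##): starting from an orthonormal basis of a plane, one extra orthonormal vector completes a basis of the whole space.

```python
import numpy as np

# Orthonormal basis {x1, x2} of a (made-up) 2-dimensional subspace S of R^3
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])

# Extend: take any vector outside S, remove its components along x1 and x2
# (one Gram-Schmidt step), then normalize what is left.
w = np.array([1.0, 2.0, 3.0])
x3 = w - (w @ x1) * x1 - (w @ x2) * x2
x3 = x3 / np.linalg.norm(x3)

# {x1, x2, x3} is now an orthonormal basis of all of R^3: the Gram matrix
# of pairwise inner products is the identity.
B = np.stack([x1, x2, x3])
print(np.allclose(B @ B.T, np.eye(3)))  # True
```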
haruspex
#8
Jan30-13, 06:13 PM
Quote by SithsNGiggles:
  • My first doubt arises from the next sentence. What's being done when you "choose" an orthonormal basis? Is that the same as saying, for instance, "Let ##\{ x_1, \ldots, x_n \}## be an orthonormal basis of S"?
Yes.
  • Alongside this concern, what does it mean to "extend the basis to V"? Does it have anything to do with assuming the basis of S is not a basis of V (unless that's true in general)?
If it were a basis for ##V## then we would have ##S=V##, so ##S^{\bot}=\{0\}## and the result would be trivial. It does appeal to the theorem that any orthonormal set can be extended to an orthonormal basis.
  • How do we know that ##e## is orthogonal to ##W##? Would it have something to do with linear independence?
e has been chosen to be orthogonal to each element of an orthonormal basis for W. It is therefore orthogonal to every element of W.
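In symbols: if ##\{x_1,\ldots,x_n\}## is an orthonormal basis of ##W##, ##e\bot x_i## for every ##i##, and ##w=\sum_{i=1}^n c_i x_i## is any element of ##W##, then linearity of the inner product in one slot gives

##\langle w, e\rangle = \sum_{i=1}^n c_i \langle x_i, e\rangle = 0,##

so ##e\bot w##. (In a complex inner product space conjugates may appear on the ##c_i##, depending on which slot is conjugate-linear, but the conclusion is the same.)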
SithsNGiggles
#9
Jan30-13, 07:36 PM
Ooh, okay, it's starting to make a lot more sense now. Thanks to both of you for the help!

I've started writing up the proof for myself using the point Dick made earlier:
Quote by Dick:
I don't know that the proof you're looking at takes the most economical route. If you know that an orthonormal basis of a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis of the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? That is, you can pick an orthonormal basis for ##V## such that ##\{x_1,\ldots,x_k\}## is a basis for ##S## and ##\{x_1,\ldots,x_k,x_{k+1},\ldots,x_n\}## is a basis for ##V##. What could you do with that?
Here's my proof so far:
Let ##\{ x_1, \ldots, x_n\}## be an orthonormal basis of ##S##. Assume ##V \not= S \oplus S^\bot##. Then ##\exists a\in V## such that ##a \not\in \mbox{span}\{ x_1, \ldots, x_n\}## and ##a\bot x_i \; \forall \; i=1,2,\cdots,n##. Since ##a\bot x_i##, we have that ##a\in S^\bot##.
Where do I go from here? How do I use this last fact to show the contradiction? I would think it would involve showing that a is also an element of S, but do I necessarily know that a is orthogonal to the basis of ##S^\bot##?
Dick
#10
Jan30-13, 08:05 PM
Quote by SithsNGiggles:
Ooh, okay, it's starting to make a lot more sense now. Thanks to both of you for the help!

I've started writing up the proof for myself using the point Dick made earlier:


Here's my proof so far:
Let ##\{ x_1, \ldots, x_n\}## be an orthonormal basis of ##S##. Assume ##V \not= S \oplus S^\bot##. Then ##\exists a\in V## such that ##a \not\in \mbox{span}\{ x_1, \ldots, x_n\}## and ##a\bot x_i \; \forall \; i=1,2,\cdots,n##. Since ##a\bot x_i##, we have that ##a\in S^\bot##.
Where do I go from here? How do I use this last fact to show the contradiction? I would think it would involve showing that a is also an element of S, but do I necessarily know that a is orthogonal to the basis of ##S^\bot##?
I was actually trying to lead you in what I think is a simpler direction, and now you're mishmashing that with the proof in the pdf file, so I can't answer that. Put the pdf proof aside and just think about it. ##\{x_1,\ldots,x_n\}## is an orthonormal basis for ##S##. If the dimension of ##V## is ##m\ge n##, then you can extend that basis to ##\{x_1,\ldots,x_n,x_{n+1},x_{n+2},\ldots,x_m\}##, an orthonormal basis for ##V##. Is ##x_{n+1}##, for example, in ##S^\bot##? Remember the basis is orthonormal. Also remember that since you have a basis, every element of ##V## can be written uniquely as ##v=a_1x_1+\cdots+a_mx_m## for some scalars ##a_i##.
SithsNGiggles
#11
Jan30-13, 09:07 PM
Quote by Dick:
I was actually trying to lead you in what I think is a simpler direction, and now you're mishmashing that with the proof in the pdf file, so I can't answer that. Put the pdf proof aside and just think about it. ##\{x_1,\ldots,x_n\}## is an orthonormal basis for ##S##. If the dimension of ##V## is ##m\ge n##, then you can extend that basis to ##\{x_1,\ldots,x_n,x_{n+1},x_{n+2},\ldots,x_m\}##, an orthonormal basis for ##V##. Is ##x_{n+1}##, for example, in ##S^\bot##? Remember the basis is orthonormal. Also remember that since you have a basis, every element of ##V## can be written uniquely as ##v=a_1x_1+\cdots+a_mx_m## for some scalars ##a_i##.
Talk about a eureka moment! I've learned I should really keep track of what a basis entails...

So I've got ##x_{n+1}\in S^\bot##, since ##x_{n+1}\bot x_i, \forall i = 1,2,\ldots,n##.

And, like you said, every ##v\in V## can be written as a linear combination
##v = a_1x_1+\cdots+a_nx_n+\cdots+a_mx_m##,
the first ##n## terms of which give an element ##x\in S##, and the remaining terms an element ##y\in S^\bot##. In other words, ##v = x + y##. Thank you very much! Much simpler than I thought it would be. I was under the impression that I had to use the inner product at some point.

Now, the definition I have of a direct sum says that ##v## can be written uniquely as a sum ##x+y##. I still have to show that too, right? If so, I can handle it.

Thanks again!
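The splitting ##v = x + y## can also be sanity-checked numerically; here's a minimal sketch (assuming numpy, with a made-up orthonormal basis of ##\mathbb{R}^3## whose first two vectors span ##S##):

```python
import numpy as np

# Made-up orthonormal basis of R^3: x1, x2 span S, x3 spans S-perp
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
x3 = np.array([0.0, 0.0, 1.0])

v = np.array([2.0, -1.0, 5.0])

# Coefficients a_i = <v, x_i>; split the sum after the first n = dim S terms
x = (v @ x1) * x1 + (v @ x2) * x2  # the part of v lying in S
y = (v @ x3) * x3                  # the part of v lying in S-perp

print(np.allclose(v, x + y))   # True: v = x + y
print(np.isclose(x @ y, 0.0))  # True: the two parts are orthogonal
```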
Dick
#12
Jan30-13, 09:24 PM
Quote by SithsNGiggles:
Talk about a eureka moment! I've learned I should really keep track of what a basis entails...

So I've got ##x_{n+1}\in S^\bot##, since ##x_{n+1}\bot x_i, \forall i = 1,2,\ldots,n##.

And, like you said, every ##v\in V## can be written as a linear combination
##v = a_1x_1+\cdots+a_nx_n+\cdots+a_mx_m##,
the first ##n## terms of which give an element ##x\in S##, and the remaining terms an element ##y\in S^\bot##. In other words, ##v = x + y##. Thank you very much! Much simpler than I thought it would be. I was under the impression that I had to use the inner product at some point.

Now, the definition I have of a direct sum says that ##v## can be written uniquely as a sum ##x+y##. I still have to show that too, right? If so, I can handle it.

Thanks again!
My last post has the word 'unique' in it. Can you find it? The heavy lifting in this proof is showing that you can find an orthonormal basis for a subspace and extend it to an orthonormal basis for the whole space. The proof in the pdf used that fact, so I'm assuming you can use it too. Can you sketch out why it's true?
SithsNGiggles
#13
Jan30-13, 09:42 PM
Quote by Dick:
My last post has the word 'unique' in it. Can you find it? The heavy lifting in this proof is showing that you can find an orthonormal basis for a subspace and extend it to an orthonormal basis for the whole space. The proof in the pdf used that fact, so I'm assuming you can use it too. Can you sketch out why it's true?
Oh, I didn't notice it. Thanks for pointing it out. Yes, I'm confident that I can. I'd start by supposing ##v## can be represented as
##v = v_1 + \cdots + v_n + \cdots + v_m = t_1 + \cdots + t_n + \cdots + t_m##, where ##v_i = a_i x_i## and ##t_i = b_i x_i## abbreviate the scaled basis vectors for scalars ##a_i, b_i##.

Then I get
##(v_1-t_1) + ... + (v_n-t_n) + ... + (v_m-t_m) = 0##
Here, the first n terms are in ##S## and the remaining terms are in ##S^\bot##.

Then, since the ##x_i## are linearly independent and each difference is ##v_i - t_i = (a_i - b_i)x_i##, every coefficient ##a_i - b_i## must vanish, so ##v_i = t_i## for each ##i##, and the two representations agree.
Dick
#14
Jan30-13, 10:50 PM
Quote by SithsNGiggles:
Oh, I didn't notice it. Thanks for pointing it out. Yes, I'm confident that I can. I'd start by supposing ##v## can be represented as
##v = v_1 + \cdots + v_n + \cdots + v_m = t_1 + \cdots + t_n + \cdots + t_m##, where ##v_i = a_i x_i## and ##t_i = b_i x_i## abbreviate the scaled basis vectors for scalars ##a_i, b_i##.

Then I get
##(v_1-t_1) + ... + (v_n-t_n) + ... + (v_m-t_m) = 0##
Here, the first n terms are in ##S## and the remaining terms are in ##S^\bot##.

Then, since the ##x_i## are linearly independent and each difference is ##v_i - t_i = (a_i - b_i)x_i##, every coefficient ##a_i - b_i## must vanish, so ##v_i = t_i## for each ##i##, and the two representations agree.
Sure, that's the idea for proving that the representation of a vector in terms of an orthonormal basis is unique. What I meant, though, was the whole idea that an orthonormal basis on a subspace can be extended to an orthonormal basis of the whole space. If you already know that, don't worry about it; the proof of it is Gram-Schmidt orthogonalization. Using that fact is what makes this proof easy.
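For reference, the Gram-Schmidt extension step is only a few lines. Here's a minimal sketch (assuming numpy; the spanning set fed in after the given orthonormal vectors is arbitrary, here just the standard basis):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of vectors, dropping dependent ones."""
    basis = []
    for v in vectors:
        # Subtract projections onto the vectors already in the basis
        for b in basis:
            v = v - (v @ b) * b
        norm = np.linalg.norm(v)
        if norm > 1e-12:  # keep only genuinely new (independent) directions
            basis.append(v / norm)
    return basis

# Extend an orthonormal basis of S (here a single unit vector in R^3) to
# all of R^3 by feeding it first, followed by any spanning set.
s_basis = [np.array([0.6, 0.8, 0.0])]
full = gram_schmidt(s_basis + [np.eye(3)[i] for i in range(3)])
B = np.stack(full)
print(B.shape)                           # (3, 3)
print(np.allclose(B @ B.T, np.eye(3)))   # True
```

Because the vectors of `s_basis` go in first, they survive unchanged, so the output really is an extension of the given orthonormal set.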
SithsNGiggles
#15
Jan30-13, 10:57 PM
Quote by Dick:
Sure, that's the idea for proving that the representation of a vector in terms of an orthonormal basis is unique. What I meant, though, was the whole idea that an orthonormal basis on a subspace can be extended to an orthonormal basis of the whole space. If you already know that, don't worry about it; the proof of it is Gram-Schmidt orthogonalization. Using that fact is what makes this proof easy.
My instructor hasn't formally introduced it in a lecture, but I've heard him mention it at one time or another. I'll be sure to keep my eyes peeled for it in our notes. Thanks again!

