Proving the Orthogonal Sum of Subspaces in an Inner Product Space

  • Thread starter: SithsNGiggles
  • Tags: Proof
In summary: to prove that ##V=S\oplus S^\bot##, choose an orthonormal basis ##\{x_1, \ldots, x_n\}## of ##S## and extend it to an orthonormal basis ##\{x_1, \ldots, x_n, x_{n+1}, \ldots, x_m\}## of ##V##. Each added vector ##x_j## with ##j > n## lies in ##S^\bot##, so every ##v \in V## splits as ##v = x + y## with ##x \in S## and ##y \in S^\bot##, and linear independence of the basis makes the splitting unique.
  • #1
SithsNGiggles

Homework Statement



Let ##S## be a subspace of an inner product space ##V##. Prove that ##V=S\oplus S^{\bot}##.

Homework Equations



The circled plus is meant to indicate the orthogonal sum of two sets.
From an earlier exercise, I've shown that ##S^{\bot}## is a subspace of ##V##, and that ##S\cap T = \{ 0\}## whenever ##S\bot T##. I don't know whether these results will be helpful to this proof, but I'll leave them up in case they are.
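For reference, the intersection fact is a one-liner: if ##x \in S \cap S^\bot##, then ##x## is orthogonal to itself, so
$$\langle x, x \rangle = 0 \implies x = 0,$$
by positive-definiteness of the inner product.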

The Attempt at a Solution



I don't know where to begin. I've resorted to looking online for established proofs, but even those don't make sense to me. Could someone help me along with this proof? Thanks.
 
  • #3
That proof looks fairly straightforward. Which step gives you doubts?
 
  • #4
Alright, I understand the first paragraph (since I proved that earlier myself), and I get the substitution of ##W## for ##S\oplus S^{\bot}##.

  • My first doubt arises from the next sentence. What's being done when you "choose" an orthonormal basis? Is that the same as saying, for instance, "Let ##\{ x_1, \ldots, x_n \}## be an orthonormal basis of S"?
  • Alongside this concern, what does it mean to "extend the basis to V"? Does it have anything to do with assuming the basis of S is not a basis of V (unless that's true in general)?
  • How do we know that ##e## is orthogonal to ##W##? Would it have something to do with linear independence?

I get the last two sentences, so it looks like I just don't understand the introduction of ##e## into the proof, and how we know what we know about ##e##. Thanks for the reply.
 
  • #5
I don't know that the proof you are looking at is taking the most economical route. If you know an orthonormal basis on a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis on the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? So you can pick an orthonormal basis for ##V## such that ##\{x_1,\ldots,x_k\}## is a basis for ##S## and ##\{x_1,\ldots,x_k,x_{k+1},\ldots,x_n\}## is a basis for ##V##. What could you do with that?
 
Last edited:
  • #6
Dick said:
I don't know that the proof you are looking at is taking the most economical route. If you know an orthonormal basis on a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis on the whole space, why not pick an orthonormal basis for ##S## and extend it to an orthonormal basis for ##V##? So you can pick an orthonormal basis for ##V## such that ##\{x_1,\ldots,x_k\}## is a basis for ##S## and ##\{x_1,\ldots,x_k,x_{k+1},\ldots,x_n\}## is a basis for ##V##. What could you do with that?

I think I see what you're getting at, but I still have a problem with "extending" a basis to a larger (or at the least, more inclusive) space. Sorry if it sounds dumb, but I just don't know what that means.
 
  • #7
SithsNGiggles said:
I think I see what you're getting at, but I still have a problem with "extending" a basis to a larger (or at the least, more inclusive) space. Sorry if it sounds dumb, but I just don't know what that means.

Extending a basis just means that if you have a basis ##\{x_1,\ldots,x_k\}## for a subspace, you can add more vectors to get a basis ##\{v_1,\ldots,v_n\}## for the whole space, where ##v_i = x_i## for ##1 \le i \le k##.
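As a concrete illustration (my example, not from the thread): in ##\mathbb{R}^3## with the dot product, the subspace ##S = \operatorname{span}\{(1,0,0)\}## has orthonormal basis ##\{(1,0,0)\}##, which extends to the orthonormal basis
$$\{(1,0,0),\ (0,1,0),\ (0,0,1)\}$$
of ##\mathbb{R}^3##. Here ##S^\bot## is the ##yz##-plane, spanned by the two added vectors.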
 
  • #8
SithsNGiggles said:
[*]My first doubt arises from the next sentence. What's being done when you "choose" an orthonormal basis? Is that the same as saying, for instance, "Let ##\{ x_1, \ldots, x_n \}## be an orthonormal basis of S"?
Yes.
[*]Alongside this concern, what does it mean to "extend the basis to V"? Does it have anything to do with assuming the basis of S is not a basis of V (unless that's true in general)?
If it is a basis for V then S=V, so ##S^\bot = \{0\}## and the decomposition is trivial. It does appeal to the theorem that any orthonormal set can be extended to an orthonormal basis.
[*]How do we know that ##e## is orthogonal to ##W##? Would it have something to do with linear independence?
e has been chosen to be orthogonal to each element of an orthonormal basis for W. It is therefore orthogonal to every element of W.
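To spell out that last step: write any ##w \in W## in the orthonormal basis ##\{w_1, \ldots, w_k\}## of ##W## as ##w = a_1 w_1 + \cdots + a_k w_k##. Assuming the convention that the inner product is linear in its first argument (over ##\mathbb{R}## the convention doesn't matter),
$$\langle w, e \rangle = \sum_{i=1}^{k} a_i \langle w_i, e \rangle = 0,$$
since ##e## is orthogonal to each ##w_i##.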
 
  • #9
Ooh, okay, it's starting to make a lot more sense now. Thanks to both of you for the help!

I've started writing up the proof for myself using the point Dick made earlier:
Dick said:
I don't know that the proof you are looking at is taking the most economical route. If you know an orthonormal basis on a subspace can be extended (as in Gram-Schmidt) to an orthonormal basis on the whole space why not pick an orthonormal basis for S, and extend it to an orthonormal basis for V. So you can pick an orthonormal basis for V such that {x_1,...x_k} is a basis for S and {x_1,...,x_k,x_k+1,...,x_n} is a basis for V. What could you do with that?

Here's my proof so far:
Let ##\{ x_1, \ldots, x_n\}## be an orthonormal basis of ##S##. Assume ##V \not= S \oplus S^\bot##. Then ##\exists a\in V## such that ##a \not\in \mbox{span}\{ x_1, \ldots, x_n\}## and ##a\bot x_i \; \forall \; i=1,2,\cdots,n##. Since ##a\bot x_i##, we have that ##a\in S^\bot##.
Where do I go from here? How do I use this last fact to show the contradiction? I would think it would involve showing that a is also an element of S, but do I necessarily know that a is orthogonal to the basis of ##S^\bot##?
 
  • #10
SithsNGiggles said:
Ooh, okay, it's starting to make a lot more sense now. Thanks to both of you for the help!

I've started writing up the proof for myself using the point Dick made earlier:Here's my proof so far:
Let ##\{ x_1, \ldots, x_n\}## be an orthonormal basis of ##S##. Assume ##V \not= S \oplus S^\bot##. Then ##\exists a\in V## such that ##a \not\in \mbox{span}\{ x_1, \ldots, x_n\}## and ##a\bot x_i \; \forall \; i=1,2,\cdots,n##. Since ##a\bot x_i##, we have that ##a\in S^\bot##.
Where do I go from here? How do I use this last fact to show the contradiction? I would think it would involve showing that a is also an element of S, but do I necessarily know that a is orthogonal to the basis of ##S^\bot##?

I was actually trying to lead you in what I think is a simpler direction. Now you are mishmashing that with the proof in the pdf file. So I can't answer that. Put the pdf proof aside and just think about it. ##\{x_1,\ldots,x_n\}## is an orthonormal basis for S. If the dimension of V is ##m \ge n##, then you can extend that basis to ##\{x_1,\ldots,x_n,x_{n+1},x_{n+2},\ldots,x_m\}##, an orthonormal basis for V. Is ##x_{n+1}##, for example, in ##S^\bot##? Remember the basis is orthonormal. Also remember that since you have a basis, every element of V can be written uniquely as ##v = a_1 x_1 + \cdots + a_m x_m## for some scalars ##a_i##.
 
Last edited:
  • #11
Dick said:
I was actually trying to lead you in what I think is a simpler direction. Now you are mishmashing that with the proof in the pdf file. So I can't answer that. Put the pdf proof aside and just think about it. ##\{x_1,\ldots,x_n\}## is an orthonormal basis for S. If the dimension of V is ##m \ge n##, then you can extend that basis to ##\{x_1,\ldots,x_n,x_{n+1},x_{n+2},\ldots,x_m\}##, an orthonormal basis for V. Is ##x_{n+1}##, for example, in ##S^\bot##? Remember the basis is orthonormal. Also remember that since you have a basis, every element of V can be written uniquely as ##v = a_1 x_1 + \cdots + a_m x_m## for some scalars ##a_i##.

Talk about a eureka moment! I've learned I should really keep track of what a basis entails...

So I've got ##x_{n+1}\in S^\bot##, since ##x_{n+1}\bot x_i, \forall i = 1,2,\ldots,n##.

And, like you said, for every ##v\in V## we can write ##v## as a linear combination
##v = a_1 x_1 + \cdots + a_n x_n + a_{n+1} x_{n+1} + \cdots + a_m x_m##,
the first ##n## terms of which express some ##x\in S##, and the remaining terms of which express some ##y\in S^\bot##. In other words, ##v = x + y##. Thank you very much! Much simpler than I thought it would be. I was under the impression that I had to use the inner product at some point.
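Written out, the split is
$$x = a_1 x_1 + \cdots + a_n x_n \in S, \qquad y = a_{n+1} x_{n+1} + \cdots + a_m x_m \in S^\bot,$$
where ##y \in S^\bot## because each ##x_j## with ##j > n## is orthogonal to the basis of ##S##, hence to all of ##S##.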

Now, the definition I have of a direct sum says that ##v## can be written uniquely as a sum ##x+y##. I still have to show that too, right? If so, I can handle it.

Thanks again!
 
  • #12
SithsNGiggles said:
Talk about a eureka moment! I've learned I should really keep track of what a basis entails...

So I've got ##x_{n+1}\in S^\bot##, since ##x_{n+1}\bot x_i, \forall i = 1,2,\ldots,n##.

And, like you said, for every ##v\in V## we can write ##v## as a linear combination
##v = a_1 x_1 + \cdots + a_n x_n + a_{n+1} x_{n+1} + \cdots + a_m x_m##,
the first ##n## terms of which express some ##x\in S##, and the remaining terms of which express some ##y\in S^\bot##. In other words, ##v = x + y##. Thank you very much! Much simpler than I thought it would be. I was under the impression that I had to use the inner product at some point.

Now, the definition I have of a direct sum says that ##v## can be written uniquely as a sum ##x+y##. I still have to show that too, right? If so, I can handle it.

Thanks again!

My last post has the word 'unique' in it. Can you find it? The heavy lifting in this proof is showing you can find an orthonormal basis for a subspace and extend it to an orthonormal basis for the whole space. The proof in the pdf used it, so I'm assuming you can use it. Can you sketch out why it's true?
 
  • #13
Dick said:
My last post has the word 'unique' in it. Can you find it? The heavy lifting in this proof is showing you can find an orthonormal basis for a subspace and extend it to an orthonormal basis for the whole space. The proof in the pdf used it, so I'm assuming you can use it. Can you sketch out why it's true?

Oh, I didn't notice it. Thanks for pointing it out. Yes, I'm confident that I can. I'd start by supposing ##v## has two representations,
##v = a_1 x_1 + \cdots + a_n x_n + \cdots + a_m x_m = b_1 x_1 + \cdots + b_n x_n + \cdots + b_m x_m##.

Then I get
##(a_1 - b_1) x_1 + \cdots + (a_n - b_n) x_n + \cdots + (a_m - b_m) x_m = 0##.
Here, the first ##n## terms lie in ##S## and the remaining terms lie in ##S^\bot##.

Then, since the basis vectors ##x_1, \ldots, x_m## are linearly independent, each coefficient ##a_i - b_i## must be zero, so ##a_i = b_i## for every ##i## and the two representations agree.
 
  • #14
SithsNGiggles said:
Oh, I didn't notice it. Thanks for pointing it out. Yes, I'm confident that I can. I'd start by supposing ##v## has two representations,
##v = a_1 x_1 + \cdots + a_n x_n + \cdots + a_m x_m = b_1 x_1 + \cdots + b_n x_n + \cdots + b_m x_m##.

Then I get
##(a_1 - b_1) x_1 + \cdots + (a_n - b_n) x_n + \cdots + (a_m - b_m) x_m = 0##.
Here, the first ##n## terms lie in ##S## and the remaining terms lie in ##S^\bot##.

Then, since the basis vectors ##x_1, \ldots, x_m## are linearly independent, each coefficient ##a_i - b_i## must be zero, so ##a_i = b_i## for every ##i## and the two representations agree.

Sure. That's the idea for proving that the representation of a vector in terms of an orthonormal basis is unique. What I meant is the whole idea that an orthonormal basis on a subspace can be extended to the whole space. If you already know that, don't worry about it. But the proof of it is Gram-Schmidt orthogonalization. Using that is what's making the proof easy.
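For reference, here is the Gram-Schmidt step in formula form (a standard statement, not spelled out in the thread): starting from the orthonormal ##x_1, \ldots, x_k## and vectors ##v_{k+1}, \ldots, v_m## that complete them to any basis of ##V##, define for ##j = k+1, \ldots, m##
$$u_j = v_j - \sum_{i < j} \langle v_j, x_i \rangle\, x_i, \qquad x_j = \frac{u_j}{\| u_j \|}$$
(inner product taken linear in its first argument). Each new ##x_j## is a unit vector orthogonal to all the earlier ones, so the result is an orthonormal basis of ##V## extending the one on the subspace.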
 
  • #15
Dick said:
Sure. That's the idea for proving that the representation of a vector in terms of an orthonormal basis is unique. What I meant is the whole idea that an orthonormal basis on a subspace can be extended to the whole space. If you already know that, don't worry about it. But the proof of it is Gram-Schmidt orthogonalization. Using that is what's making the proof easy.

My instructor hasn't formally introduced it in a lecture, but I've heard him mention it at one time or another. I'll be sure to keep my eyes peeled for it in our notes. Thanks again!
 

1. What is the meaning of V = S ⊕ S⊥?

V = S ⊕ S⊥ says that the inner product space V decomposes as the direct sum of a subspace S and its orthogonal complement S⊥. The symbol ⊕ denotes the direct sum, which means that every vector in V can be written in exactly one way as the sum of a vector from S and a vector from S⊥.
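Concretely, the claim is that for every ##v \in V## there are unique vectors ##s \in S## and ##t \in S^\bot## with
$$v = s + t.$$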

2. How is V = S ⊕ S⊥ useful in mathematics?

V = S ⊕ S⊥ is useful in several areas of mathematics, including linear algebra and functional analysis. It allows for a better understanding of vector spaces and their properties, and it also provides a way to solve problems involving linear transformations and projections: for example, the orthogonal projection of V onto S sends each v = s + t to its S-component s.

3. What is the difference between S and S⊥?

S and S⊥ are orthogonal subspaces, meaning every vector of one is perpendicular to every vector of the other. The main difference is their roles: S is the given subspace itself, while S⊥, its orthogonal complement, consists of all vectors in V that are orthogonal to every vector of S. The two intersect only in the zero vector.
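In symbols, the orthogonal complement is
$$S^\bot = \{\, v \in V : \langle v, s \rangle = 0 \ \text{for all}\ s \in S \,\}.$$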

4. Can V = S ⊕ S⊥ be extended to infinite-dimensional vector spaces?

With a caveat. In infinite dimensions the decomposition V = S ⊕ S⊥ holds when S is a closed subspace of a complete inner product space (a Hilbert space); for a general subspace of a general inner product space it can fail. In that Hilbert-space form the decomposition is particularly useful in functional analysis, where it is used to characterize properties of function spaces.

5. How is V = S ⊕ S⊥ related to the concept of orthogonality?

The decomposition V = S ⊕ S⊥ is an expression of orthogonality: S and S⊥ are orthogonal subspaces, meaning that every vector in S is perpendicular to every vector in S⊥. This property is used throughout linear algebra, for example to construct orthogonal projections, solve least-squares problems, and find orthonormal bases for vector spaces.
