
Dimensionality of the sum of subspaces

  • #1
Pushoam

Homework Statement



Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ## . Show that dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.

Homework Equations




The Attempt at a Solution


Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## can be expressed as a vector sum of ## V_1 ## and ## V_2 ## where ## V_1 \in \mathbb {V_1 ^{n_1}} ## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2 ##
Since ## V_1 ## and ## V_2 ## are orthogonal to each other, they are linearly independent.

## V_1 = \sum_{i=1}^{n_1} v_i \alpha_i ##, where ## \alpha_i ## are basis vectors of ## \mathbb {V _1^{n_1} }## and ## v_i ## are the corresponding coefficients.

## V_2= \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \beta_i ## are basis vectors of ## \mathbb {V _2^{n_2} }## and ## w_i ## are the corresponding coefficients.

Then,
## V_1 + V_2= \sum_{i=1}^{n_1} v_i \alpha_i + \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \alpha_i ## and ## \beta_i ## are basis vectors of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ##.

Thus, the basis of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## consists of ## n_1 +n_2 ## linearly independent vectors. Hence, dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2 ##.

Is this correct?
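(As a numerical sanity check of the claim itself, here is a sketch in NumPy; the subspaces are made up, with ## V_1 ## supported on the first ## n_1 ## coordinates of ## \mathbb{R}^7 ## and ## V_2 ## on the next ## n_2 ##, so the orthogonality hypothesis holds by construction:)

Code:
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n = 2, 3, 7   # hypothetical dimensions, n1 + n2 <= n

# Basis of V1: vectors supported on the first n1 coordinates.
B1 = np.zeros((n1, n))
B1[:, :n1] = rng.normal(size=(n1, n1))   # almost surely full rank

# Basis of V2: vectors supported on the next n2 coordinates, so every
# vector of V1 is orthogonal to every vector of V2 by construction.
B2 = np.zeros((n2, n))
B2[:, n1:n1 + n2] = rng.normal(size=(n2, n2))

# dim(V1 + V2) equals the rank of the stacked bases.
print(np.linalg.matrix_rank(np.vstack([B1, B2])))   # 5, i.e. n1 + n2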
 

Answers and Replies

  • #2
Mark44

Homework Statement



Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ## . Show that dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.

Homework Equations




The Attempt at a Solution


Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## can be expressed as a vector sum of ## V_1 ## and ## V_2 ## where ## V_1 \in \mathbb {V_1 ^{n_1}} ## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2 ##
Since ## V_1 ## and ## V_2 ## are orthogonal to each other, they are linearly independent.
A set of vectors can be linearly independent, but not subspaces.
Pushoam said:
## V_1 = \sum_{i=1}^{n_1} v_i \alpha_i ##, where ## \alpha_i ## are basis vectors of ## \mathbb {V _1^{n_1} }## and ## v_i ## are the corresponding coefficients.
##V_1## isn't a single sum -- it's the set of all possible linear combinations of the basis vectors.
Pushoam said:
## V_2= \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \beta_i ## are basis vectors of ## \mathbb {V _2^{n_2} }## and ## w_i ## are the corresponding coefficients.
Same as above.
Pushoam said:
Then,
## V_1 + V_2= \sum_{i=1}^{n_1} v_i \alpha_i + \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \alpha_i ## and ## \beta_i ## are basis vectors of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ##.

Thus, the basis of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## consists of ## n_1 +n_2 ## linearly independent vectors. Hence, dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2 ##.

Is this correct?
I don't think so.
Write the equation ##c_1\vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0##.
Show that the only solution in the constants ##c_i, d_i## is the trivial solution, using the fact that each vector in ##V_1## is orthogonal to each vector in ##V_2##.
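A concrete (made-up) instance of this equation in ## \mathbb{R}^3 ##, with the u's spanning the xy-plane and a single v along the z-axis, checked numerically as a sketch:

Code:
import numpy as np

# Hypothetical orthogonal bases: u's span the xy-plane, v spans the z-axis.
U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # basis of V1
V = np.array([[0.0, 0.0, 2.0]])   # basis of V2

# c1*u1 + c2*u2 + d1*v1 = 0 is the homogeneous system A @ x = 0,
# with the vectors as the columns of A.
A = np.vstack([U, V]).T

# A null space of dimension 0 means only the trivial solution exists.
s = np.linalg.svd(A, compute_uv=False)
print(A.shape[1] - np.sum(s > 1e-12))   # 0 -> the vectors are independent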
 
  • #3
Pushoam
A set of vectors can be linearly independent, but not subspaces.
## V_1 ## and ## V_2 ## are vectors belonging to the subspaces ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2}##, respectively.

Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2}## can be expressed as a vector sum of ## V_1 ## and ## V_2 ##, where ## V_1 \in \mathbb {V_1 ^{n_1}}## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2##
I don't think so.
Why?
 
  • #4
Mark44
For ##V_1## and ##V_2## I misunderstood your notation. Vectors are typically written with lowercase letters, such as ##\vec v## rather than ##\vec V##.
Why?
Your argument seems somewhat like handwaving to me, which is why I suggested that you use the definition of linearly independent vectors.
 
  • #5
fresh_42
Are you allowed to use dimension formulas? What do you know about orthogonality and the properties of the inner product? In short: had you filled out section 2 of the template, this discussion could have been significantly shorter. It is always a good strategy to gather what you have before you start reasoning.
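For reference, the dimension formula alluded to here is
$$\dim(V_1 + V_2) = \dim V_1 + \dim V_2 - \dim(V_1 \cap V_2),$$
and orthogonality gives ## V_1 \cap V_2 = \{0\} ##: any ## x ## lying in both subspaces satisfies ## \langle x, x\rangle = 0 ##, hence ## x = 0 ##.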
 
  • #6
Pushoam
Are you allowed to use dimension formulas? What do you know about orthogonality and the properties of the inner product?
I have not studied the dimension formula yet.
Two vectors are orthogonal if their inner product is zero.
 
  • #7
Pushoam
Write the equation ##c_1 \vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0##.
Show that the only solution in the constants ##c_i, d_i ## is the trivial solution, using the fact that each vector in ##V_1## is orthogonal to each vector in ##V_2##.
This proves that ##\vec {u_i} ## and ##\vec {v_i}## are linearly independent. This does not prove that the basis of ## \mathbb { V_1^{n_1}} ## consists of ## n_1 + n_2 ## vectors.
 
  • #8
fresh_42
I have not studied the dimension formula yet.
A pity. It would have made the proof easier. Nevertheless, you have ##\dim (V_1+V_2) \leq n_1+n_2##. Now we want to show that equality holds. So assume the opposite, i.e. let's have ##\dim (V_1+V_2) < n_1+n_2\,.## What does this mean, given two bases for ##V_1## and ##V_2##?
Two vectors are orthogonal if their inner product is zero.
... plus the properties of the inner product!
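The inner-product properties needed here are linearity in one argument, (conjugate) symmetry, and positive definiteness:
$$\langle x, x\rangle \geq 0, \qquad \langle x, x\rangle = 0 \iff x = 0.$$
The last one is what turns "orthogonal to everything, including itself" into ## x = 0 ##.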
 
  • #9
Mark44
This proves that ##\vec {u_i} ## and ##\vec {v_i}## are linearly independent. This does not prove that the basis of ## \mathbb { V_1^{n_1}} ## consists of ## n_1 + n_2 ## vectors.
I think you meant ##V_{1 + 2}## (which I would write more simply as just ##V##). If you can show that the equation I wrote has only the trivial solution, that shows that all of the basis vectors in ##V_1## together with all of the basis vectors in ##V_2## are linearly independent. Since all of these vectors span ##V## (or ##V_{1 + 2}##), the u and v vectors are a basis for ##V##. How many vectors are there all together? That's the dimension of your space V.
 
Last edited:
  • #10
Pushoam
A pity. It would have made the proof easier. Nevertheless, you have ##\dim (V_1+V_2) \leq n_1+n_2##. Now we want to show that equality holds. So assume the opposite, i.e. let's have ##\dim (V_1+V_2) < n_1+n_2\,.## What does this mean, given two bases for ##V_1## and ##V_2##?

... plus the properties of the inner product!
The Gram–Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently, ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors, respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ## V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must consist of ## n_1+n_2 ## orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##. But why can't it be greater than ## n_1+n_2 ##?
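For reference, a minimal sketch of the Gram–Schmidt orthogonalization being invoked (just the repeated projection step, assuming the input rows are linearly independent):

Code:
import numpy as np

def gram_schmidt(B):
    # Orthogonalize the rows of B; the rows are assumed linearly independent.
    out = []
    for b in B.astype(float):
        for q in out:
            b = b - (q @ b) / (q @ q) * q   # subtract the component along q
        out.append(b)
    return np.array(out)

B = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
Q = gram_schmidt(B)
print(Q @ Q.T)   # off-diagonal entries are 0: the rows are orthogonal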
 
Last edited by a moderator:
  • #11
Pushoam
I think you meant ##V_{1 + 2}## (which I would write more simply as just ##V##). If you can show that the equation I wrote has only the trivial solution, that shows that all of the basis vectors in ##V_1## together with all of the basis vectors in ##V_2## are linearly independent. Since all of these vectors span ##V## (or ##V_{1 + 2}##), the u and v vectors are a basis for ##V##. How many vectors are there all together? That's the dimension of your space V.
##c_1\vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0 ##
Taking the dot product with ## \vec u_i ## gives ## c_i = 0##, since all the other vectors are orthogonal to ## \vec u_i ##. Similarly, ## d_i = 0##.
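Spelling that step out as a single worked equation, for an orthogonal basis ## \{u_i\} ## of ## V_1 ## and ## \{v_k\} ## of ## V_2 ##:
$$\Big\langle u_j \Big| \sum_i c_i u_i + \sum_k d_k v_k \Big\rangle = c_j \langle u_j | u_j\rangle = 0 \;\Rightarrow\; c_j = 0,$$
since ## \langle u_j | u_i\rangle = 0 ## for ## i \neq j ##, ## \langle u_j | v_k\rangle = 0 ## for all ## k ##, and ## \langle u_j | u_j\rangle \neq 0 ##.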
 
  • #12
fresh_42
The Gram–Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently, ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors, respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ## V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must consist of ## n_1+n_2 ## orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##.
This is not nearly as obvious as you pretend, and it is the point where @Mark44 said "handwavy".

If ##A## is a set of linearly independent vectors, say e.g. ##A=\{\,(1,0),(0,1)\,\}## in ##\mathbb{R}^2##, and ##B## is a set of linearly independent vectors, say ##B=\{\,(1,1)\,\}## in ##\mathbb{R}^2##, do you think ##\{\,(1,0),(0,1),(1,1)\,\}## are linearly independent? From "since they are orthogonal" to "##\dim(V_1+V_2) \geq n_1+n_2\,##" is exactly what you need to prove.
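(A two-line rank computation confirms this particular example:)

Code:
import numpy as np

A = np.array([[1, 0], [0, 1], [1, 1]])
print(np.linalg.matrix_rank(A))   # 2, not 3: the three vectors are dependent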
But why can't it be greater than ## n_1+n_2 ##?
This is actually the trivial part, not the other one.

Let me change your notation first:
Usually scalars are written with Greek letters and vectors with Latin letters, i.e. just the other way around from what you did. It's not important, but keeping conventions makes it a lot easier to read. So let ##\{\,v_1,\ldots ,v_{n_1}\,\} \subseteq V_1\, , \,\{\,w_1,\ldots ,w_{n_2}\,\} \subseteq V_2## be the basis vectors.

Assume ##\dim (V_1+V_2) > n_1+n_2## and let ##\{\,u,v_1,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}## be linearly independent. Now show me how ##u\in \operatorname{span}\{\,v_1,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}## is possible!

Back to the proof.
Assume ##\dim (V_1+V_2) < n_1+n_2## and w.l.o.g. ##v_1 \in \operatorname{span}\{\,v_2,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}\,.## How would you proceed?
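(One way this assumption collapses, once each basis has been orthogonalized within its own subspace: taking the inner product of
$$v_1 = \sum_{i=2}^{n_1} \lambda_i v_i + \sum_{j=1}^{n_2} \mu_j w_j$$
with ## v_1 ## makes every term on the right vanish, so ## \langle v_1 | v_1\rangle = 0 ## and ## v_1 = 0 ##, contradicting that ## v_1 ## is a basis vector.)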
 
  • #13
Pushoam
If ##A## is a set of linearly independent vectors, say e.g. ##A=\{\,(1,0),(0,1)\,\}## in ##\mathbb{R}^2##, and ##B## is a set of linearly independent vectors, say ##B=\{\,(1,1)\,\}## in ##\mathbb{R}^2##, do you think ##\{\,(1,0),(0,1),(1,1)\,\}## are linearly independent?
No, ##\{\,(1,0),(0,1),(1,1)\,\}## are not linearly independent. That is because not every element of A is orthogonal to every element of B. Thanks for giving me this example.
In the OP question, each element of ##V_1## is orthogonal to each element of ##V_2##. So, a set consisting of orthogonal basis vectors of both ## V_1## and ## V_2## will be linearly independent.
This is what I wrote in the post quoted below:
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ## V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must consist of ## n_1+n_2 ## orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##.
What is meant by w.l.o.g.?
 
Last edited:
  • #14
verty
Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ## . Show that dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.
Pushoam, I think Fresh_42 is saying you need to start like this:
- assume ##\dim(V) < n_1 + n_2##, then...
- assume ##\dim(V) > n_1 + n_2##, then...
 
  • #15
fresh_42
What is meant by w.l.o.g.?
It means without loss of generality, i.e. since one vector is linearly dependent on the others by assumption, we can assume it is the first one, as otherwise we would just renumber them accordingly.

If you assume that the bases of ##V_1## and ##V_2## are already orthogonal, then you're done by the argument in post #11. But the problem in the first post doesn't say this. So you also need what you wrote in post #10. I was assuming that the problem requires you to repeat the principal step of the proof of Gram–Schmidt here explicitly. Anyway, all parts are written here somewhere, just not in the same post. So gather them and you have the proof.
 
  • #16
Pushoam
Let's consider
## B_1 =\{u_1, u_2,...u_p\} ~~~~~~~~~...(1)
\\B_2 = \{v_1, v_2, v_3,...v_q\}~~~~~~~...(2)
\\p = n_1 , ~ q=n_2 ~~~~~~~~~~~~~~...(3) ##, where ## B_1## and ## B_2## are bases of ## V_1## and ## V_2##.

Taking ## B= B_1 \cup B_2 = \{ u_1, u_2,...u_p,v_1, v_2, v_3,...v_q\}## ...(4)
Considering the following linear combination of basis vectors of ## V_1##,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle = |0\rangle \Rightarrow c_i = 0,~ i=1,2,...,p ## ...(5)
Taking the dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6)
A) Let's assume that the dimensionality of ## V_1 + V_2 ## is ## n \lt p+q ##. ...(7)
This means that there exists at least one element, say ## u_1 ##, which is linearly dependent on the other vectors present in B.
##| c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle,~ c_1 \neq 0 ## ...(8)
Taking the dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle \neq 0## ...(9)
Thus, (9) contradicts (6) ## \Rightarrow ## there does not exist a single element in B which is linearly dependent on the other vectors of B. Thus, all the elements of B are linearly independent. Hence, n is not less than p+q.

B) Let's assume that the dimensionality of ## V_1 + V_2 ## is ## n \gt p+q ##. ...(10)
This means that there exists at least one element, say ## |w \rangle##, which is linearly independent of all the vectors present in B.
##|cw+ c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle ## ...(11)
## \Rightarrow c = c_i = d_j = 0 ## for i = 1,2,...,p and j=1,2,...,q
According to the definition of ## V_1 +V_2 ##, ## |w\rangle## must be a linear combination of elements of ## V_1 ## and ## V_2 ##.
This implies that ## |w\rangle## is a linear combination of the vectors of B, contradicting the assumed linear independence.
Hence, n is not greater than p+q.
Thus, n = p+q = ##n_1 +n_2 ##.
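(A numerical sanity check of both parts, as a sketch with a made-up orthogonal pair of coordinate subspaces of ## \mathbb{R}^6 ##:)

Code:
import numpy as np

rng = np.random.default_rng(1)
U = np.eye(6)[:2]    # basis of V1 (p = 2): first two coordinate axes
V = np.eye(6)[2:5]   # basis of V2 (q = 3): next three coordinate axes
B = np.vstack([U, V])

# Part A: the p + q vectors of B are linearly independent.
print(np.linalg.matrix_rank(B))   # 5, i.e. p + q

# Part B: any |w> in V1 + V2 is already a combination of B's rows,
# so appending it cannot raise the rank.
w = rng.normal(size=2) @ U + rng.normal(size=3) @ V
print(np.linalg.matrix_rank(np.vstack([B, w])))   # still 5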
 
Last edited:
  • #17
PeroK
@Pushoam would it have been simpler to consider orthogonal bases?

The logic around equations 5 and 6 does not look sound.
 
  • #18
Pushoam
@Pushoam would it have been simpler to consider orthogonal bases?
I did this in post #11.

The Gram–Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently, ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors, respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ## V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must consist of ## n_1+n_2 ## orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##. But why can't it be greater than ## n_1+n_2 ##?
But this is what fresh_42 suggested.
If you assume that the bases of ##V_1## and ##V_2## are already orthogonal, then you're done by the argument in post #11. But the problem in the first post doesn't say this. So you also need what you wrote in post #10. I was assuming that the problem requires you to repeat the principal step of the proof of Gram–Schmidt here explicitly. Anyway, all parts are written here somewhere, just not in the same post. So gather them and you have the proof.
Why doesn't the above LaTeX code show properly?
I tapped the Reply button after selecting the quote.
 
  • #19
Pushoam
The logic around equations 5 and 6 does not look sound.
Equations 5 and 6 come from the definition of linear independence of vectors and the dot product of vectors.
What is handwavy in the argument?
 
  • #20
PeroK
Apologies, I didn't read all those posts.

Proving the Gram-Schmidt process, as it were, seems a bit over the top to me.

In that case, I would be tempted to prove that every finite vector space has an orthogonal basis as a separate theorem. That seems cleaner to me.

Also, if you get that proof wrong, you may still get credit for applying the theorem to solve the problem.

That would be my practical approach in an exam, say.
 
  • #21
PeroK
Equations 5 and 6 come from the definition of linear independence of vectors and the dot product of vectors.
What is handwavy in the argument?
There's nothing handwavy. You simply assume that the basis vectors are linearly dependent, which is not right.
 
  • #22
Pushoam
There's nothing handwavy. You simply assume that the basis vectors are linearly dependent, which is not right.
I assumed that the basis vectors are linearly independent. Please look at it again.
 
  • #23
PeroK
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6)
This equation assumes the ##u_i## are linearly dependent.

PS I see you've added the ##=0## under my nose! It's still illogical.

If the basis vectors are linearly independent, then all the ##c_i## are zero.
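For the record, the definition at issue: ## u_1, \ldots, u_p ## are linearly independent iff
$$c_1 u_1 + c_2 u_2 + \dots + c_p u_p = 0 \;\Longrightarrow\; c_1 = c_2 = \dots = c_p = 0.$$
Solving for ## c_1 ## in terms of the others with ## c_1 \neq 0 ## is exactly the negation of this.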
 
  • #24
Pushoam
This equation assumes the ##u_i## are linearly dependent.
Why so?
This equation says that ## c_1 = 0##. So, it is expressing linear independence.
 
  • #25
PeroK
Why so?
This equation says that ## c_1 = 0##. So, it is expressing linear independence.
What are the other coefficients, then? You conveniently dropped the fact that they are all ##0## as well.
 
