Dimensionality of the sum of subspaces

In summary, we are given two subspaces, ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ##, with the property that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ##. We need to show that the dimensionality of the sum of these two subspaces, ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ##, is equal to ## n_1 + n_2 ##. To prove this, we can express any element of the sum as a vector from ## \mathbb {V}_1^{n_1} ## plus a vector from ## \mathbb {V}_2^{n_2} ## and show that the combined basis vectors of the two subspaces are linearly independent, so that they form a basis of the sum.
  • #1
Pushoam

Homework Statement



Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ##. Show that the dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.

Homework Equations

The Attempt at a Solution


Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## can be expressed as a vector sum of ## V_1 ## and ## V_2 ## where ## V_1 \in \mathbb {V_1 ^{n_1}} ## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2 ##
Since ## V_1 ## and ## V_2 ## are orthogonal to each other, these are linearly independent.

## V_1 = \sum_{i=1}^{n_1} v_i \alpha_i ##, where ## \alpha_i ## are basis vectors of ## \mathbb {V _1^{n_1} }## and ## v_i ## are the corresponding coefficients.

## V_2= \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \beta_i ## are basis vectors of ## \mathbb {V _2^{n_2} }## and ## w_i ## are the corresponding coefficients.

Then,
## V_1 + V_2= \sum_{i=1}^{n_1} v_i \alpha_i + \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \alpha_i ## and ## \beta_i ## are basis vectors of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ##.

Thus, the basis of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## consists of ## n_1 +n_2 ## linearly independent vectors. Hence, dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2 ##.

Is this correct?
 
  • #2
Pushoam said:

Homework Statement



Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ##. Show that the dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.

Homework Equations

The Attempt at a Solution


Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## can be expressed as a vector sum of ## V_1 ## and ## V_2 ## where ## V_1 \in \mathbb {V_1 ^{n_1}} ## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2 ##
Since ## V_1 ## and ## V_2 ## are orthogonal to each other, these are linearly independent.
A set of vectors can be linearly independent, but not subspaces.
Pushoam said:
## V_1 = \sum_{i=1}^{n_1} v_i \alpha_i ##, where ## \alpha_i ## are basis vectors of ## \mathbb {V _1^{n_1} }## and ## v_i ## are the corresponding coefficients.
##V_1## isn't a single sum -- it's the set of all possible linear combinations of the basis vectors.
Pushoam said:
## V_2= \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \beta_i ## are basis vectors of ## \mathbb {V _2^{n_2} }## and ## w_i ## are the corresponding coefficients.
Same as above.
Pushoam said:
Then,
## V_1 + V_2= \sum_{i=1}^{n_1} v_i \alpha_i + \sum_{i=1}^{n _2}w_i \beta_i ##, where ## \alpha_i ## and ## \beta_i ## are basis vectors of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ##.

Thus, the basis of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## consists of ## n_1 +n_2 ## linearly independent vectors. Hence, dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2 ##.

Is this correct?
I don't think so.
Write the equation ##c_1\vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0##.
Show that the only solution in the constants ##c_i, d_i## is the trivial solution, using the fact that each vector in ##V_1## is orthogonal to each vector in ##V_2##.
 
  • #3
Mark44 said:
A set of vectors can be linearly independent, but not subspaces.
## V_1 ## and ## V_2 ## are vectors corresponding to subspaces ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2}##.

Pushoam said:
Any element ## V_{1+2} ## of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2}## can be expressed as a vector sum of ## V_1## and ## V_2 ## where ## V_1 \in \mathbb {V_1 ^{n_1}}## and ## V_2 \in \mathbb { V_2 ^{n_2}} ##.
## V_{1+2} = V_1 +V_2##
Mark44 said:
I don't think so.
Why?
 
  • #4
For ##V_1## and ##V_2## I misunderstood your notation. Vectors are typically written with lowercase letters, such as ##\vec v## rather than ##\vec V##.
Pushoam said:
Why?
Your argument seems somewhat like handwaving to me, which is why I suggested that you use the definition of linearly independent vectors.
 
  • #5
Are you allowed to use dimension formulas? What do you know about orthogonality and the properties of the inner product? In short: had you filled out section 2 of the template, the discussion could be significantly shorter. It is always a good strategy to gather what you have before you start reasoning.
 
  • #6
fresh_42 said:
Are you allowed to use dimension formulas? What do you know about orthogonality and the properties of the inner product?
I have not studied the dimension formula yet.
Two vectors are orthogonal if their inner product is zero.
 
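For reference, the inner-product properties fresh_42 keeps alluding to (stated here as a brief aside; these are the standard axioms, not anything specific to this problem):
$$\langle x\,|\,\alpha y + \beta z\rangle = \alpha\langle x\,|\,y\rangle + \beta\langle x\,|\,z\rangle\,,\qquad \langle x\,|\,y\rangle = \overline{\langle y\,|\,x\rangle}\,,\qquad \langle x\,|\,x\rangle \geq 0 \text{ with } \langle x\,|\,x\rangle = 0 \iff x = 0\,.$$
Linearity is what lets one distribute an inner product over a linear combination, and positive-definiteness is what later turns ##\langle u_1\,|\,u_1\rangle = 0## into ##u_1 = 0##.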
  • #7
Mark44 said:
Write the equation ##c_1 \vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0##.
Show that the only solution in the constants ##c_i, d_i ## is the trivial solution, using the fact that each vector in ##V_1## is orthogonal to each vector in ##V_2##.
This proves that ##\vec {u_i} ## and ##\vec {v_i}## are linearly independent. This does not prove that basis of ## \mathbb { V_1^{n_1}} ## consists of ## n_1 + n_2 ## vectors.
 
  • #8
Pushoam said:
I have not studied the dimension formula yet.
A pity. It would have made the proof easier. Nevertheless, you have ##\dim (V_1+V_2) \leq n_1+n_2##. Now we want to show that equality holds. So assume the opposite, i.e. let's have ##\dim (V_1+V_2) < n_1+n_2\,.## What does this mean, given two bases for ##V_1## and ##V_2##?
Two vectors are orthogonal if their inner product is zero.
... plus the properties of the inner product!
 
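A minimal sketch of why the inequality ##\dim (V_1+V_2) \leq n_1+n_2## holds (an aside, not part of the original posts; write ##\{v_1,\ldots ,v_{n_1}\}## for a basis of ##V_1## and ##\{w_1,\ldots ,w_{n_2}\}## for a basis of ##V_2##): every ##x \in V_1+V_2## is, by definition of the sum, ##x = x_1 + x_2## with ##x_1 \in V_1## and ##x_2 \in V_2##, so
$$x = \sum_{i=1}^{n_1}\alpha_i v_i + \sum_{j=1}^{n_2}\beta_j w_j\,.$$
Hence the ##n_1+n_2## vectors ##v_1,\ldots ,v_{n_1},w_1,\ldots ,w_{n_2}## span ##V_1+V_2##, and a spanning set of ##n_1+n_2## vectors bounds the dimension above by ##n_1+n_2##.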
  • #9
Pushoam said:
This proves that ##\vec {u_i} ## and ##\vec {v_i}## are linearly independent. This does not prove that basis of ## \mathbb { V_1^{n_1}} ## consists of ## n_1 + n_2 ## vectors.
I think you meant ##V_{1 + 2}## (which I would write more simply as just ##V##). If you can show that the equation I wrote has only the trivial solution, that shows that all of the basis vectors in ##V_1## and all of the basis vectors in ##V_2## are linearly independent. Since all of these vectors span ##V## (or ##V_{1 + 2}##), the u and v vectors are a basis for ##V##. How many vectors are there altogether? That's the dimension of your space V.
 
  • #10
fresh_42 said:
A pity. It would have made the proof easier. Nevertheless, you have ##\dim (V_1+V_2) \leq n_1+n_2##. Now we want to show that equality holds. So assume the opposite, i.e. let's have ##\dim (V_1+V_2) < n_1+n_2\,.## What does this mean, given two bases for ##V_1## and ##V_2##?

... plus the properties of the inner product!

The Gram-Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ##V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must contain ## n_1+n_2 ## mutually orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##. But why can't it be greater than ## n_1+n_2 ##?
 
  • #11
Mark44 said:
I think you meant ##V_{1 + 2}## (which I would write more simply as just ##V##). If you can show that the equation I wrote has only the trivial solution, that shows that all of the basis vectors in ##V_1## and all of the basis vectors in ##V_2## are linearly independent. Since all of these vectors span ##V## (or ##V_{1 + 2}##), the u and v vectors are a basis for ##V##. How many vectors are there altogether? That's the dimension of your space V.
##c_1\vec {u_1} + c_2 \vec {u_2} + \dots + c_{n_1}\vec{u_{n_1}} + d_1\vec{v_1} + d_2 \vec {v_2} + \dots + d_{n_2}\vec{v_{n_2}} = \vec 0 ##
Taking dot product with ## \vec u_i ## gives ## c_i = 0## as all other vectors are orthogonal to ## \vec u_i ##. Similarly, ## d_i = 0##.
 
  • #12
Pushoam said:
The Gram-Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ##V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must contain ## n_1+n_2 ## mutually orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##.
This is not nearly as obvious as you pretend, and it is the point where @Mark44 said "handwaving".

If ##A## is a set of linearly independent vectors, say e.g. ##A=\{\,(1,0),(0,1)\,\}## in ##\mathbb{R}^2## and ##B## a set of linearly independent vectors, say ##B=\{\,(1,1)\,\}## in ##\mathbb{R}^2##, do you think ##\{\,(1,0),(0,1),(1,1)\,\}## are linearly independent? From "since they are orthogonal" to "##\dim(V_1+V_2) \geq n_1+n_2\,##" is exactly what you need to prove.
But, why can't it be greater than ## n_1+n_2 ## ?
This is actually the trivial part, not the other one.

Let me change your notation first:
Usually scalars are written with Greek letters and vectors with Latin letters, so just the other way around from what you did. It's not important, but keeping conventions makes it a lot easier to read. So let ##\{\,v_1,\ldots ,v_{n_1}\,\} \subseteq V_1\, , \,\{\,w_1,\ldots ,w_{n_2}\,\} \subseteq V_2## be the basis vectors.

Assume ##\dim (V_1+V_2) > n_1+n_2## and let ##\{\,u,v_1,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}## be linearly independent. Now show me how ##u\in \operatorname{span}\{\,v_1,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}## is possible!

Back to the proof.
Assume ##\dim (V_1+V_2) < n_1+n_2## and w.l.o.g. ##v_1 \in \operatorname{span}\{\,v_2,\ldots ,v_{n_1},w_1,\ldots,w_{n_2}\,\}\,.## How would you proceed?
 
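As a quick numerical illustration of fresh_42's counterexample above (a sketch using numpy; the rank computation simply counts how many of the stacked vectors are linearly independent):

```python
import numpy as np

# fresh_42's example: A = {(1,0), (0,1)} and B = {(1,1)} in R^2.
# A and B are each linearly independent sets on their own, but stacking
# all three vectors and taking the rank shows the combined set is NOT
# linearly independent.
vectors = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
])
print(np.linalg.matrix_rank(vectors))  # prints 2, not 3: (1,1) = (1,0) + (0,1)
```

The point of the example survives the check: combining two independent sets does not by itself give an independent set; the orthogonality hypothesis is what has to do the work.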
  • #13
fresh_42 said:
If A is a set of linearly independent vectors, say e.g. ##A=\{\,(1,0),(0,1)\,\}## in ##\mathbb{R}^2## and B a set of linearly independent vectors, say ##B=\{\,(1,1)\,\}## in ##\mathbb{R}^2##, do you think ##\{\,(1,0),(0,1),(1,1)\,\}## are linearly independent?
No, ##\{\,(1,0),(0,1),(1,1)\,\} ## are not linearly independent. It is because each element of A is not orthogonal to each element of B. Thanks for giving me this example.
In the OP question, each element of ##V_1## is orthogonal to each element of ##V_2##. So, a set consisting of orthogonal basis vectors of both ## V_1## and ## V_2## will be linearly independent.
This is what I wrote in the below post:
Pushoam said:
Since any vector of ##V_1^{n_1}## is orthogonal to any vector of ##V_2^{n_2}##, ##V_1^{n_1} + V_2^{n_2}## must contain ## n_1+n_2## mutually orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2##.

What is meant by w.l.o.g.?
 
  • #14
Pushoam said:
Suppose that ## \mathbb {V}_1^{n_1} ## and ## \mathbb {V}_2^{n_2} ## are two subspaces such that any element of ## \mathbb {V}_1^{n_1} ## is orthogonal to any element of ## \mathbb {V}_2^{n_2} ##. Show that the dimensionality of ## \mathbb {V}_1^{n_1} + \mathbb {V}_2^{n_2} ## is ## n_1 +n_2##.

Pushoam, I think Fresh_42 is saying you need to start like this:
- assume ##\dim(V) < n_1 + n_2##, then...
- assume ##\dim(V) > n_1 + n_2##, then...
 
  • #15
Pushoam said:
What is meant by w.l.o.g.?
It means without loss of generality, i.e. since one vector is linearly dependent on the others by assumption, we can assume it is the first one, as otherwise we would just renumber them accordingly.

If you assume that the bases in ##V_1## and ##V_2## are already orthogonal, then you're done by the argument in post #11. But the problem in the first post doesn't say this. So you also need what you wrote in post #10. I was assuming that the problem requires one to repeat the principal step of the Gram-Schmidt proof explicitly here. Anyway, all parts are written here somewhere, just not in the same post. So gather them and you have the proof.
 
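For reference, a sketch of the principal Gram-Schmidt step mentioned above (the standard projection step, written here in the thread's bra-ket style; nothing below is specific to this problem): given linearly independent ##u_1,\ldots ,u_p##, set ##e_1 = u_1## and
$$e_k = u_k - \sum_{i=1}^{k-1}\frac{\langle e_i\,|\,u_k\rangle}{\langle e_i\,|\,e_i\rangle}\,e_i \qquad (k = 2,\ldots ,p)\,,$$
so that each ##e_k## is orthogonal to ##e_1,\ldots ,e_{k-1}## while ##\operatorname{span}\{e_1,\ldots ,e_k\} = \operatorname{span}\{u_1,\ldots ,u_k\}##. Applying this to the bases of ##V_1## and ##V_2## separately is what justifies the "assume the bases are orthogonal" step, and it does not disturb the orthogonality between the two subspaces, since the new vectors are linear combinations of the old ones.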
  • #16
Let's consider
## B_1 =\{u_1, u_2,\ldots ,u_p\} ## ...(1)
## B_2 = \{v_1, v_2, v_3,\ldots ,v_q\} ## ...(2)
## p = n_1 , ~ q=n_2 ## ...(3), where ## B_1## and ## B_2## are bases of ## V_1## and ## V_2##.

Taking ## B= B_1 \cup B_2 = \{ u_1, u_2,...u_p,v_1, v_2, v_3,...v_q\}## ...(4)
Considering the following linear combination of basis vectors of ## V_1##,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6)
A) Let's assume that the dimensionality of ## V_1 + V_2 = n \lt p+q## ...(7)
This means that there exists at least one element say ## u_1 ## which is not linearly independent of other vectors present in B.
##| c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle ~,c_1 \neq 0 ...(8)##
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle \neq 0## ...(9)
Thus, (9) contradicts (6) ## \Rightarrow ## there does not exist a single element in B which is linearly dependent on the other vectors of B. Thus, all the elements in B are linearly independent. Hence, n is not less than p+q.

B) Let's assume that the dimensionality of ## V_1 + V_2 = n \gt p+q## ...(10)
This means that there exists at least one element, say ## |w \rangle##, which is linearly independent of all the vectors present in B.
##|cw+ c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle \Rightarrow \{c, c_i, d_j\} = 0 ## for i = 1,2,...p and j=1,2,...q ...(11)
According to the definition of ## V_1 +V_2 ##, ## |w\rangle## must be a linear combination of elements of ## V_1 ## and ## V_2 ##.
This implies that ## |w\rangle## must be a linear combination of vectors of B.
Hence, n is not greater than p+q.
Thus, n = p+q = ##n_1 +n_2 ##.
 
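A numerical sanity check of the statement being proved (not a proof, just a sketch using numpy; the two subspaces are constructed so that they are orthogonal to each other by design):

```python
import numpy as np

rng = np.random.default_rng(0)

n1, n2, ambient_dim = 2, 3, 6

# Build an orthonormal basis of R^6, give the first n1 directions to V1
# and the next n2 directions to V2; then every vector of V1 is orthogonal
# to every vector of V2 by construction.
Q, _ = np.linalg.qr(rng.standard_normal((ambient_dim, ambient_dim)))
V1_basis = Q[:, :n1]          # columns span V1
V2_basis = Q[:, n1:n1 + n2]   # columns span V2

# dim(V1 + V2) is the rank of the combined spanning set.
combined = np.hstack([V1_basis, V2_basis])
print(np.linalg.matrix_rank(combined), n1 + n2)  # prints "5 5"
```

Of course the check only exercises one example; the posts that follow sort out where the written proof itself goes wrong.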
  • #17
@Pushoam would it have been simpler to consider orthogonal bases?

The logic around equations 5 and 6 does not look sound.
 
  • #18
PeroK said:
@Pushoam would it have been simpler to consider orthogonal bases?
I did this in post # 11.

Pushoam said:
The Gram-Schmidt theorem says that we can have an orthogonal basis for any vector space. Consequently ## \mathbb{ V_1^{n_1}} ## and ## \mathbb{ V_2^{n_2}} ## have orthogonal bases of ## n_1 ## and ## n_2 ## vectors respectively.
Since any vector of ## V_1^{n_1} ## is orthogonal to any vector of ## V_2^{n_2} ##, ## V_1^{n_1} + V_2^{n_2} ## must contain ## n_1+n_2 ## mutually orthogonal vectors. So, the dimension of ## \mathbb{ V_1^{n_1}} + \mathbb { V_2^{n_2}} ## can't be less than ## n_1+n_2 ##. But why can't it be greater than ## n_1+n_2 ##?
But this is what fresh_42 suggested.
fresh_42 said:
If you assume that the bases in ##V_1## and ##V_2## are already orthogonal, then you're done by the argument in post #11. But the problem in the first post doesn't say this. So you also need what you wrote in post #10. I was assuming that the problem requires one to repeat the principal step of the Gram-Schmidt proof explicitly here. Anyway, all parts are written here somewhere, just not in the same post. So gather them and you have the proof.

Why doesn't the above latex code show properly?
I tapped the Reply button after selecting the quote.
 
  • #19
PeroK said:
The logic around equations 5 and 6 does not look sound.
Equations 5 and 6 come from the definition of linear independence of vectors and the dot product of vectors.
What is handwavy in the argument?
 
  • #20
Apologies, I didn't read all those posts.

Proving the Gram-Schmidt process, as it were, seems a bit over the top to me.

In that case, I would be tempted to prove that every finite-dimensional vector space has an orthogonal basis as a separate theorem. That seems cleaner to me.

Also, if you get that wrong you may still get credit for applying that theorem to solve the problem.

That would be my practical approach in an exam, say.
 
  • #21
Pushoam said:
Equations 5 and 6 come from the definition of linear independence of vectors and the dot product of vectors.
What is handwavy in the argument?

There's nothing handwavy. You simply assume that the basis vectors are linearly dependent, which is not right.
 
  • #22
PeroK said:
There's nothing handwavy. You simply assume that the basis vectors are linearly dependent, which is not right.
I took the basis vectors to be linearly independent. Please look at it again.
 
  • #23
Pushoam said:
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6)
This equation assumes the ##u_i## are linearly dependent.

PS I see you've added the ##=0## under my nose! It's still illogical.

If the basis vectors are linearly independent then all the ##c_i## are zero.
 
  • #24
PeroK said:
This equation assumes the ##u_i## are linearly dependent.
Why so?
This equation says that ## c_1 = 0##. So, it is expressing linear independence.
 
  • #25
Pushoam said:
Why so?
This equation says that ## c_1 = 0##. So, it is expressing linear independence.
What are the other coefficients, then? You conveniently dropped the fact that they are all ##0## as well.
 
  • #26
Pushoam said:
Considering the following linear combination of basis vectors of ## V_1##,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)
Taking dot product with ## \langle u_1| ## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0 ## ...(6)

I had already stated that all other coefficients are 0. Look ## \{c_i\}= 0, i=1,2,...p ##
I wrote eqn. (6) this way because I used it later in comparing with eqn. (9).
 
  • #27
Pushoam said:
I had already stated that all other coefficients are 0. Look ## \{c_i\}= 0, i=1,2,...p ##
I wrote eqn. (6) this way because I used it later in comparing with eqn. (9).
Okay, there are two things here. Your approach to getting a contradiction is misguided. For example, you haven't used the same ##c_1## throughout.

Even if equations 6 and 9 were arrived at validly, all you have is that in one case you have a coefficient that is 0 and in another case a different coefficient that is not 0. But, nothing ties these two coefficients together, except that you decided to call them both ##c_1##.

Second, the definition of linear independence has an implication in it. To generate a contradiction, you would have to find coefficients ##c_i## with ##c_1 \ne 0## and ##\Sigma c_i u_i = 0##

It's difficult to explain further, but your logic is not correct here.

PS it's a common mistake you are making.
 
  • #28
To put it another way, equation 6 says ##0 = 0## and equation 9 says that you've found some non-zero coefficient, ##c_1##.

Those don't contradict each other.
 
  • #29
Thanks.
 
  • #30
You need (6) for the contradiction in (9). But (6) is only true, if the ##u_i## are orthonormal, since for an arbitrary vector as in (9) you need/used ##\langle u_i,u_j \rangle = \delta_{ij}##. You can use this by the theorem, but you should mention it, because it is not given by the problem itself. So you have to say why you can make this assumption. Otherwise you only have ##\langle u_i,v_j \rangle = 0## and nothing about the inner product between the ##u_i## or ##v_j##.
 
  • #31
fresh_42 said:
You need (6) for the contradiction in (9). But (6) is only true, if the ##u_i## are orthonormal, since for an arbitrary vector as in (9) you need/used ##\langle u_i,u_j \rangle = \delta_{ij}##. You can use this by the theorem, but you should mention it, because it is not given by the problem itself. So you have to say why you can make this assumption. Otherwise you only have ##\langle u_i,v_j \rangle = 0## and nothing about the inner product between the ##u_i## or ##v_j##.
(6) comes from assuming ##\Sigma c_i u_i = 0##
 
  • #32
PeroK said:
(6) comes from assuming ##\Sigma c_i u_i = 0##
Yes, but in (9) he has an arbitrary vector which is not zero. They are not the same coefficients ##c_i##.
In other words: (6) must not be applied in (9) because the conditions of (6) aren't given in (9).
 
  • #33
Pushoam said:
Let's consider

## B_1 =\{u_1, u_2,\ldots ,u_p\} ## ...(1)
## B_2 = \{v_1, v_2, v_3,\ldots ,v_q\} ## ...(2)
## p = n_1 , ~ q=n_2 ## ...(3), where ## B_1## and ## B_2## are bases of ## V_1## and ## V_2##.

Taking ## B= B_1 \cup B_2 = \{ u_1, u_2,...u_p,v_1, v_2, v_3,...v_q\}## ...(4)
Considering the following linear combination of basis vectors of ## V_1##,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6)


A) Let's assume that the dimensionality of ## V_1 + V_2 = n \lt p+q## ...(7)
This means that there exists at least one element say ## u_1 ## which is not linearly independent of other vectors present in B.
##| c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle ~,c_1 \neq 0 ...(8)##
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle \neq 0## ...(9)
Thus, (9) contradicts (6) ## \Rightarrow ## there does not exist a single element in B which is linearly dependent on the other vectors of B. Thus, all the elements in B are linearly independent. Hence, n is not less than p+q.

My mistake in the above proof is pointed out below:
A) Let's assume that the dimensionality of ## V_1 + V_2 = n \lt p+q## ...(7)
This means that there exists at least one element say ## u_1 ## which is not linearly independent of other vectors present in B.
##| c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle ~,c_1 \neq 0 ...(8)##
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle \neq 0## ...(9)
Now consider ## c_1 u_1 + c_2u_2 +... +c_p u_p ## of L.H.S. of (8). As ## u_1, u_2 , u_3 ,... u_p ## are basis vectors of ## V_1 ##, these are linearly independent.
So,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6) This is what I meant in the earlier calculation.
But the point is that eqn. (8) does not imply ## c_1 u_1 + c_2u_2 +... +c_p u_p = 0 ##. So, eqns. (5) and (6) are not valid here. Now I see it. Thanks.

But then I see only one way to prove it: using the Gram-Schmidt theorem, I state that ##V_1## and ##V_2## have orthonormal bases, and then use these bases to prove the result as I did in posts #10 and #11.
 
  • #34
Pushoam said:
My mistake in the above proof is pointed out below:
A) Let's assume that the dimensionality of ## V_1 + V_2 = n \lt p+q## ...(7)
This means that there exists at least one element say ## u_1 ## which is not linearly independent of other vectors present in B.
##| c_1 u_1 + c_2u_2 +... +c_p u_p + d_1v_1 +d_2 v_2+...+ d_q v_q\rangle = |0\rangle ~,c_1 \neq 0 ...(8)##
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle \neq 0## ...(9)
Again. This is NOT true. With ##n < p+q## you may assume w.l.o.g. ##c_1=1## and ##u_1 = \sum_{i>1}c_iu_i + \sum_{j\geq 1}d_jv_j\,.##
  1. To mention "w.l.o.g." is important. It means that you performed a certain numbering of your basis vectors in order to have the first one as the linearly dependent one. It also means that further renumbering will only be allowed if the first one is affected again. You won't need another renumbering here, but it is important that you know what you did and that this is a consequence of it.
  2. ##\langle u_1\,|\,v_j\rangle = 0##, and so $$\langle u_1\,|\, u_1\rangle = \langle u_1\,|\,\sum_{i>1}c_iu_i + \sum_{j\geq 1}d_jv_j\rangle = \sum_{i>1}c_i\langle u_1\,|\,u_i\rangle$$ because ##u_1\perp v_j\,.##
  3. Next you used, without mentioning it, ##\langle u_1\,|\,u_i\rangle = \delta_{1i}\,.## Why are you allowed to do so? Nowhere up to now, i.e. in this post, is anything written about ##u_i\perp u_j \quad (i\neq j)##, and the problem statement doesn't say it either! So until now there is no argument why ##u_1 \perp \operatorname{span}\{\,u_2,\ldots ,u_p\,\}\,.##
  4. IF you assume that the ##u_i## are orthonormal, whereas btw. orthogonal is sufficient, but you don't seem to make a difference, then you immediately have ##\langle u_1\,|\,u_1\rangle=0##, which by the properties of the inner product means ##u_1=0## and thus ##\dim V_1 < p##, which is the contradiction you need.
  5. To summarize: You need to say why you can take the first vector as the linearly dependent one. Then you need to say why you can assume the ##u_i## to be orthogonal, resp. orthonormal, which you used. This automatically brings you into problems. Since you first renumbered the basis, will this have an effect on the orthonormalization procedure in Gram-Schmidt? As Gram-Schmidt, as far as I remember, also uses renumbering, both of these could be in conflict, and you haven't said why there is none! The trick here, in case you want to use Gram-Schmidt, is to do it first. The first thing in the entire proof should be the requirement that both bases be orthonormal. Both bases, as you cannot know a priori whether ##V_1## or ##V_2## causes the defect in dimension. Selecting ##u_1## makes this choice, so it has to come after the Gram-Schmidt argument. Any other order will make your proof invalid, unless you find a different way to deal with those obstacles. Nevertheless, the proof can also be given without the use of Gram-Schmidt, but then you will not be able to cut out single coefficients by taking the inner product.

Now consider ## c_1 u_1 + c_2u_2 +... +c_p u_p ## of L.H.S. of (8). As ## u_1, u_2 , u_3 ,... u_p ## are basis vectors of ## V_1 ##, these are linearly independent.
So,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)
Taking dot product with ## \langle u_1|## gives,
## c_1 = - \langle u_1 | c_2u_2 +... +c_p u_p\rangle = 0## ...(6) This is what I meant in the earlier calculation.
But the point is that eqn. (8) does not imply ## c_1 u_1 + c_2u_2 +... +c_p u_p = 0 ##. So, eqns. (5) and (6) are not valid here. Now I see it. Thanks.

But then I see only one way to prove it: using the Gram-Schmidt theorem, I state that ##V_1## and ##V_2## have orthonormal bases, and then use these bases to prove the result as I did in posts #10 and #11.
 
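Putting the pieces of the thread together, a consolidated sketch of the intended argument (an editorial summary, assuming, as described above, that Gram-Schmidt has already been applied so that ##\{u_1,\ldots ,u_p\}## and ##\{v_1,\ldots ,v_q\}## are orthonormal bases of ##V_1## and ##V_2## with ##p = n_1##, ##q = n_2##): suppose
$$c_1 u_1 + \cdots + c_p u_p + d_1 v_1 + \cdots + d_q v_q = 0\,.$$
Taking the inner product with ##u_k## kills every term except ##c_k\langle u_k\,|\,u_k\rangle = c_k##, because ##\langle u_k\,|\,u_i\rangle = \delta_{ki}## (orthonormality inside ##V_1##) and ##\langle u_k\,|\,v_j\rangle = 0## (the hypothesis that every element of ##V_1## is orthogonal to every element of ##V_2##). Hence ##c_k = 0## for every ##k##, and the same argument with ##v_k## gives ##d_k = 0##. So the ##p+q = n_1+n_2## vectors are linearly independent; since they also span ##V_1+V_2## (the inequality from post #8), they form a basis of ##V_1+V_2## and ##\dim(V_1+V_2) = n_1+n_2##.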
  • #35
@Pushoam, back in post #16 you wrote this:
Pushoam said:
Considering the following linear combination of basis vectors of ##V_1##,
##| c_1 u_1 + c_2u_2 +... +c_p u_p \rangle =0 \Rightarrow \{c_i\}= 0, i=1,2,...p ## ...(5)

It's not clear to me exactly what PeroK's objection was, but it might be this idea:
Let's look at two sets of vectors:
A: ##\{u_1 = \langle 1, 0, 0\rangle, u_2 = \langle 0, 1, 0\rangle, u_3 = \langle 1, 1, 0\rangle\}## and
B: ##\{v_1 = \langle 1, 0, 0\rangle, v_2 = \langle 0, 1, 0\rangle, v_3 = \langle 0, 0, 1\rangle\}##
For A, I write the equation ##a_1u_1 + a_2u_2 + a_3u_3 = 0## (1) , and note that ##a_1 = a_2 = a_3 = 0## is a solution. In other words,
##a_1u_1 + a_2u_2 + a_3u_3 = 0 \Rightarrow a_1 = a_2 = a_3 = 0##.
Can I conclude that the vectors ##u_1, u_2, u_3## are linearly independent? Clearly these vectors are linearly dependent, since ##u_3 = u_1 + u_2##. Therefore equation 1 has another solution; namely, ##a_1 = 1, a_2 = 1, a_3 = -1##. In fact, there are an infinite number of solutions to equation 1.

For B, I write the equation ##b_1v_1 + b_2v_2 + b_3v_3 = 0## (2), and after a bit of algebra determine that the unique solution for the constants is ##b_1 = b_2 = b_3 = 0##, thus concluding that this set of vectors is linearly independent.

Maybe you get the distinction between a linearly independent set of vectors and a linearly dependent set, but it wasn't clear in your equation 5.
 
