A quite verbal proof that if V is finite dimensional then S is also....

In summary: any linearly independent set in ##S## is also linearly independent in ##V##, and ##V## contains at most ##\dim V## linearly independent vectors. Hence, if a linear space ##V## is finite-dimensional, then ##S##, a subspace of ##V##, is also finite-dimensional and ##\dim S \leq \dim V##.
  • #1
Hall
Homework Statement
If a linear space V is finite dimensional then S, a subspace of V, is also finite-dimensional and dim S is less than or equal to dim V.
Relevant Equations
Let A = {u_1, u_2, ... u_n} be a basis for V.
If a linear space ##V## is finite dimensional then ##S##, a subspace of ##V##, is also finite-dimensional and ##dim ~S \leq dim~V##.

Proof: Let ##A = \{u_1, u_2, \cdots, u_n\}## be a basis for ##V##. Then any element ##x## of ##V## can be represented as
$$
x = \sum_{i=1}^{n} c_i u_i
$$
Since ##S \subset V##, every element of ##S## can also be represented as a linear combination of the ##u_i##'s. As there are only finitely many ##u_i##'s, this implies that ##S## is finite-dimensional.

Let's say ##dim~S \gt dim~V##. Then, a basis for ##S## would look like
$$
A' = \{ v_1, v_2, \cdots v_m\}
$$
where ##m \gt n##. But then ##A'## would be an independent set of more than ##n## elements in ##V##, contradicting the theorem that "any set of ##(n+1)## elements in ##V## is dependent if ##L(A) = V## and the number of elements in ##A## is ##n##". Hence, ##dim~S \leq dim~V##.

That ends my proof.

It seems to me that my proof of the first part is quite verbal and needs to be a little more rigorous, but the question is: was the reasoning correct? And how can I make it rigorous?
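For reference, the dependence lemma invoked in the second part of the proof can be written out as follows (my paraphrase of the standard result, not a line from the thread):
$$
\text{If } L(A) = V \text{ and } A \text{ has } n \text{ elements, then any set of } n+1 \text{ (or more) elements of } V \text{ is linearly dependent.}
$$
With ##m \gt n##, the set ##A'## would contain at least ##n+1## elements of ##V## and so could not be independent.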
 
  • #2
Hall said:
Homework Statement:: If a linear space V is finite dimensional then S, a subspace of V, is also finite-dimensional and dim S is less than or equal to dim V.
Relevant Equations:: Let A = {u_1, u_2, ... u_n} be a basis for V.

If a linear space ##V## is finite dimensional then ##S##, a subspace of ##V##, is also finite-dimensional and ##dim ~S \leq dim~V##.

Proof: Let ##A = \{u_1, u_2, \cdots, u_n\}## be a basis for ##V##. Then any element ##x## of ##V## can be represented as
$$
x = \sum_{i=1}^{n} c_i u_i
$$
Since ##S \subset V##, every element of ##S## can also be represented as a linear combination of the ##u_i##'s. As there are only finitely many ##u_i##'s, this implies that ##S## is finite-dimensional.
I agree with this, but why does this imply ##S## is finite dimensional? What's the definition of finite dimensional?

Hall said:
Let's say ##dim~S \gt dim~V##. Then, a basis for ##S## would look like
$$
A' = \{ v_1, v_2, \cdots v_m\}
$$
where ##m \gt n##. But then ##A'## would be an independent set of more than ##n## elements in ##V##, contradicting the theorem that "any set of ##(n+1)## elements in ##V## is dependent if ##L(A) = V## and the number of elements in ##A## is ##n##". Hence, ##dim~S \leq dim~V##.

This is okay, but how do you know that "independent in ##S##" implies "independent in ##V##"?

In fact, if you show this statement, then the result follows immediately. And, you'll have a very short proof.
 
  • #3
PeroK said:
I agree with this, but why does this imply S is finite dimensional?
Because it can be spanned by a finite set which is independent. But, yes, the problem is that linear combinations of the ##u_i##'s can go beyond ##S##.
PeroK said:
What's the definition of finite dimensional?
Having a basis which is finite (has a finite number of elements). Or, to put it more elaborately: if there exists a finite set which is independent and spans ##S##, then ##S## is said to be finite-dimensional.
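In symbols, that definition reads (a restatement of the sentence above, using ##L(\cdot)## for the span as elsewhere in the thread):
$$
S \text{ is finite-dimensional} \iff \text{there exists a finite independent set } B \subseteq S \text{ such that } L(B) = S.
$$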
 
  • #4
Hall said:
Because it can be spanned by a finite set which is independent. But, yes, the problem is that linear combinations of the ##u_i##'s can go beyond ##S##.
Okay, but one problem is that the vectors in the basis for ##V## may not be in the subspace ##S## at all. Maybe starting with a basis for ##V## is not the right approach?
 
  • #5
PeroK said:
This is okay, but how do you know that "independent in S" implies "independent in V"?
Yes, the implication will not always be true.
 
  • #6
Hall said:
Yes, the implication will not always be true.
It is true. That is the heart of the proof.
 
  • #7
PeroK said:
It is true. That is the heart of the proof.
Well, I mean a set is independent entirely on its own account, no matter whose subset it is. If a set is independent, then none of its elements can be represented as a linear combination of the other elements.
 
  • Like
Likes PeroK
  • #8
Hall said:
Well, I mean a set is independent entirely on its own account, no matter whose subset it is. If a set is independent, then none of its elements can be represented as a linear combination of the other elements.
Exactly, a linearly independent set in ##S## cannot be linearly dependent in ##V##. Can you construct a short proof from that?
 
  • #9
PeroK said:
Exactly, a linearly independent set in ##S## cannot be linearly dependent in ##V##. Can you construct a short proof from that?
Take an independent set in S that spans S; then this independent set would be a subset of a basis for V. Hence, the basis of V is at least as large as that of S.
 
  • #10
Hall said:
Take an independent set in S that spans S; then this independent set would be a subset of a basis for V. Hence, the basis of V is at least as large as that of S.
That feels a bit loose to me. It's not wrong, but let me show you the alternative:

Suppose the dimension of ##V## is ##n##. We cannot find more than ##n## linearly independent vectors in ##V##. Now, any linearly independent set in ##S## is linearly independent in ##V##. Therefore, we cannot find more than ##n## linearly independent vectors in ##S##. Hence, ##S## is finite dimensional and ##dim \ S \le n##.
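To spell out the step "independent in ##S## implies independent in ##V##" (a short sketch in my own words, not part of the post above): a dependence relation involves only the vectors, the scalars, and the zero vector, and all of these are the same whether we work in ##S## or in ##V##.
$$
v_1, \dots, v_k \in S \text{ independent in } S, \quad \sum_{i=1}^{k} a_i v_i = 0 \text{ in } V \;\Longrightarrow\; \sum_{i=1}^{k} a_i v_i = 0 \text{ in } S \;\Longrightarrow\; a_1 = \cdots = a_k = 0.
$$
So the ##v_i## are independent in ##V## as well.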
 
  • Like
Likes Hall
  • #11
Alternatively, you could do it by contradiction:

If we have a set of ##n + 1## linearly independent vectors in ##S##, then we have a set of ##n + 1## linearly independent vectors in ##V##. Which contradicts ##dim \ V = n##.

Therefore, ##S## is finite dimensional and ##dim \ S \le dim \ V##.
 
  • #12
PeroK said:
That feels a bit loose to me. It's not wrong, but let me show you the alternative:

Suppose the dimension of ##V## is ##n##. We cannot find more than ##n## linearly independent vectors in ##V##. Now, any linearly independent set in ##S## is linearly independent in ##V##. Therefore, we cannot find more than ##n## linearly independent vectors in ##S##. Hence, ##S## is finite dimensional and ##dim \ S \le n##.
That compelled me to yell out "That's elegant".
 
  • Like
Likes PeroK
  • #13
@PeroK Can you please say something about the first part of the question?
 
  • #14
Hall said:
@PeroK Can you please say something about the first part of the question?
What first part?
 
  • #15
PeroK said:
What first part?
Proving that S is finite-dimensional.
 
  • #16
Hall said:
Proving that S is finite-dimensional.
We already have. I made no assumption that S was finite dimensional. I simply looked for ##n + 1## independent vectors in ##S##. I avoided the potential tangle created by looking for a basis for ##S## or a set that spans ##S##. That was part of the elegant simplicity!
 
  • #17
PeroK said:
Therefore, we cannot find more than n linearly independent vectors in S. Hence, S is finite dimensional and dim S ≤ n.
Yes, we cannot find more than n independent vectors in S, but how do we ensure that a set of independent vectors in S would span S? We can find independent vectors, but assuming that they would span S is assuming that S is finite-dimensional.
 
  • #18
Hall said:
Yes, we cannot find more than n independent vectors in S, but how do we ensure that a set of independent vectors in S would span S? We can find independent vectors, but assuming that they would span S is assuming that S is finite-dimensional.
If you really wanted to, you could:

Assume ##S## is infinite dimensional. Therefore, we can find ##n + 1## linearly independent vectors in ##S##. Contradiction. Therefore, ##S## is finite dimensional.

But, it's not necessary to do that as a separate step.

The second point is really asking how we know that ##dim \ S## is well-defined. Perhaps ##S## doesn't have a basis? That's not something we need to prove here.
 
  • Like
Likes Hall
  • #19
Although, if you want a bit more practice, you could prove this:

Let ##V## be a finite-dimensional vector space and ##S## a subspace of ##V##. Prove that ##S## has a basis.
 
  • #20
PeroK said:
V be a finite-dimensional vector space and S a subspace of V. Prove that S has a basis
Suppose ##S## doesn't have a basis; that implies there doesn't exist any independent set in ##S## that spans it. However, any independent set in ##S## is also independent in ##V## and therefore forms a subset of a basis of ##V##. Now, this subset of a basis must span a part of ##V##; that is, the span of independent elements of ##S## forms a subspace of ##V##.

Take any independent set A (the number of elements in A being less than or equal to n) in S; then ##L(A) \subset S##. As no independent set in S can span S, ...

I have to think harder for that.
 
  • #21
Hall said:
Suppose ##S## doesn't have a basis; that implies there doesn't exist any independent set in ##S## that spans it. However, any independent set in ##S## is also independent in ##V## and therefore forms a subset of a basis of ##V##. Now, this subset of a basis must span a part of ##V##; that is, the span of independent elements of ##S## forms a subspace of ##V##.

Take any independent set A (the number of elements in A being less than or equal to n) in S; then ##L(A) \subset S##. As no independent set in S can span S, ...

I have to think harder for that.
I wouldn't worry about it for this problem.
 
  • #22
PeroK said:
I wouldn't worry about it for this problem.
But it's a good, brain-demanding exercise. Should I create a new thread? I have no problem continuing within this one.

Let's start with this one: what would a subspace look like if it didn't have a basis?
 
  • #23
Hall said:
But it's a good, brain-demanding exercise. Should I create a new thread? I have no problem continuing within this one.

Let's start with this one: what would a subspace look like if it didn't have a basis?
It's more or less the same argument. You pick ##u_1 \in S##. If ##span \{u_1 \} = S## we are done. If not, we pick ##u_2 \in S - span \{u_1 \}## and ##u_3 \in S - span \{u_1, u_2 \}## etc.

This generates a set of at most ##n## linearly independent vectors that must span ##S##. I.e. we must be able to generate a finite basis for ##S##.
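A slightly more formal version of that termination argument (my sketch, assuming ##S \neq \{0\}##; if ##S = \{0\}##, the empty set serves as a basis): by induction on ##k##,
$$
u_{k+1} \in S - span\{u_1, \dots, u_k\} \;\Longrightarrow\; \{u_1, \dots, u_{k+1}\} \text{ is linearly independent in } S, \text{ hence in } V,
$$
so ##k + 1 \le n##, and the process must stop at some ##k \le n## with ##span\{u_1, \dots, u_k\} = S##.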
 
  • #24
PeroK said:
generates a set of at most n linearly independent vectors that must span S. I.e. we must be able to generate a finite basis for
I was thinking along the same lines, because if any set of those n independent elements doesn't span S, then S won't be a subspace (however, I myself have some doubt regarding "then S won't be a subspace").
 
  • #25
Hall said:
I was thinking along the same lines, because if any set of those n independent elements doesn't span S, then S won't be a subspace (however, I myself have some doubt regarding "then S won't be a subspace").
##S## being a subspace means that for any set of vectors ##u_i \in S##, we have ##span \{u_i \} \subseteq S##. That follows directly from the closure of addition and scalar multiplication on ##S##.
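In symbols, that closure argument is just (a one-line sketch):
$$
u_1, \dots, u_k \in S \;\Longrightarrow\; c_1 u_1, \dots, c_k u_k \in S \;\Longrightarrow\; c_1 u_1 + \cdots + c_k u_k \in S \quad \text{for all scalars } c_1, \dots, c_k,
$$
which is exactly the statement ##span\{u_1, \dots, u_k\} \subseteq S##.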
 
  • #26
PeroK said:
##S## being a subspace means that for any set of vectors ##u_i \in S##, we have ##span \{u_i \} \subseteq S##. That follows directly from the closure of addition and scalar multiplication on ##S##.
Got that.

Thanks for your help today.
 
  • Like
Likes PeroK
  • #27
How about using the fact that a basis for a subspace S can be extended to a basis for the whole space V? Then if S has an infinite basis...

EDIT: Though it does not seem completely trivial to prove that a subset of a finite set is necessarily finite. It's "Obviously" true, but I don't see an easy way to prove it.
 
  • Skeptical
Likes PeroK
  • #28
WWGD said:
Though it does not seem completely trivial to prove that a subset of a finite set is necessarily finite. It's "Obviously" true, but I don't see an easy way to prove it.

If you're going to worry about this, then you might as well worry about stuff like whether the natural numbers are well defined, and then you're never going to get to do any linear algebra.
 
  • Like
Likes WWGD
  • #29
WWGD said:
How about using the fact that a basis for a subspace S can be extended to a basis for the whole space V? Then if S has an infinite basis...

EDIT: Though it does not seem completely trivial to prove that a subset of a finite set is necessarily finite. It's "Obviously" true, but I don't see an easy way to prove it.
Just what are you skeptical about @PeroK? That a basis can be extended?
 
  • #30
WWGD said:
How about using the fact that a basis for a subspace S can be extended to a basis for the whole space V? Then if S has an infinite basis...

EDIT: Though it does not seem completely trivial to prove that a subset of a finite set is necessarily finite. It's "Obviously" true, but I don't see an easy way to prove it.
This is already taken care of in my approach in this thread. And I've already shown why starting with a basis for ##S## is an unnecessarily clumsy approach. This was, in fact, one of the ideas that the OP tried already.

Btw:

Let ##V## be a finite set of order ##n## and ##S \subseteq V##. If ##S## is infinite, then we can find ##n+1## distinct members of ##S##, hence ##n + 1## distinct members of ##V##. Which contradicts ##|V| = n##. QED
 
  • #31
PeroK said:
Okay, but one problem is that the vectors in the basis for ##V## may not be in the subspace ##S## at all. Maybe starting with a basis for ##V## is not the right approach?
You argued in #4 that starting with a basis for ##V## is not the right approach. Not that starting with a basis for ##S##, which is what I suggested, is not the right approach. At least I did not see it elsewhere. But yes, I guess your approach is then to take a nontrivial combination of n+1 vectors in S, which are necessarily dependent, as they live in V, so we backtrack to S and get a nontrivial combination that equals 0.
 
  • #32
WWGD said:
You argued in #4 that starting with a basis for ##V## is not the right approach. Not that starting with a basis for ##S##, which is what I suggested, is not the right approach.
And I argued in post #16 that starting with a basis for ##S## was not the right approach either:

I made no assumption that S was finite dimensional. I simply looked for ##n + 1## independent vectors in ##S##. I avoided the potential tangle created by looking for a basis for ##S## or a set that spans ##S##.
 
  • #33
... the tangle is that you have to deal with the possibility that the basis or spanning set is infinite. And, given that we need to prove that ##S## is finite dimensional, this makes things unnecessarily clumsy.

Whereas, simply taking a finite set of ##n + 1## independent vectors in ##S## is simple and elegant, as has already been established!
 
  • #34
PeroK said:
And I argued in post #16 that starting with a basis for ##S## was not the right approach either:

I made no assumption that S was finite dimensional. I simply looked for ##n + 1## independent vectors in ##S##. I avoided the potential tangle created by looking for a basis for ##S## or a set that spans ##S##.
And I agree with your argument on cardinality. I offered a similar one a while back on Stack Exchange, was downvoted, and was told by hard-core set theorists that I must use induction on the length of the segments ##[n] := \{1, 2, \dots, n\}##. I bought into it since they know way more about set theory than I do. Not sure what their quibble was. Not disagreeing with your argument, just recounting what some expect as a proof.
 
  • #35
WWGD said:
And I agree with your argument on cardinality. I offered a similar one a while back on Stack Exchange, was downvoted, and was told by hard-core set theorists that I must use induction on the length of the segments ##[n] := \{1, 2, \dots, n\}##. I bought into it since they know way more about set theory than I do. Not sure what their quibble was. Not disagreeing with your argument, just recounting what some expect as a proof.
This argument arises on here as well from time to time. My position is simple: in general a university maths degree does not begin with a year of hard-core set theory. We have to assume, therefore, that a subject like linear algebra can be taught without descending into set theory at every turn.

In any case, the OP is trying to learn Linear Algebra, not the intricacies of the foundations of mathematics.
 

1. What is the relationship between the finite dimensionality of V and S?

The proof shows that if V is finite-dimensional, then S must also be finite-dimensional. This means that S has a finite basis, and its dimension is at most the dimension of V.

2. How does this proof impact the study of linear algebra?

This proof is an important result in linear algebra because it helps us understand the properties of finite-dimensional vector spaces and their subspaces. It also allows us to draw conclusions about the dimension of certain spaces based on the dimension of others.

3. Can you provide an example of a finite dimensional vector space and its corresponding subspace?

Yes, consider the vector space ##\mathbb{R}^3##, which consists of all ordered triples of real numbers. The set of all triples of the form ##(x, y, 0)## is a two-dimensional subspace of ##\mathbb{R}^3##: it can be pictured as a plane within ##\mathbb{R}^3## and is isomorphic to ##\mathbb{R}^2##.

4. How does this proof relate to the concept of linear independence?

This proof is closely related to the concept of linear independence. It relies on the fact that a set of vectors that is linearly independent in the subspace S is also linearly independent in V, and that a space of dimension n contains at most n linearly independent vectors. Adding more vectors to a linearly independent set in such a space would eventually create a linearly dependent set.

5. Is this proof applicable to all types of vector spaces?

No, this proof is specific to finite dimensional vector spaces. It does not apply to infinite dimensional vector spaces, which have different properties and require different proofs.
