Proving Linear Independence of Av1, ..., Avk and Conditions for Basis in Rm

In summary, the conversation discussed the concept of linear independence and how it applies to a set of vectors mapped by a matrix. It was established that for a set of vectors to be linearly independent, the only solution to the equation c1v1 + ... + cnvn = 0 must be c1 = c2 = ... = cn = 0. This was illustrated with two vectors in R2 for which c1 = c2 = 0 was not the only solution, so the vectors were not linearly independent.
  • #1
Mark53

Homework Statement


1. Suppose {v1, . . . , vk} is a linearly independent set of vectors in Rn and suppose A is an m × n matrix such that Nul A = {0}.
(a) Prove that {Av1, . . . , Avk} is linearly independent.
(b) Suppose that {v1, . . . , vk} is actually a basis for Rn. Under what conditions on m and n will {Av1, . . . , Avk} be a basis for Rm?

The Attempt at a Solution



a)
we know that c1v1 + ... + cnvn = 0 as it is linearly independent

suppose that

c1Av1 + ... + cnAvn = 0 and that A is an invertible matrix since the null space is 0, which means we can multiply both sides by A^-1, which gives c1v1 + ... + cnvn = 0, which means that there is a trivial solution and that it is linearly independent

Is this correct?

b)
I'm unsure how to get started on this question

Thanks for any help
 
  • #2
Is ##m=k##? And how can ##k## vectors be a basis of an ##n##-dimensional vector space ##\mathbb{R}^n##?
The definition of linear independence goes as follows:
##\{v_1,\ldots , v_k\}## are linearly independent if ##(c_1v_1+\ldots+c_kv_k=0 \Longrightarrow c_1=\ldots=c_k=0)##.
So you have to show this implication; the equation ##c_1v_1+\ldots+c_kv_k=0## alone doesn't mean anything.
Why should ##A## be invertible? Is ##m=n##? Use the linearity of ##A## instead.
 
  • #3
fresh_42 said:
Use the linearity of ##A## instead.
What does the linearity of ##A## mean?
 
  • #4
Linearity of a mapping: ##A## is a linear mapping ##A\, : \,\mathbb{R}^n \longrightarrow \mathbb{R}^m##, that is, ##A(v+w) = Av + Aw## and ##A(cv)=cA(v)## for all ##c \in \mathbb{R}##.
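Here is a minimal numerical illustration of these two identities (a sketch; the matrix and vectors are arbitrary random data, not taken from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # an arbitrary linear map R^2 -> R^3
v = rng.standard_normal(2)
w = rng.standard_normal(2)
c = 2.5

# The two defining identities of linearity:
print(np.allclose(A @ (v + w), A @ v + A @ w))  # A(v + w) = Av + Aw -> True
print(np.allclose(A @ (c * v), c * (A @ v)))    # A(cv) = c A(v)    -> True
```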
 
  • #5
Mark53 said:
we know that c1v1 + ... + cnvn = 0 as it is linearly independent
You can still write exactly the same equation if ##\{v_1, v_2, \dots, v_n\}## is a linearly dependent set. So what is it that distinguishes a linearly independent set from a linearly dependent set?

Mark53 said:
suppose that c1Av1 + ... + cnAvn = 0 and that A is an invertible matrix
It's given that A is m x n, so you can't assume that A is invertible.
 
  • #6
Mark44 said:
You can still write exactly the same equation if ##\{v_1, v_2, \dots, v_n\}## is a linearly dependent set. So what is it that distinguishes a linearly independent set from a linearly dependent set?

it is linearly independent if ##c_1 = c_2 = \dots = c_n = 0##
Mark44 said:
It's given that A is m x n, so you can't assume that A is invertible.

Then how would I go about showing that it is independent?
 
  • #7
Mark53 said:
it is linearly independent if ##c_1 = c_2 = \dots = c_n = 0##

Then how would I go about showing that it is independent?

We cannot answer that without giving away the whole solution.
 
  • #8
Mark53 said:
it is linearly independent if ##c_1 = c_2 = \dots = c_n = 0##
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?
 
  • #9
Mark44 said:
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?
yes as there is only the trivial solution
 
  • #10
Mark53 said:
yes as there is only the trivial solution
And what about ##2 \cdot v_1 + (-1)\cdot v_2##?
 
  • #11
fresh_42 said:
And what about ##2 \cdot v_1 + (-1)\cdot v_2##?
then adding the vectors would be 0 but this shouldn't affect the independence
 
  • #12
You should really review the concept of linearity, for vectors as well as for mappings. Two vectors are linearly independent if they do not lie on the same line through the origin. Does this hold for ##(1,1)## and ##(2,2)##? Three vectors are linearly independent if they do not all lie on the same line or in the same plane. The formal notation of this is
fresh_42 said:
The definition of linear independence goes as follows:
##\{v_1,\ldots , v_k\}## are linearly independent if ##(c_1v_1+\ldots+c_kv_k=0 \Longrightarrow c_1=\ldots=c_k=0)##.
So you have to show this implication; the equation ##c_1v_1+\ldots+c_kv_k=0## alone doesn't mean anything.
So the implication is the crucial point. Choosing ##c_i=0## is always a solution. What sense would it make to call vectors linearly independent then? The point is, it has to be the only solution, for otherwise some ##c_i \neq 0## and we could write ##v_i= -\frac{c_1}{c_i}v_1 - \ldots -\frac{c_{i-1}}{c_i}v_{i-1} -\frac{c_{i+1}}{c_i}v_{i+1} - \ldots -\frac{c_k}{c_i}v_k##, which is clearly a linear dependence.
 
  • #13
Mark44 said:
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?

Mark53 said:
yes as there is only the trivial solution
No. ##c_1 = c_2 = 0## is a solution, but it is far from being the only solution. As fresh_42 explains, my two vectors are NOT linearly independent.
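One way to expose the nontrivial solutions numerically (a minimal sketch using NumPy's SVD; any null-space routine would do the same job):

```python
import numpy as np

# Columns are the vectors v1 = <1, 1> and v2 = <2, 2> from post #8.
V = np.array([[1.0, 2.0],
              [1.0, 2.0]])

# Solutions of c1*v1 + c2*v2 = 0 form the null space of V; the rows of
# vt belonging to (near-)zero singular values span that null space.
_, s, vt = np.linalg.svd(V)
print(vt[s < 1e-12])  # proportional to (2, -1), matching 2*v1 - v2 = 0
```

The printed coefficient vector is exactly (up to scaling) the combination ##2 \cdot v_1 + (-1)\cdot v_2 = 0## from post #10.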
 
  • #14
Mark44 said:
No. ##c_1 = c_2 = 0## is a solution, but it is far from being the only solution. As fresh_42 explains, my two vectors are NOT linearly independent.

If Null A = {0} that means that A would be linearly independent as there are no free variables and it has a trivial solution

Given that {v1, ..., vk} is linearly independent

Then {Av1, ..., Avk} would also be linearly independent as the matrix A is linearly independent

Would this be correct now or am I missing something?
 
  • #15
You cannot say "##A## is linearly independent" in this context. What would that mean? You might say that the column or row vectors are linearly independent, but not the matrix as a whole. Actually ##Null(A)=\{0\}## means that ##A## is injective, or "into", or an embedding, in your case ##A\, : \,\mathbb{R}^n \hookrightarrow \mathbb{R}^m##. So the column vectors of ##A## are linearly independent (whereas the row vectors need not be!), but do you know why? Why are there no free variables? What are the variables? A trivial solution ##A\cdot \vec{0} = \vec{0}## always exists, so this cannot be a criterion.
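One way to see the statement about the columns: if the columns ##a_1, \ldots, a_n## of ##A## satisfied ##x_1a_1 + \ldots + x_na_n = 0## with some ##x_j \neq 0##, then ##Ax = \vec{0}## for ##x = (x_1, \ldots, x_n) \neq \vec{0}##, contradicting ##Null(A) = \{0\}##.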

Start at the beginning instead. You want to show that ##\{Av_1,\ldots ,Av_k\}## are linearly independent vectors, given that ##\{v_1,\ldots ,v_k\}## are linearly independent. What do you have to show then? Look up the definition of linear independence in #2.
In the next steps, you should use the linearity of ##A## (see the definition in #4), then the condition ##Null(A)=\{0\}##, and at last the fact that the ##v_i## are linearly independent. Try to proceed along these four steps.
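For reference, here is how the four steps assemble into a sketch: suppose ##c_1Av_1 + \ldots + c_kAv_k = 0##. By linearity this reads ##A(c_1v_1 + \ldots + c_kv_k) = 0##, so ##c_1v_1 + \ldots + c_kv_k \in Null(A) = \{0\}##, i.e. ##c_1v_1 + \ldots + c_kv_k = 0##. Since the ##v_i## are linearly independent, it follows that ##c_1 = \ldots = c_k = 0##, which is exactly the required implication.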
 
  • #16
fresh_42 said:
Start at the beginning instead. You want to show that ##\{Av_1,\ldots ,Av_k\}## are linearly independent vectors, given that ##\{v_1,\ldots ,v_k\}## are linearly independent. What do you have to show then? Look up the definition of linear independence in #2.
In the next steps, you should use the linearity of ##A## (see the definition in #4), then the condition ##Null(A)=\{0\}##, and at last the fact that the ##v_i## are linearly independent. Try to proceed along these four steps.

Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##

to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##

Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent

Is this better now?
 
  • #17
Mark53 said:
Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##
As already stated, even if ##\{v_1,\ldots ,v_k\}## is linearly dependent, the equation ##c_1v_1+\ldots +c_kv_k=0## still has a solution of ##c_1=\ldots=c_k=0##. There's a subtlety here that I don't think you are getting.
Mark53 said:
to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##
No. You can always set up this equation, whether or not ##\{Av_1, \dots, Av_k\}## is a linearly independent set.
Mark53 said:
Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent

Is this better now?
Look at the example I gave in post #8. You concluded incorrectly that <1, 1> and <2, 2> were linearly independent merely because I showed that ##c_1 = c_2 = 0## is a solution of ##c_1<1, 1> + c_2<2, 2> = 0##.

What is the definition of linear independence that you are using?
 
  • #18
Mark53 said:
Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##
Almost. Cancel the "which". Linear independence means there is only the solution ##c_1= \ldots = c_k =0##, no others. To show it, we have to rule out the possibility of other solutions. In Mark's example ##2 \cdot (1,1) + (-1) \cdot (2,2) = 0## is another solution, so they cannot be linearly independent. We could also write this equation as ##(2,2) = 2 \cdot (1,1)##, i.e. one vector is twice the other, which is a linear dependence.
E.g. ##(1,2)## and ##(2,3)## are linearly independent. They cannot be combined into the zero vector other than by multiplying both with zero. You might try it.
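Setting up the system explicitly as a check: ##c_1(1,2) + c_2(2,3) = (c_1 + 2c_2,\, 2c_1 + 3c_2) = (0,0)## forces ##c_1 = -2c_2## from the first component; the second then gives ##-4c_2 + 3c_2 = -c_2 = 0##, so ##c_2 = 0## and hence ##c_1 = 0##.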
Mark53 said:
to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##
We need to show that ##c_1Av_1+\ldots +c_kAv_k=0## implies ##c_1= \ldots = c_k =0##, that it is forced (see above).
Mark53 said:
Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent ...
Why?
Mark53 said:
... and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent
Yes, but some techniques would be helpful here in order to be convincing. Which properties do you apply, and when?
Mark53 said:
Is this better now?
Don't misunderstand me. I don't want to torture you, and you don't have to please me. I only think it is important to gain some certainty with these concepts, as there might be more ahead.
 

1. What is the definition of linear independence?

Linear independence refers to a set of vectors in a vector space, where none of the vectors can be written as a linear combination of the other vectors. In other words, no vector in the set can be expressed as a combination of the others using scalar multiplication and vector addition.

2. How do you prove linear independence of a set of vectors?

To prove linear independence, you can use the definition directly: set up the equation c1v1 + ... + ckvk = 0, where the coefficients c1, ..., ck are the unknowns. Then solve the resulting system of equations and show that the only solution is the one with all coefficients equal to 0. This demonstrates that the vectors are linearly independent.
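As a sketch of this check in code (assuming NumPy; the three vectors are made-up example data), the set is linearly independent exactly when the matrix with the vectors as columns has full column rank:

```python
import numpy as np

# Hypothetical example vectors in R^3.
v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 0.0, 1.0])

# Full column rank means c1*v1 + c2*v2 + c3*v3 = 0 has only the
# trivial solution c1 = c2 = c3 = 0.
V = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(V) == V.shape[1])  # True -> independent
```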

3. What are the conditions for a set of vectors to form a basis in Rm?

The conditions for a set of vectors to form a basis of Rm are that the vectors must be linearly independent and must span the entire vector space. Spanning means that every vector in Rm can be written as a linear combination of the vectors in the set; independence means that no vector in the set can be expressed as a linear combination of the others. Together these force the set to contain exactly m vectors.
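A hedged sketch of this test in code (is_basis is a hypothetical helper, assuming NumPy): k vectors form a basis of Rm exactly when k = m and the vectors are linearly independent, since m independent vectors in Rm automatically span it.

```python
import numpy as np

def is_basis(vectors):
    # k vectors form a basis of R^m exactly when k == m and they are
    # linearly independent (independence then implies spanning).
    V = np.column_stack(vectors)
    m, k = V.shape
    return k == m and np.linalg.matrix_rank(V) == m

print(is_basis([np.array([1.0, 1.0]), np.array([1.0, 2.0])]))  # True
print(is_basis([np.array([1.0, 1.0]), np.array([2.0, 2.0])]))  # False
```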

4. What is the importance of proving linear independence and conditions for basis?

Proving linear independence and conditions for basis is important because it allows us to determine if a set of vectors can be used to represent any vector in a vector space. This is crucial in various applications, such as solving systems of linear equations, finding eigenvalues and eigenvectors, and performing transformations in linear algebra.

5. Can a set of vectors be linearly independent but not form a basis in Rm?

Yes, it is possible for a set of vectors to be linearly independent but not form a basis in Rm. This can happen if the set of vectors does not span the entire vector space, meaning that there are vectors in Rm that cannot be represented by a linear combination of the set of vectors. In this case, the set of vectors is linearly independent but does not form a basis in Rm.
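For instance, ##e_1 = (1,0,0)## and ##e_2 = (0,1,0)## are linearly independent in ##\mathbb{R}^3##, but every combination ##c_1e_1 + c_2e_2 = (c_1, c_2, 0)## has third coordinate 0, so ##(0,0,1)## lies outside their span and the pair is not a basis of ##\mathbb{R}^3##.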
