Proving Linear Independence of Av1, ..., Avk and Conditions for Basis in Rm

AI Thread Summary
The discussion focuses on proving the linear independence of the set {Av1, ..., Avk} given that {v1, ..., vk} is linearly independent and that the null space of matrix A is {0}. It is established that if A is injective (implying no free variables), then the transformation of linearly independent vectors through A will also yield a linearly independent set. The participants emphasize the need to demonstrate that the only solution to the equation c1Av1 + ... + ckAvk = 0 is c1 = ... = ck = 0. Additionally, there is a query about the conditions under which {Av1, ..., Avk} can form a basis for Rm when {v1, ..., vk} is a basis for Rn, highlighting the relationship between the dimensions m and n. Overall, the conversation aims to clarify the implications of linear independence in the context of matrix transformations.
Mark53

Homework Statement


1. Suppose {v1, . . . , vk} is a linearly independent set of vectors in Rn and suppose A is an m × n matrix such that Nul A = {0}.
(a) Prove that {Av1, . . . , Avk} is linearly independent.
(b) Suppose that {v1, . . . , vk} is actually a basis for Rn. Under what conditions on m and n will {Av1, . . . , Avk} be a basis for Rm?

The Attempt at a Solution



a)
we know that c1v1+...+cnvn=0 as it is linearly independent

suppose that

c1Av1+...+cnAvn=0 and that A is an invertible matrix since the null space is 0, which means we can multiply both sides by A^-1, which gives c1v1+...+cnvn=0, which means that there is a trivial solution and that it is linearly independent

Is this correct?

b)
I'm unsure on how to get started on this question

Thanks for any help
 
Mark53 said:

Homework Statement


1. Suppose {v1, . . . , vk} is a linearly independent set of vectors in Rn and suppose A is an m × n matrix such that Nul A = {0}.
(a) Prove that {Av1, . . . , Avk} is linearly independent.
(b) Suppose that {v1, . . . , vk} is actually a basis for Rn. Under what conditions on m and n will {Av1, . . . , Avk} be a basis for Rm?

The Attempt at a Solution



a)
we know that c1v1+...+cnvn=0 as it is linearly independent

suppose that

c1Av1+...+cnAvn=0 and that A is an invertible matrix since the null space is 0, which means we can multiply both sides by A^-1, which gives c1v1+...+cnvn=0, which means that there is a trivial solution and that it is linearly independent

Is this correct?

b)
I'm unsure on how to get started on this question

Thanks for any help
Is ##m=k##? And how can ##k## vectors be a basis of an ##n-##dimensional vector space ##\mathbb{R}^n##?
The definition of linear independence goes as follows:
##\{v_1,\ldots , v_k\}## are linearly independent if ##(c_1v_1+\ldots+c_kv_k=0 \Longrightarrow c_1=\ldots=c_k=0)##
So you have to show this implication; the equation ##c_1v_1+\ldots+c_kv_k=0## alone doesn't mean anything.
Why should ##A## be invertible? Is ##m=n##? Use the linearity of ##A## instead.
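As a quick numerical companion to this definition (a sketch of my own in Python/numpy, not part of the thread, with a hypothetical helper name): vectors ##v_1,\ldots,v_k## are linearly independent exactly when the matrix with those vectors as columns has rank ##k##.

```python
import numpy as np

def is_linearly_independent(vectors):
    # c_1 v_1 + ... + c_k v_k = 0 forces all c_i = 0  <=>  rank of [v_1 ... v_k] is k
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

print(is_linearly_independent([np.array([1, 0, 2]), np.array([0, 1, 1])]))                       # True
print(is_linearly_independent([np.array([1, 0, 2]), np.array([0, 1, 1]), np.array([1, 1, 3])]))  # False
```

The third vector in the second call is the sum of the first two, so the rank drops below 3 and the test correctly reports dependence.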
 
fresh_42 said:
Is ##m=k##? And how can ##k## vectors be a basis of an ##n-##dimensional vector space ##\mathbb{R}^n##?
The definition of linear independence goes as follows:
##\{v_1,\ldots , v_k\}## are linearly independent if ##(c_1v_1+\ldots+c_kv_k=0 \Longrightarrow c_1=\ldots=c_k=0)##
So you have to show this implication; the equation ##c_1v_1+\ldots+c_kv_k=0## alone doesn't mean anything.
Why should ##A## be invertible? Is ##m=n##? Use the linearity of ##A## instead.
What does the linearity of A mean?
 
Linearity of a mapping, and ##A## is a linear mapping ##A\, : \,\mathbb{R}^n \longrightarrow \mathbb{R}^m##, that is ##A(v+w) = Av + Aw## and ##A(cv)=cA(v)## for ##c \in \mathbb{R}##.
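These two identities can be checked numerically for any concrete matrix (a minimal sketch; the matrix entries below are arbitrary, not from the problem):

```python
import numpy as np

# An arbitrary 3x2 matrix acting as a linear map R^2 -> R^3.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 1.0]])
v = np.array([1.0, -2.0])
w = np.array([4.0, 0.5])
c = 3.0

print(np.allclose(A @ (v + w), A @ v + A @ w))  # additivity: A(v+w) = Av + Aw
print(np.allclose(A @ (c * v), c * (A @ v)))    # homogeneity: A(cv) = c(Av)
```

Both checks print True: matrix-vector multiplication is exactly the linearity used below.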
 
Mark53 said:
we know that c1V1+...+CnVn=0 as it is linearly independent
You can still write exactly the same equation if ##\{v_1, v_2, \dots, v_n\}## is a linearly dependent set. So what is it that distinguishes a linearly independent set from a linearly dependent set?

Mark53 said:
suppose that C1AV1+...+CnAVn=0 and that A is an invertible matrix
It's given that A is m x n, so you can't assume that A is invertible.
 
Mark44 said:
You can still write exactly the same equation if ##\{v_1, v_2, \dots, v_n\}## is a linearly dependent set. So what is it that distinguishes a linearly independent set from a linearly dependent set?

it is linearly independent if ##\{c_1, c_2, \dots, c_n=0\}##
Mark44 said:
It's given that A is m x n, so you can't assume that A is invertible.

Then how would i go about showing that it is independent?
 
Mark53 said:
it is linearly independent if ##\{c_1, c_2, \dots, c_n=0\}##

Then how would i go about showing that it is independent?

We cannot answer that without giving away the whole solution.
 
Mark53 said:
it is linearly independent if ##\{c_1, c_2, \dots, c_n=0\}##
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?
 
Mark44 said:
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?
yes as there is only the trivial solution
 
  • #10
Mark53 said:
yes as there is only the trivial solution
And what about ##2 \cdot v_1 + (-1)\cdot v_2##?
 
  • #11
fresh_42 said:
And what about ##2 \cdot v_1 + (-1)\cdot v_2##?
then adding the vectors would be 0, but this shouldn't affect the independence
 
  • #12
You should really review the concept of linearity, of vectors as well as of mappings. Two vectors are linearly independent if they don't point in the same direction. Does this hold for ##(1,1)## and ##(2,2)##? Three vectors are linearly independent if they do not all lie on the same line or in the same plane. The formal notation of this is
fresh_42 said:
The definition of linear independence goes as follows:
##\{v_1,\ldots , v_k\}## are linearly independent if ##(c_1v_1+\ldots+c_kv_k=0 \Longrightarrow c_1=\ldots=c_k=0)##
So you have to show this implication; the equation ##c_1v_1+\ldots+c_kv_k=0## alone doesn't mean anything.
So the implication is the crucial point. Choosing ##c_i=0## is always a solution. What sense would it make to call vectors linearly independent then? The point is, it has to be the only solution, for otherwise some ##c_i \neq 0## would let us write ##v_i= -\frac{c_1}{c_i}v_1 - \ldots - \frac{c_{i-1}}{c_i}v_{i-1} - \frac{c_{i+1}}{c_i}v_{i+1} - \ldots - \frac{c_k}{c_i}v_k##, which is clearly a linear dependence.
 
  • #13
Mark44 said:
Consider these vectors in ##\mathbb{R}^2##: ##v_1 = <1, 1>, v_2 = <2, 2>##
Suppose that ##c_1v_1 + c_2v_2 = 0##. Clearly ##c_1 = c_2 = 0## is a solution. Does it follow that the two vectors are linearly independent?

Mark53 said:
yes as there is only the trivial solution
No. ##c_1 = c_2 = 0## is a solution, but it is far from being the only solution. As fresh_42 explains, my two vectors are NOT linearly independent.
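Concretely, with these two vectors as columns (a numpy check of my own, not from the post):

```python
import numpy as np

M = np.column_stack([np.array([1, 1]), np.array([2, 2])])
print(np.linalg.matrix_rank(M))   # 1, not 2: the columns are linearly dependent
print(M @ np.array([2, -1]))      # [0 0]: the nontrivial solution 2*v1 - v2 = 0
```

The nonzero coefficient vector ##(2, -1)## is exactly the combination fresh_42 pointed out.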
 
  • #14
Mark44 said:
No. ##c_1 = c_2 = 0## is a solution, but it is far from being the only solution. As fresh_42 explains, my two vectors are NOT linearly independent.

If Null A = {0} that means that A would be Linearly independent as there are no free variables and it has a trivial solution

Given that {v1,...;,vk} is linearly independent

Then {AV1,...AVK} would also be linearly independent as the matrix A is linearly independent

Would this be correct now or am I missing something?
 
  • #15
You cannot say "##A## is linearly independent" in this context. What does it mean? You might say that the column or row vectors are linearly independent, but not the matrix as a whole. Actually ##Null(A)=\{0\}## means that ##A## is injective, or "into", or an embedding, in your case ##A\, : \,\mathbb{R}^n \hookrightarrow \mathbb{R}^m##. So the column vectors of ##A## are linearly independent (whereas the row vectors need not be!), but do you know why? Why are there no free variables? What are the variables? A trivial solution ##A\cdot \vec{0} = \vec{0}## always exists, so this cannot be a criterion.

Start at the beginning instead. You want to show that ##\{Av_1,\ldots ,Av_k\}## are linearly independent vectors, given that ##\{v_1,\ldots ,v_k\}## are linearly independent. What do you have to show then? Look up the definition of linear independence in #2.
In the next steps, you should use the linearity of ##A## (see the definition in #4), then the condition ##Null(A)=\{0\}##, and at last the fact that the ##v_i## are linearly independent. Try to proceed along these four steps.
 
  • #16
fresh_42 said:
You cannot say "##A## is linearly independent" in this context. What does it mean? You might say that the column or row vectors are linearly independent, but not the matrix as a whole. Actually ##Null(A)=\{0\}## means that ##A## is injective, or "into", or an embedding, in your case ##A\, : \,\mathbb{R}^n \hookrightarrow \mathbb{R}^m##. So the column vectors of ##A## are linearly independent (whereas the row vectors need not be!), but do you know why? Why are there no free variables? What are the variables? A trivial solution ##A\cdot \vec{0} = \vec{0}## always exists, so this cannot be a criterion.

Start at the beginning instead. You want to show that ##\{Av_1,\ldots ,Av_k\}## are linearly independent vectors, given that ##\{v_1,\ldots ,v_k\}## are linearly independent. What do you have to show then? Look up the definition of linear independence in #2.
In the next steps, you should use the linearity of ##A## (see the definition in #4), then the condition ##Null(A)=\{0\}##, and at last the fact that the ##v_i## are linearly independent. Try to proceed along these four steps.

Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##

to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##

Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent

Is this better now?
 
  • #17
Mark53 said:
Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##
As already stated, even if ##\{v_1,\ldots ,v_k\}## is linearly dependent, the equation ##c_1v_1+\ldots +c_kv_k=0## still has a solution of ##c_1=\ldots=c_k=0##. There's a subtlety here that I don't think you are getting.
Mark53 said:
to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##
No. You can always set up this equation, whether or not ##\{Av_1, \dots, Av_k\}## is a linearly independent set.
Mark53 said:
Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent

Is this better now?
Look at the example I gave in post #8. You concluded incorrectly that <1, 1> and <2, 2> were linearly independent merely because I showed that if c1<1, 1> + c2<2, 2> = 0, then ##c_1 = c_2 = 0## is a solution.

What is the definition of linear independence that you are using?
 
  • #18
Mark53 said:
Given that ##\{v_1,\ldots ,v_k\}## is linearly independent it means that ##c_1v_1+\ldots +c_kv_k=0## which implies that ##c_1=\ldots=c_k=0##
Almost. Cancel the "which". Linear independence means there is only the solution ##c_1= \ldots = c_k =0##, no others. To show it, we have to rule out the possibility of other solutions. In Mark44's example ##2 \cdot (1,1) + (-1) \cdot (2,2) = 0## is another solution, so they cannot be linearly independent. We could also write this equation as ##(2,2) = 2 \cdot (1,1)##, i.e. one vector is twice the other, which is a linear dependency.
E.g. ##(1,2)## and ##(2,3)## are linearly independent. They cannot be combined to give the zero vector other than by multiplying both with zeros. You might try it.
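For ##(1,2)## and ##(2,3)## the determinant is nonzero, so the homogeneous system has only the trivial solution. A quick numpy check (my own sketch, not part of the post):

```python
import numpy as np

# Solve c1*(1,2) + c2*(2,3) = (0,0) for (c1, c2).
M = np.column_stack([np.array([1.0, 2.0]), np.array([2.0, 3.0])])
print(np.isclose(np.linalg.det(M), -1.0))  # True: det is -1, so M is nonsingular
c = np.linalg.solve(M, np.zeros(2))
print(np.allclose(c, 0))                   # True: only the trivial solution exists
```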
Mark53 said:
to show that ##\{Av_1,\ldots ,Av_k\}## is linearly independent we need to show that ##c_1Av_1+\ldots +c_kAv_k=0##
We need to show that ##c_1Av_1+\ldots +c_kAv_k=0## implies ##c_1= \ldots = c_k =0##, that it is forced (see above).
Mark53 said:
Given that ##Null(A)=\{0\}## it means that the column vectors are linearly independent ...
Why?
Mark53 said:
... and given that we know that the ##V_i## are linearly independent then ##c_1=\ldots=c_k=0## which means that ##\{Av_1,\ldots ,Av_k\}## is also linearly independent
Yes, but some techniques would be helpful here in order to be convincing. Which properties do you apply when?
Mark53 said:
Is this better now?
Don't misunderstand me. I don't want to torture you, nor do you have to please me. I only think it is important to gain some certainty with these concepts, as there might be more ahead.
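For completeness, the four steps outlined in #15 assemble into the following chain (a sketch in LaTeX of the intended argument, with a closing remark on part (b)):

```latex
% Suppose  c_1 Av_1 + \ldots + c_k Av_k = 0.  Then:
\begin{align*}
0 &= c_1 Av_1 + \ldots + c_k Av_k
   = A(c_1 v_1 + \ldots + c_k v_k)          && \text{linearity of } A\\
&\Longrightarrow\ c_1 v_1 + \ldots + c_k v_k \in \operatorname{Nul} A = \{0\}
                                            && \text{definition of the null space}\\
&\Longrightarrow\ c_1 v_1 + \ldots + c_k v_k = 0\\
&\Longrightarrow\ c_1 = \ldots = c_k = 0    && \text{independence of } v_1, \ldots, v_k
\end{align*}
% Hence \{Av_1, \ldots, Av_k\} is linearly independent.
% For (b): a basis of R^n has k = n vectors, so the n independent images Av_i
% can form a basis of R^m only when m = n.
```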
 