Are Linear Transformations of Linearly Dependent Sets Also Linearly Dependent?

  • Thread starter: Mola
  • Tags: Linearly, Sets
Mola
If A is a 3x3 Matrix and {v1, v2, v3} is a linearly dependent set of vectors in R^3, then {Av1, Av2, Av3} is also a linearly dependent set?

Is this true? Can someone please explain why or why not?

What I think: I think it is true because I read that a linear transformation preserves the operations of vector addition and scalar multiplication.
 
If v1, v2, v3 are linearly dependent, you can find constants a1, a2, a3, not all 0, such that
##a_1v_1 + a_2v_2 + a_3v_3 = 0##
Now, left-multiplying this by A, you get:
##A(a_1v_1 + a_2v_2 + a_3v_3) = A0 = 0##
Now use your rules for matrix arithmetic to derive:
##a_1(Av_1) + a_2(Av_2) + a_3(Av_3) = 0##
(HINT: ##A(kv) = k(Av)## for scalars k, and ##A(v+w) = Av + Aw## for vectors v, w where the expressions make sense.)
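The argument above is easy to check numerically. Below is a small sketch in NumPy; the vectors and matrix are made up for illustration (here ##v_3 = v_1 + v_2##, so the dependence coefficients are ##(1, 1, -1)##):

```python
import numpy as np

# A linearly dependent set: v3 = v1 + v2, witnessed by (a1, a2, a3) = (1, 1, -1)
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# Any 3x3 matrix will do for this argument; this one is arbitrary
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# The same coefficients kill the images: 1*Av1 + 1*Av2 - 1*Av3 = A*0 = 0
combo = A @ v1 + A @ v2 - A @ v3
print(np.allclose(combo, 0))          # True

# Equivalently, the matrix with columns Av1, Av2, Av3 has rank < 3
M = np.column_stack([A @ v1, A @ v2, A @ v3])
print(np.linalg.matrix_rank(M) < 3)   # True
```

The key point is that the *same* coefficients that witness the dependence of ##v_1, v_2, v_3## also witness the dependence of their images.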
 
That makes sense. So if we have a1(Av1) + a2(Av2) + a3(Av3) = 0 with the constants a1, a2, a3 not all zero, that is a nontrivial linear relation, which means {Av1, Av2, Av3} is a linearly dependent set.
Thanks.

That leads me to a related question: Let's assume we are talking about {v1, v2, v3} being a linearly INDEPENDENT set now. If we multiply the vectors by the matrix A, how does it affect the independence? Would it make a difference if the matrix A is invertible?
 
Mola said:
That leads me to a related question: Let's assume we are talking about {v1, v2, v3} being a linearly INDEPENDENT set now. If we multiply the vectors by the matrix A, how does it affect the independence? Would it make a difference if the matrix A is invertible?

This is actually a quite interesting little question (well, in my opinion anyway). First note that if we take the contrapositive of your initial result, we get:
If Av1, Av2, Av3 are linearly independent, then v1, v2, v3 are linearly independent.
So for linear independence the implication goes backwards. For an arbitrary matrix A we cannot prove your new statement, since we can just let A be the 0 matrix, which sends every vector (and hence any independent set) to 0. However, if A is invertible, then we can go backwards by noting that if
##a_1Av_1 + a_2Av_2 + a_3Av_3 = 0##
then we can left-multiply by ##A^{-1}## to get
##a_1v_1 + a_2v_2 + a_3v_3 = 0##
So if v1, v2, v3 are linearly independent and A is invertible, then Av1, Av2, Av3 are linearly independent.
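Both halves of this answer can be illustrated numerically. In the sketch below (the invertible matrix is made up for illustration), the standard basis of ##\mathbb{R}^3## is mapped by an invertible matrix and by the zero matrix, and the rank of the resulting column matrix tells us whether the images stay independent:

```python
import numpy as np

# The standard basis of R^3: linearly independent by definition
v1, v2, v3 = np.eye(3)

# An (arbitrary) invertible matrix: det != 0, so independence is preserved
A_inv = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 3.0],
                  [4.0, 0.0, 1.0]])
M = np.column_stack([A_inv @ v1, A_inv @ v2, A_inv @ v3])
print(np.linalg.matrix_rank(M))   # 3: the images are still independent

# A singular matrix (here the zero matrix) can destroy independence
A_zero = np.zeros((3, 3))
N = np.column_stack([A_zero @ v1, A_zero @ v2, A_zero @ v3])
print(np.linalg.matrix_rank(N))   # 0: every image is the zero vector
```

This matches the argument: invertibility is exactly what lets you undo the multiplication by A and recover the original relation among v1, v2, v3.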
 
Thanks rasmhop... I did think "A" being an invertible matrix could make a difference, but I didn't know how to prove it.
That was a great help.
 