# Linear dependence of a set under linear transformation?

1. Sep 18, 2011

### poochie_d

Hi all,

Here is the problem:
If T: V -> W is a linear transformation and S is a linearly dependent subset of V, then prove that T(S) is linearly dependent.

Now, I know that the usual proof goes as follows:
Since S is linearly dependent, there are distinct vectors $v_1, ..., v_n$ in S and scalars $a_1, ..., a_n$ (not all zero) such that $\sum_{i=1}^n a_i v_i = 0.$

=> $\sum_{i=1}^n a_i T(v_i) = T(\sum_{i=1}^n a_i v_i) = T(0) = 0$

=> Since the vectors $T(v_1), ..., T(v_n)$ in T(S) and the scalars $a_1, ..., a_n$ (not all zero) give a nontrivial representation of 0, it follows that T(S) is linearly dependent.
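The computation $\sum a_i T(v_i) = T(\sum a_i v_i) = T(0) = 0$ can be checked numerically. This is just an illustrative sketch: the matrix for T, the vectors, and the coefficients below are made-up examples, not part of the problem.

```python
import numpy as np

# An arbitrary linear map T: R^3 -> R^2, represented by a matrix.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])

# A linearly dependent set: v3 = v1 + v2, so 1*v1 + 1*v2 - 1*v3 = 0.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2
a = [1.0, 1.0, -1.0]  # coefficients, not all zero

# The same coefficients annihilate the images:
# sum a_i T(v_i) = T(sum a_i v_i) = T(0) = 0.
combo = a[0] * (T @ v1) + a[1] * (T @ v2) + a[2] * (T @ v3)
print(combo)  # [0. 0.]
```

So the images do satisfy a relation with the same (not-all-zero) coefficients; the subtle point raised below is whether that relation still witnesses dependence of T(S) *as a set*.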

What I am wondering is whether the above proof is still valid if some of the $v_i$'s take on the same value under T. In this case, wouldn't the proof be wrong, since you have to have distinct vectors to show that the set is dependent?

e.g. What if you have a situation where $S = \{v_1,v_2,v_3\}$ and $v_1 + v_2 - 2v_3 = 0,$ but $T(v_1) = T(v_2) = T(v_3) = w$ (say), so that
$0 = T(v_1) + T(v_2) - 2T(v_3) = w + w - 2w = 0 \cdot w$? This doesn't prove that T(S) is dependent! (Or does it?)

Any help would be much appreciated. Thanks!

PS: I am posting this here since it is related to linear algebra, but maybe this is a homework-type question; please feel free to move it to a different forum if it doesn't belong here.

2. Sep 19, 2011

### Staff: Mentor

I moved this to the Homework & Coursework section, which is where it should go. Adding this note will bump the question.

3. Sep 21, 2011

### poochie_d

Oh, never mind; I figured it out. It turns out the statement I was trying to prove is not true...
e.g. If you have $T:\mathbb{R}^2 \to \mathbb{R}, \: T(x,y) = x+y,$ and $S = \{(2,0),(0,2),(1,1)\},$ then $S$ is linearly dependent (since $(2,0) + (0,2) - 2(1,1) = (0,0)$), but $T(S) = \{2\}$ is a singleton containing a nonzero vector, hence linearly independent. The usual proof does go through if you count the images with multiplicity (as an indexed list), but as a *set* T(S) can collapse to something independent.
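The counterexample is small enough to verify with a few lines of Python (a minimal sketch; the function `T` below just mirrors the map $T(x,y) = x+y$ defined above):

```python
def T(v):
    """The linear map T: R^2 -> R, T(x, y) = x + y."""
    x, y = v
    return x + y

v1, v2, v3 = (2, 0), (0, 2), (1, 1)

# S is dependent: 1*v1 + 1*v2 - 2*v3 = (0, 0).
dep = (v1[0] + v2[0] - 2 * v3[0], v1[1] + v2[1] - 2 * v3[1])
print(dep)     # (0, 0)

# But all three vectors have the same image, so T(S) collapses to a set
# with one nonzero element, which is linearly independent.
images = {T(v) for v in (v1, v2, v3)}
print(images)  # {2}
```

The collapse under T is exactly what breaks the set version of the statement: the nontrivial relation on $S$ turns into the trivial relation $0 \cdot 2 = 0$ on the single element of $T(S)$.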