Linear dependence of a set under linear transformation?

poochie_d
Hi all,

Here is the problem:
If T: V -> W is a linear transformation and S is a linearly dependent subset of V, then prove that T(S) is linearly dependent.

Now, I know that the usual proof goes as follows:
Since S is linearly dependent, there are distinct vectors v_1, ..., v_n in S and scalars a_1, ..., a_n (not all zero) such that \sum_{i=1}^n a_i v_i = 0.

=> \sum_{i=1}^n a_i T(v_i) = T(\sum_{i=1}^n a_i v_i) = T(0) = 0

=> Since there are vectors T(v_1), ..., T(v_n) in T(S) and scalars a_1, ..., a_n (not all zero) such that they form a nontrivial representation of 0, it follows that T(S) is dependent.
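
For concreteness, here is a quick numeric check of that argument (my own toy example, not from the textbook: a map T: \mathbb{R}^3 \to \mathbb{R}^2 given by a matrix, and a dependent set whose images happen to stay distinct):

```python
import numpy as np

# A linear map T: R^3 -> R^2, represented by a 2x3 matrix.
T = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# S = {v1, v2, v3} is linearly dependent: v1 + v2 - v3 = 0.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])
a = np.array([1.0, 1.0, -1.0])      # coefficients of the dependence relation

vs = np.vstack([v1, v2, v3])        # rows are v1, v2, v3
print(vs.T @ a)                     # [0. 0. 0.]: a1*v1 + a2*v2 + a3*v3 = 0 in V

images = vs @ T.T                   # rows are T(v1), T(v2), T(v3); all distinct here
print(images.T @ a)                 # [0. 0.]: the same coefficients annihilate the images
```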

What I am wondering is whether the above proof is still valid if some of the v_i's take on the same value under T. In this case, wouldn't the proof be wrong, since you have to have distinct vectors to show that the set is dependent?

e.g. What if you have a situation where S = \{v_1,v_2,v_3\} and v_1 + v_2 - 2v_3 = 0, but T(v_1) = T(v_2) = T(v_3) = w (say), so that
0 = T(v_1) + T(v_2) - 2T(v_3) = w + w - 2w = 0w? Here T(S) is just the one-element set \{w\}, and the relation collapses to the trivial one, so this doesn't prove that T(S) is dependent! (Or does it?)

Any help would be much appreciated. Thanks!


PS: I am posting this here since it is related to linear algebra, but maybe this is a homework-type question; please feel free to move it to a different forum if it doesn't belong here.
 
I moved this to the Homework & Coursework section, which is where it should go. Adding this note will bump the question.
 
Oh, never mind; I figured it out. It turns out the statement I was trying to prove is not true...
e.g. If you have T:\mathbb{R}^2 \to \mathbb{R}, \: T(x,y) = x+y, and S = \{(2,0),(0,2),(1,1)\}, then S is linearly dependent (since (2,0) + (0,2) - 2(1,1) = (0,0)), but T(S) = \{2\} is not, being a single nonzero vector.
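
In case it helps anyone reading later, here is a minimal numeric check of this counterexample (just a sketch, using the same T and S as above):

```python
import numpy as np

# The vectors of S = {(2,0), (0,2), (1,1)}, as rows.
S = np.array([[2.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# Rank 2 < 3 vectors, so S is linearly dependent.
print(np.linalg.matrix_rank(S.T))   # 2

# The image of S under T(x, y) = x + y collapses to a single point.
T_S = {float(x + y) for x, y in S}
print(T_S)                          # {2.0}: one nonzero vector, hence linearly independent
```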
 