Dependent vectors and their images

1. Jul 19, 2007

teleport

1. The problem statement, all variables and given/known data
Prove that if T is a linear transformation and the vectors v1, ..., vn are linearly dependent, then Tv1, ..., Tvn are linearly dependent.

2. Relevant equations

3. The attempt at a solution

I tried this:

Assume A1v1 + ... + Anvn = 0, where the Ai are scalars, not all zero.

Taking the transform of both sides, we get

A1Tv1 + ... + AnTvn = 0. So the same relationship holds among the images of the v's, and hence the Tv's are also dependent.
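This can be sanity-checked numerically. Here is a minimal pure-Python sketch; the vectors, coefficients, and matrix are a made-up example for illustration, not part of the problem:

```python
# Sanity check: if a1*v1 + ... + an*vn = 0 with the ai not all zero,
# then by linearity the same coefficients annihilate the images T(vi).

def apply_T(T, v):
    """Multiply matrix T (given as a list of rows) by vector v."""
    return [sum(t * x for t, x in zip(row, v)) for row in T]

v1, v2, v3 = [1, 0], [0, 1], [1, 1]
a = [1, 1, -1]          # 1*v1 + 1*v2 - 1*v3 = 0, so the vi are dependent

T = [[2, 3],
     [0, 1]]            # an arbitrary linear map on R^2

images = [apply_T(T, v) for v in (v1, v2, v3)]
combo = [sum(ai * w[k] for ai, w in zip(a, images)) for k in range(2)]
print(combo)  # [0, 0]: the same nontrivial relation holds among the T(vi)
```

The same nonzero coefficients witness the dependence of the images, which is exactly the argument in the attempt above.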

My problem is that if I made the same assumption with the v's linearly independent, then I would get that the Tv's are also independent, which is not necessarily true.

2. Jul 20, 2007

morphism

No, you wouldn't.

Post the proof you have in mind.

Last edited: Jul 20, 2007
3. Jul 20, 2007

daniel_i_l

What happens if T(v) = 0 for some v=/=0?

4. Jul 20, 2007

CompuChip

I think it's quite easy...
You know that the v's are dependent, that is: there exist numbers $$a_1, \cdots, a_n$$, not all zero, such that $$a_1 \vec v_1 + \cdots + a_n \vec v_n = 0$$. The question is, can you find numbers $$a_1', \cdots, a_n'$$, not all zero, such that $$a_1' T(\vec v_1) + \cdots + a_n' T(\vec v_n) = 0$$?

(Hint: try rewriting that left hand side. What do you know about linear transformations, e.g.: when are they zero?)

5. Jul 20, 2007

morphism

That's not what he's asking.

6. Jul 20, 2007

CompuChip

Sorry, apparently I read the post too quickly (in particular, I think I missed the last line). Please disregard my post.

7. Jul 20, 2007

teleport

daniel_i_l: "What happens if T(v) = 0 for some v=/=0?".

Then the Tv's are dependent right away.

morphism: "No, you wouldn't. Post the proof you have in mind."

For the proof of independence, I would do the same as in the original post. However, since I assume A1v1 + ... + Anvn = 0, and all I do is apply the transform to that, I'm just taking the transform of zero, which we know is zero. So I assume any result from that does not imply anything concrete about the Ai. But isn't this the same for the dependence part?

Last edited: Jul 20, 2007
8. Jul 20, 2007

teleport

CompuChip, I don't see why your post is irrelevant to my question.

CompuChip: "(Hint: try rewriting that left hand side. What do you know about linear transformations, e.g.: when are they zero?)"

My problem with that is when I rewrite the left hand side in the form
'T( stuff ) = 0' I cannot say stuff = 0, since no one told me T is one-to-one.

9. Jul 20, 2007

teleport

Wow ok got it.

"I'm just taking the transform of zero which we know is zero. So any result from that I assume does not imply anything concrete about the Ai. But isn't this the same for the dependence part?"

Please disregard that.

So for the independence part, the fact that all the Ai = 0 doesn't mean anything about the dependence of the Tv's when I write A1Tv1 + ... + AnTvn = 0. Ha, in my face! Thanks.

10. Jul 20, 2007

teleport

"So for the independence part, the fact that all the Ai = 0 doesn't mean anything about the dependence of the Tv's when I write A1Tv1 + ... + AnTvn = 0"

But wouldn't that be the same for the dependence part? After all, imagine
Tv1 = (1,0) and Tv2 = (0,1): they are independent, but if I multiply the first one by zero and write kTv2 = 0 for some non-zero scalar k, then this might imply they are dependent when they are not. So my original "proof" might still be incorrect.

11. Jul 20, 2007

radou

It seems to me that you are overcomplicating this.

If the vectors v1, ..., vn are linearly dependent, then at least one of them can be written as a linear combination of the others. Try to express this and apply the linear transformation T.

12. Jul 20, 2007

teleport

Radou, this is exactly what I did in my original post. Now I see that my reasoning in my previous post was impossible:

"imagine Tv1 = (1,0) and Tv2 = (0,1), they are independent but if I multiply the first one by zero and do: kTv2 = 0 for any non-zero scalar k"

Please throw that in the garbage. Thanks.

13. Jul 20, 2007

CompuChip

Because I thought you wanted to prove that: If the vectors are independent, then so are their images.
But that is what you already had.

Anyway,
Following the same reasoning as in the proof of what I originally thought was the question:

Suppose the $$\vec v_i$$ are linearly independent. Then $$a_1 \vec v_1 + \cdots + a_n \vec v_n = 0 \implies a_i = 0$$ for i = 1, 2, ..., n. Now suppose $$c_1 T(\vec v_1) + \cdots + c_n T(\vec v_n) = 0$$. Then since T is a linear transformation, $$T(c_1 \vec v_1 + \cdots + c_n \vec v_n) = 0$$. Now if T were invertible, its kernel would be {0}, and we could conclude that $$c_1 \vec v_1 + \cdots + c_n \vec v_n = 0$$, hence $$c_i = 0$$ for all i, and the assertion would be true.

14. Jul 20, 2007

teleport

No. I proved that their images are dependent given dependent vectors in the domain of T, not the converse. Besides, there is no way to prove independence of the images given independent vectors, except in the single case you have shown (which, as I referred to in a previous post, requires T to be one-to-one, and T is not given to be one-to-one). Also, you don't need T to be invertible, only that ker T = {0} <=> T is injective <=> T is one-to-one.

Last edited: Jul 20, 2007

15. Jul 20, 2007

radou

Assume the vectors $$v_{1}, \cdots , v_{n}$$ are linearly dependent, so there exists at least one scalar $$\alpha_{j} \neq 0$$ such that $$\sum_{i=1}^n \alpha_{i} v_{i} = 0$$, and hence $$v_{j} = -\frac{1}{\alpha_{j}}\sum_{i \neq j} \alpha_{i} v_{i}$$. Now apply the linear transformation T to this equation.
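Spelling out that last step (a sketch in the thread's notation): applying T to $$v_{j} = -\frac{1}{\alpha_{j}}\sum_{i \neq j} \alpha_{i} v_{i}$$ and using linearity gives

$$T(v_{j}) = -\frac{1}{\alpha_{j}}\sum_{i \neq j} \alpha_{i} T(v_{i})$$,

so T(vj) is a linear combination of the other images, which is exactly linear dependence of the Tv's.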

16. Jul 20, 2007

teleport

This is saying the same thing my original post said: just take one of my AiTvi, where Ai =/= 0, to the side with the 0. Thanks though.
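For completeness, the converse failure discussed above can also be checked numerically. A minimal counterexample (my own, not from the thread): take the projection T(x, y) = (x, 0) on R^2, which has a nontrivial kernel, so independence is not preserved:

```python
# Counterexample: independent vectors whose images are dependent,
# using a linear map with a nontrivial kernel.

def project_x(v):
    """The projection T(x, y) = (x, 0)."""
    x, y = v
    return [x, 0]

v1, v2 = [1, 0], [0, 1]            # linearly independent in R^2
w1, w2 = project_x(v1), project_x(v2)
print(w1, w2)  # [1, 0] [0, 0]: w2 is the zero vector, so
               # 0*w1 + 1*w2 = 0 is a nontrivial relation and the
               # images are linearly dependent
```

This is the point raised earlier in the thread: without ker T = {0}, nothing forces the images of independent vectors to stay independent.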
