Dependent vectors and their images

teleport

Homework Statement


Prove that if T is a linear transformation, and the vectors v1, ..., vn are linearly dependent, then Tv1, ..., Tvn are linearly dependent.


Homework Equations





The Attempt at a Solution



I tried this:

Assume A1v1 + ... + Anvn = 0, where all Ai are scalars.

Taking the transform of both sides, we get

A1Tv1 + ... + AnTvn = 0. So the same relation holds among the images of the v's, and the Tv's are also dependent.
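In more detail, and taking the Ai not all zero (which the dependence of the v's provides), the step above only uses linearity of T:

$$0 = T(0) = T(A_1 v_1 + \cdots + A_n v_n) = A_1 T(v_1) + \cdots + A_n T(v_n).$$

Since the same not-all-zero Ai appear, this is a nontrivial relation among the Tv's.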

My problem is that if I made the same assumption with the v's linearly independent, then I would get that the Tv's are also independent, which is not necessarily true.
 
No, you wouldn't.

Post the proof you have in mind.
 
What happens if T(v) = 0 for some v ≠ 0?
 
I think it's quite easy...
You know that the v's are dependent, that is: there exist numbers $a_1, \cdots, a_n$, not all zero, such that $a_1 \vec v_1 + \cdots + a_n \vec v_n = 0$. The question is, can you find numbers $a_1', \cdots, a_n'$, not all zero, such that $a_1' T(\vec v_1) + \cdots + a_n' T(\vec v_n) = 0$?

(Hint: try rewriting that left hand side. What do you know about linear transformations, e.g.: when are they zero?)
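One standard fact behind that parenthetical question (not spelled out in the hint, but safe to use): a linear transformation always sends the zero vector to the zero vector, since

$$T(\vec 0) = T(\vec 0 + \vec 0) = T(\vec 0) + T(\vec 0) \implies T(\vec 0) = \vec 0.$$

So applying T to both sides of a relation whose right hand side is $\vec 0$ again gives $\vec 0$.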
 
CompuChip said:
I think it's quite easy...
You know that the v's are dependent, that is: there exist numbers $a_1, \cdots, a_n$, not all zero, such that $a_1 \vec v_1 + \cdots + a_n \vec v_n = 0$. The question is, can you find numbers $a_1', \cdots, a_n'$, not all zero, such that $a_1' T(\vec v_1) + \cdots + a_n' T(\vec v_n) = 0$?

(Hint: try rewriting that left hand side. What do you know about linear transformations, e.g.: when are they zero?)
That's not what he's asking. :wink:
 
Sorry, apparently I read the post too fast (in particular, I think I missed the last line). Please disregard my post.
 
daniel_i_l: "What happens if T(v) = 0 for some v ≠ 0?"

Then all the Tv's are dependent right away.
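Spelling that out (assuming the v with T(v) = 0 is one of the listed vectors, say v_k): then

$$1 \cdot T(v_k) + \sum_{i \neq k} 0 \cdot T(v_i) = 0$$

is a relation with a nonzero coefficient, so T(v_1), ..., T(v_n) are linearly dependent no matter what the other images are.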

morphism: "No, you wouldn't. Post the proof you have in mind."

For the proof of independence, I would do the same as in the original post. However, since I assume A1v1 + ... + Anvn = 0, and all I do is take the transform of that, I'm just taking the transform of zero, which we know is zero. So I assume any result from that doesn't imply anything concrete about the Ai. But isn't this the same for the dependence part?
 
CompuChip, I don't see why your post is irrelevant to my question.

CompuChip: "(Hint: try rewriting that left hand side. What do you know about linear transformations, e.g.: when are they zero?)"

My problem with that is that when I rewrite the left hand side in the form
'T( stuff ) = 0', I cannot say stuff = 0, since no one told me T is one-to-one.
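For reference, here is the standard equivalence behind "one-to-one" that comes up later in the thread (a sketch, for a linear map T): T is one-to-one if and only if ker T = {0}. Indeed,

$$T(u) = T(w) \iff T(u - w) = 0 \iff u - w \in \ker T,$$

so if ker T = {0} then T(u) = T(w) forces u = w; conversely, if T is one-to-one, then T(v) = 0 = T(0) forces v = 0.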
 
Wow ok got it.

"I'm just taking the transform of zero which we know is zero. So any result from that I assume does not imply anything concrete about the Ai. But isn't this the same for the dependence part?"

Plz disregard that.

So for the independence part, the fact that all the Ai = 0 doesn't mean anything about the dependence of the Tv's when I write A1Tv1 + ... + AnTvn = 0. Ha, in my face! Thanks.
 
  • #10
"So for the independence part, the fact that all the Ai = 0, doesn't mean anything about the dependence of Tv's when I write A1Tv1 + ... + AnTvn=0"

But wouldn't that be the same for the dependence part? After all, imagine
Tv1 = (1,0) and Tv2 = (0,1); they are independent, but if I multiply the first one by zero and write kTv2 = 0 for some non-zero scalar k, then this might imply they are dependent when they are not. So my original "proof" might still be incorrect.
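A quick check of that example (spelling out why this worry does not arise): with Tv1 = (1,0) and Tv2 = (0,1),

$$0 \cdot Tv_1 + k \cdot Tv_2 = (0, k),$$

which equals the zero vector only when k = 0, so no nontrivial relation between Tv1 and Tv2 is produced this way. In the dependence proof the nonzero coefficients are not chosen freely: they come from the relation among the v's that is given to hold.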
 
  • #11
It seems to me that you are overcomplicating this.

If the vectors v1, ..., vn are linearly dependent, then at least one of them can be written as a linear combination of the others. Try to express this and apply the linear transformation T.
 
  • #12
Radou, this is exactly what I did in my original post. Now I see that my reasoning in my previous post was impossible:

"imagine Tv1 = (1,0) and Tv2 = (0,1), they are independent but if I multiply the first one by zero and do: kTv2 = 0 for any non-zero scalar k"

Plz throw that in the garbage :blushing: Thanks.
 
  • #13
teleport said:
CompuChip I don't see why your post is irrelevant to my question.

Because I thought you wanted to prove that: If the vectors are independent, then so are their images.
But that is what you already had.

Anyway, regarding:

teleport said:
My problem is that if I made the same assumption with the v's linearly independent, then I would get that the Tv's are also independent, which is not necessarily true.

Following the same reasoning as in the proof of what I originally thought was the question :smile::
Suppose the $\vec v_i$ are linearly independent. Then $a_1 \vec v_1 + \cdots + a_n \vec v_n = 0 \implies a_i = 0$ for $i = 1, 2, \ldots, n$. Now suppose $c_1 T(\vec v_1) + \cdots + c_n T(\vec v_n) = 0$. Then since $T$ is a linear transformation, $T(c_1 \vec v_1 + \cdots + c_n \vec v_n) = 0$. Now if $T$ were invertible, its kernel would be $\{0\}$ and we could conclude that $c_1 \vec v_1 + \cdots + c_n \vec v_n = 0$, hence $c_i = 0$ for all $i$, and the assertion would be true.
 
  • #14
CompuChip said:
Because I thought you wanted to prove that: If the vectors are independent, then so are their images.
But that is what you already had.

No. I proved that their images were dependent given dependent vectors in the domain of T, not the contrary. Besides, there is no way to prove independence of the images given independent vectors, except for the single case you have shown (and, as I noted in a previous post, T is not given to be one-to-one). Also, you don't need T to be invertible... only that ker T = {0} <=> T is injective <=> T is one-to-one.
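A concrete illustration of that last point (an example not used in the thread, just to show what goes wrong without injectivity): take $T:\mathbb{R}^2 \to \mathbb{R}^2$ with $T(x, y) = (x, 0)$. Then $v_1 = (1,0)$ and $v_2 = (0,1)$ are linearly independent, but $T(v_1) = (1,0)$ and $T(v_2) = (0,0)$ are linearly dependent, since $0 \cdot T(v_1) + 1 \cdot T(v_2) = 0$. Here $\ker T = \{(0, y) : y \in \mathbb{R}\} \neq \{0\}$, so $T$ is not one-to-one.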
 
  • #15
Assume the vectors $v_1, \cdots, v_n$ are linearly dependent, so there exists at least one scalar $\alpha_j \neq 0$ such that $\sum_{i=1}^n \alpha_i v_i = 0$, and hence $v_j = -\frac{1}{\alpha_j}\sum_{i \neq j} \alpha_i v_i$. Now apply the linear transformation T to the equation.
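Carrying out that last step (a sketch, in the same notation): applying $T$ and using linearity gives

$$T(v_j) = -\frac{1}{\alpha_j} \sum_{i \neq j} \alpha_i\, T(v_i),$$

so $T(v_j)$ is a linear combination of the other images; equivalently, $\sum_{i=1}^n \alpha_i T(v_i) = 0$ with $\alpha_j \neq 0$, which is exactly a nontrivial dependence among $T(v_1), \cdots, T(v_n)$.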
 
  • #16
radou said:
Assume the vectors $v_1, \cdots, v_n$ are linearly dependent, so there exists at least one scalar $\alpha_j \neq 0$ such that $\sum_{i=1}^n \alpha_i v_i = 0$, and hence $v_j = -\frac{1}{\alpha_j}\sum_{i \neq j} \alpha_i v_i$. Now apply the linear transformation T to the equation.

This is saying the same thing my original post said. Just move one of my AiTvi, where Ai ≠ 0, over to the side with the 0. Thanks though.
 
