Isomorphism and linear independence

I think I am missing some key info below. I have listed the problem statement, how I am approaching it, and why I think I am missing something.

Thanks

Asif
============

Problem statement:
Let T: U->V be an isomorphism. Let U1, U2,...,Un be linearly independent. Show that T(U1), T(U2),...,T(Un) are linearly independent in V.

Problem solution
1- In U, this is true: (lambda[1])(U[1]) + ... (lambda[n])(U[n]) = 0 as this is linearly independent and all lambdas are 0 (for linear independence)

2- Since it is an isomorphism, every vector in U uniquely maps to V.

3- Therefore V is linearly independent also.

4- v1 = (alpha[1])T(U[1])
vn = (alpha[n])T(U[n])
Since v is linearly independent:
0 = alpha[1]T(U[1]) + alpha[2]T(U[2]) + ... + alpha[n]T(U[n])

My question

a- Is step 3 a valid assumption?
b- By writing the equation in step 4, can I safely assume that T(u1),...,T(un) are linearly independent?

Thanks

Asif

CompuChip
Homework Helper
Actually, linear independence of v1, v2, ..., vn is equivalent to saying that if
a1 v1 + a2 v2 + ... + an vn = 0
then all ai must be zero. So in step 4 you actually assumed what you are proving.

I'd go about it like this: suppose that
b1 T(u1) + b2 T(u2) + ... + bn T(un) = 0
for some coefficients b1, ..., bn. You want to prove that these numbers are all zero. Now use what you know about an isomorphism. In particular, note that it is linear. You will want to get back to a linear combination of just u1, ..., un, because you know that those are independent.
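Not a substitute for the proof, but here is a quick numerical sanity check of the claim in a made-up example (everything below is hypothetical: U = V = R^3, with T given by an invertible matrix):

```python
import numpy as np

# Hypothetical illustration: model U = V = R^3, with T given by a
# made-up invertible matrix, and check that the images T(u1), T(u2),
# T(u3) are still linearly independent via a rank computation.
u = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])   # columns u1, u2, u3 are independent

T = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # det(T) = 3, so T is an isomorphism

TU = T @ u                        # columns are T(u1), T(u2), T(u3)

# Full column rank <=> the only solution of
# b1*T(u1) + b2*T(u2) + b3*T(u3) = 0 is b1 = b2 = b3 = 0.
print(np.linalg.matrix_rank(u))   # 3
print(np.linalg.matrix_rank(TU))  # 3
```

Of course a numerical check of one example proves nothing in general; it just shows what "T preserves independence" looks like concretely.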

HallsofIvy
Homework Helper
I think I am missing some key info below. I have listed the problem statement, how I am approaching it, and why I think I am missing something.

Thanks

Asif
============

Problem statement:
Let T: U->V be an isomorphism. Let U1, U2,...,Un be linearly independent. Show that T(U1), T(U2),...,T(Un) are linearly independent in V.

Problem solution
1- In U, this is true: (lambda[1])(U[1]) + ... (lambda[n])(U[n]) = 0 as this is linearly independent and all lambdas are 0 (for linear independence)
Technically, what you have said is that if all the lambdas are 0, then that linear combination is 0; that's true of any set of vectors, independent or not. What you need to use is the other direction: given that the linear combination is 0, all the lambdas must be 0.

2- Since it is an isomorphism, every vector in U uniquely maps to V.

3- Therefore V is linearly independent also.

4- v1 = (alpha[1])T(U[1])
vn = (alpha[n])T(U[n])
Since v is linearly independent:
0 = alpha[1]T(U[1]) + alpha[2]T(U[2]) + ... + alpha[n]T(U[n])

My question

a- Is step 3 a valid assumption?
b- By writing the equation in step 4, can I safely assume that T(u1),...,T(un) are linearly independent?

Thanks

Asif
Well, no, step 3 is NOT a "valid assumption"; it is what you are trying to prove! It might help to use a slightly different characterization of independence: it's not difficult to prove that a set of vectors is independent if and only if no one of them can be written as a linear combination of the others.

Suppose T(u1), T(u2), ..., T(un) were NOT linearly independent. Then one of them, say T(ui), is equal to a1T(u1) + ... + anT(un) = T(a1u1 + ... + anun), where T(ui) does NOT appear on the right-hand side. Then use the fact that T is one-to-one.

Another characterization of "linearly independent" is that the 0 vector can be written in only one way: if a1u1 + ... + anun = 0, then a1 = ... = an = 0. You could also use that.
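To see that second characterization concretely, here is a small made-up example (not from the problem itself) worked symbolically with sympy: for three independent vectors in R^3, solving a1*u1 + a2*u2 + a3*u3 = 0 yields only the trivial solution.

```python
import sympy as sp

# Hypothetical example of "0 can be written in only one way":
# u1, u2, u3 below are independent (the matrix with these columns
# has determinant -2), so the only solution is a1 = a2 = a3 = 0.
a1, a2, a3 = sp.symbols('a1 a2 a3')

u1 = sp.Matrix([1, 0, 1])
u2 = sp.Matrix([0, 1, 1])
u3 = sp.Matrix([1, 1, 0])

combo = a1*u1 + a2*u2 + a3*u3                    # 3x1 symbolic vector
sol = sp.solve(list(combo), [a1, a2, a3], dict=True)
print(sol)
```

If the vectors were dependent, the same solve would instead return a family of nonzero solutions.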

So I assume you are saying that

- Assume T(a[1]u[1]) + ... + T(a[n]u[n]) = 0 for some scalars a[1], ..., a[n]

- Then because T is an isomorphism, and in particular linear, I get
T(a1u1 + ... + anun) = 0, and so
a1u1 + ... + anun = 0

- Now since the u's are linearly independent, this implies that all the a's are 0.

- Therefore T(u1), ..., T(un) are linearly independent.

No, you seem to be having a hard time, so this is how to do it. Make sure you can do this on your own!

Assume a_1T(u_1) + ... + a_nT(u_n) = 0 for some scalars a_1, ..., a_n. Then by linearity of T we have
T(a_1u_1 + ... + a_nu_n) = 0 = T(0). Now apply T^-1 to both sides, so
a_1u_1 + ... + a_nu_n = 0; but u_1, ..., u_n are independent, so
a_1 = ... = a_n = 0, and we are done.

So we started with a_1T(u_1) + ... + a_nT(u_n) = 0, and showed all the a_i are zero, so by definition this means T(u_1), ..., T(u_n) are linearly independent.
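For anyone who wants it written up cleanly, the same argument in LaTeX form (just a restatement of the proof above, nothing new):

```latex
\begin{proof}
Suppose $a_1 T(u_1) + \cdots + a_n T(u_n) = 0$ for scalars $a_1, \dots, a_n$.
By linearity of $T$,
\[
  T(a_1 u_1 + \cdots + a_n u_n) = 0 = T(0).
\]
Since $T$ is an isomorphism, $T^{-1}$ exists; applying it to both sides gives
$a_1 u_1 + \cdots + a_n u_n = 0$.
Because $u_1, \dots, u_n$ are linearly independent, $a_1 = \cdots = a_n = 0$,
so $T(u_1), \dots, T(u_n)$ are linearly independent.
\end{proof}
```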

HallsofIvy