Linear Algebra: Linear Transformation and Linear Independence

  • #1
b0it0i

Homework Statement


Let V and W be vector spaces, let T: V --> W be linear, and let {w1, w2,..., wk} be a linearly independent subset of R(T). Prove that if S = {v1, v2,..., vk} is chosen so that T(vi) = wi for i = 1, 2,..., k, then S is linearly independent.

Homework Equations


The Attempt at a Solution


I have no idea where to start with this proof. I've been looking over past theorems to gather information, but I'm not sure how to connect them to show that

S is linearly independent. Can anyone give a hint or suggestion?

Attempt 1:

Since T is linear, we know that
T(x + y) = T(x) + T(y)
T(ax) = aT(x)

Since

{w1, w2,..., wk} is a linearly independent subset of R(T),

aw1 + bw2 + ... + cwk = 0
implies a = b = ... = c = 0.

T(v1 +...+ vk) = T(v1) +...+ T(vk) = w1 +...+ wk (and since they are linearly independent)
0w1 +...+ 0wk = 0,

so then I worked backwards:

0 = 0w1 +...+ 0wk = 0T(v1) +...+ 0T(vk) = 0T(v1 +...+ vk) = T(0v1 +...+ 0vk)

and hence 0v1 + ... + 0vk = 0.

Therefore S = {v1, v2,..., vk} is linearly independent?
Attempt 2:

since R(T) is a subspace of W
and {w1, w2,..., wk} is a subset of R(T)
then span{w1, w2,..., wk} is a subset of R(T)

I was hoping to show that S is a basis, and hence S is linearly independent
but I couldn't get to that.

T(vi) = wi

does that mean each vector wi can be written as a unique linear combination of the vi,

and hence the vi form a basis, and thus S is linearly independent?

I don't think my methods are correct

any suggestions would be helpful, thanks a lot
 
  • #2
b0it0i said:

Homework Statement


Let V and W be vector spaces, let T: V --> W be linear, and let {w1, w2,..., wk} be a linearly independent subset of R(T). Prove that if S = {v1, v2,..., vk} is chosen so that T(vi) = wi for i = 1, 2,..., k, then S is linearly independent.


Homework Equations





The Attempt at a Solution


I have no idea where to start with this proof. I've been looking over past theorems to gather information, but I'm not sure how to connect them to show that

S is linearly independent. Can anyone give a hint or suggestion?

Attempt 1:

Since T is linear, we know that
T(x + y) = T(x) + T(y)
T(ax) = aT(x)

Since

{w1, w2,..., wk} is a linearly independent subset of R(T),

aw1 + bw2 + ... + cwk = 0
implies a = b = ... = c = 0.


T(v1 +...+ vk) = T(v1) +...+ T(vk) = w1 +...+ wk (and since they are linearly independent)
0w1 +...+ 0wk = 0,

so then I worked backwards:

0 = 0w1 +...+ 0wk = 0T(v1) +...+ 0T(vk) = 0T(v1 +...+ vk) = T(0v1 +...+ 0vk)

and hence 0v1 + ... + 0vk = 0.

Therefore S = {v1, v2,..., vk} is linearly independent?
No, you have shown that 0v1 + 0v2 + ... + 0vk = 0, but we knew that anyway!
Try an "indirect proof": if v1, v2, ..., vk are NOT independent, then there exist
a1, a2, ..., ak, NOT all 0, such that a1v1 + a2v2 + ... + akvk = 0. What happens if you take T of both sides of that?


Attempt 2:

since R(T) is a subspace of W
and {w1, w2,..., wk} is a subset of R(T)
then span{w1, w2,..., wk} is a subset of R(T)

I was hoping to show that S is a basis, and hence S is linearly independent
but I couldn't get to that.


T(vi) = wi

does that mean each vector wi can be written as a unique linear combination of the vi,
No, it doesn't. wi may not even be in the vector space V.

and hence the vi form a basis, and thus S is linearly independent?

I don't think my methods are correct

any suggestions would be helpful, thanks a lot
There is nothing in here to suggest that either {v1, v2,..., vk} or {w1, w2,..., wk} is a basis, only that they are independent. Try the way I suggested.
 
  • #3
Alright, I think I got it.

but when you said to use an

"Indirect Proof", I used proof by contradiction and obtained this.

Problem:
Let V and W be vector spaces, let T: V --> W be linear, and let {w1, w2,..., wk} be a linearly independent subset of R(T). Prove that if S = {v1, v2,..., vk} is chosen so that T(vi) = wi for i = 1, 2,..., k, then S is linearly independent.

S = {v1, v2,..., vk} is chosen so that T(vi) = wi for i = 1, 2,..., k --> S is linearly independent

I assumed the negation

S = {v1, v2,..., vk} is chosen so that T(vi) = wi for i = 1, 2,..., k, and S is linearly dependent

Since S is linearly dependent

a1v1 + a2v2 + ... + akvk = 0
where at least one coefficient is nonzero

then as you suggested, by taking T of both sides

T(a1v1 + a2v2 + ... + akvk) = T(0)

and since T is linear

a1 T(v1) + ... + ak T(vk) = 0 [since T(0) = 0]

then our other assumption, T(vi)=wi

implies that

a1w1 +...+ akwk = 0

Since there existed a nonzero coefficient,
that implies that

{w1, w2, ... , wk} is Linearly DEPENDENT

which contradicts the statement that {w1, w2, ... , wk} is Linearly Independent

thus the original statement is true

is this correct?
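
As a quick numerical sanity check of the key step (a minimal sketch in Python with NumPy; the matrix T and the vectors are made-up examples, not part of the problem), a dependence relation among the vi transfers with the same coefficients to the images wi = T(vi):

Code:
import numpy as np

# A made-up linear map T: R^3 -> R^3, given by a matrix (example only).
T = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# Deliberately dependent v's: v3 = v1 + v2.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

# The dependence a1 v1 + a2 v2 + a3 v3 = 0 with (a1, a2, a3) = (1, 1, -1).
a1, a2, a3 = 1.0, 1.0, -1.0
print(a1*v1 + a2*v2 + a3*v3)     # [0. 0. 0.]

# Applying T carries the same coefficients onto the images w_i = T(v_i).
w1, w2, w3 = T @ v1, T @ v2, T @ v3
print(a1*w1 + a2*w2 + a3*w3)     # [0. 0. 0.] -> {w1, w2, w3} dependent too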
 
  • #4
I'm a beginner to proofs myself, but your proof looks correct to me. I don't think you needed to use contradiction.

Here's my work, although I'm not 100% sure it's correct either:

suppose f(c1v1 + c2v2 + ... + ckvk) = c1w1 + c2w2 + ... + ckwk = 0
Then, since {w1, ..., wk} is linearly independent, c1 = c2 = ... = ck = 0

As T is linear, f(0) = 0 since W is a vector space?

Thus, c1v1 + c2v2 + ... + ckvk = 0. From above, c1 = c2 = ... = ck = 0. Thus v1, v2, ..., vk are linearly independent
 
  • #5
proton said:
I'm a beginner to proofs myself, but your proof looks correct to me. I don't think you needed to use contradiction.

Here's my work, although I'm not 100% sure it's correct either:

suppose f(c1v1 + c2v2 + ... + ckvk) = c1w1 + c2w2 + ... + ckwk = 0
Then, since {w1, ..., wk} is linearly independent, c1 = c2 = ... = ck = 0

As T is linear, f(0) = 0 since W is a vector space?

Thus, c1v1 + c2v2 + ... + ckvk = 0. From above, c1 = c2 = ... = ck = 0. Thus v1, v2, ..., vk are linearly independent
First, if you are given a linear transformation, T, don't start talking about "f"!

Second, while it is certainly true that T(0) = 0, it is NOT always true that if T(v) = 0, then we must have v = 0! The kernel of a linear transformation is not necessarily {0}. And that is the implication your argument needs.
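
To make the kernel point concrete, here is a minimal sketch in Python with NumPy (the projection matrix is just an illustrative example): projection of R^3 onto the xy-plane is linear, yet it sends a nonzero vector to 0.

Code:
import numpy as np

# Projection of R^3 onto the xy-plane: linear, but with a nontrivial kernel.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

v = np.array([0.0, 0.0, 5.0])  # a nonzero vector along the z-axis
print(P @ v)                   # [0. 0. 0.] -- T(v) = 0 even though v != 0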
 
  • #6
OK, let me try it again:

suppose c1v1 + c2v2 + ... + ckvk = 0. Taking T of both sides, we obtain
T(c1v1 + c2v2 + ... + ckvk) = T(0) = 0 [since T(0) = 0 because T is linear?] Then
c1w1 + c2w2 + ... + ckwk = 0
Then, since {w1, ..., wk} is linearly independent, c1 = c2 = ... = ck = 0

Thus c1v1 + c2v2 + ... + ckvk = 0 only when c1 = c2 = ... = ck = 0, so S = {v1, v2,..., vk} must be a linearly independent set
 
  • #7
proton said:
[since T(0) = 0 because T is linear?]


If you don't understand why T(0) is 0, then you should try to prove it (hint: expand T(0 + 0) using linearity).
 
  • #8
proton said:
OK, let me try it again:

suppose c1v1 + c2v2 + ... + ckvk = 0. Taking T of both sides, we obtain
T(c1v1 + c2v2 + ... + ckvk) = T(0) = 0 [since T(0) = 0 because T is linear?] Then
c1w1 + c2w2 + ... + ckwk = 0
Then, since {w1, ..., wk} is linearly independent, c1 = c2 = ... = ck = 0

Thus c1v1 + c2v2 + ... + ckvk = 0 only when c1 = c2 = ... = ck = 0, so S = {v1, v2,..., vk} must be a linearly independent set

Looks correct.
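
For anyone who wants to see the theorem in action, here is a minimal numerical sketch in Python with NumPy (the matrix T and the vectors vi are made-up examples): T has a nontrivial kernel, yet the chosen vi come out independent because their images wi are.

Code:
import numpy as np

# Example map T: R^4 -> R^3 with a 1-dimensional kernel.
T = np.array([[1.0, 0.0, 0.0, 2.0],
              [0.0, 1.0, 0.0, 3.0],
              [0.0, 0.0, 1.0, 4.0]])

# Chosen v_i (columns of V) whose images w_i = T(v_i) are independent.
V = np.column_stack([[1.0, 0.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0]])

W = T @ V                            # columns are the w_i
print(np.linalg.matrix_rank(W))      # 3 -> {w1, w2, w3} independent
print(np.linalg.matrix_rank(V))      # 3 -> so S = {v1, v2, v3} independent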
 

1. What is a linear transformation in linear algebra?

A linear transformation is a function that maps one vector space to another while preserving the algebraic structure. In other words, it is a transformation that maintains the operations of addition and scalar multiplication.

2. How do you determine if a transformation is linear?

A transformation is considered linear if it satisfies two properties:

  1. Additivity - T(u + v) = T(u) + T(v)
  2. Homogeneity - T(αv) = αT(v)
where u and v are vectors and α is a scalar. If a transformation satisfies both of these properties, it is considered linear.
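
As an illustration, here is a minimal sketch in Python with NumPy (the matrix A is an arbitrary example; note that spot checks on random vectors can only support, not prove, linearity):

Code:
import numpy as np

rng = np.random.default_rng(0)

# Multiplication by a fixed matrix is a standard example of a linear map.
A = rng.standard_normal((3, 3))
def T(x):
    return A @ x

u, v = rng.standard_normal(3), rng.standard_normal(3)
alpha = 2.5

print(np.allclose(T(u + v), T(u) + T(v)))       # additivity holds: True
print(np.allclose(T(alpha * v), alpha * T(v)))  # homogeneity holds: True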

3. What is the difference between a linear transformation and a linear function?

A linear transformation is a broader concept that can apply to any vector space, while a linear function specifically refers to a transformation between two real number spaces. In other words, a linear function is a type of linear transformation.

4. What is linear independence in linear algebra?

Linear independence refers to a set of vectors in which no vector can be written as a linear combination of the others. In other words, no vector in the set is redundant, and each vector adds unique information to the set. For example, in R^2 the set {(1, 0), (0, 1), (1, 1)} is linearly dependent, because (1, 1) = (1, 0) + (0, 1).

5. How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, you can use the following steps:

  1. Arrange the vectors into a matrix, with each vector as a column.
  2. Use Gaussian elimination to reduce the matrix to its echelon form.
  3. If every column has a pivot (equivalently, the rank equals the number of vectors), the vectors are linearly independent; otherwise they are linearly dependent.
Alternatively, when the matrix is square, you can use its determinant: if the determinant is nonzero, the vectors are linearly independent.
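
A minimal sketch of both tests in Python with NumPy (the vectors here are arbitrary examples):

Code:
import numpy as np

# Columns of M are the vectors to test (example vectors in R^3).
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Rank test: independent iff the rank equals the number of vectors.
print(np.linalg.matrix_rank(M) == M.shape[1])   # True -> independent

# Determinant test (square matrices only).
print(not np.isclose(np.linalg.det(M), 0.0))    # True -> independent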
