# Problem on linear independence and matrices

1. Jan 7, 2008

### vince89

Can I ask for some help?

Suppose that {v1,v2...vn} is a linearly independent set of vectors and A is a singular matrix.
Prove or disprove: The set {Av1, Av2, ...Avn} is linearly independent.

2. Jan 7, 2008

### Rainbow Child

How can you relate two different bases?

3. Jan 7, 2008

### vince89

The vectors are the same, and they're not bases, just sets of vectors.

4. Jan 7, 2008

### Rainbow Child

But the vectors of the first set are linearly independent, so they form a basis. If the second set has linearly independent members, it should also be a basis. Correct?

5. Jan 7, 2008

### vince89

But doesn't the singular matrix change the property of linear independence?

6. Jan 7, 2008

### Rainbow Child

So the members of the second set cannot be independent!

7. Jan 7, 2008

### vince89

Hmm... I don't quite get it. Can you write a rough proof? :) Thanks a lot!

8. Jan 7, 2008

### Rainbow Child

Ok!
The first set forms a basis since the vectors are linearly independent.

Suppose now that the members of the 2nd set are linearly independent; then they also form a basis. Two bases are related by a non-singular matrix, but in our case they are related by a singular one. Thus the members of the 2nd set are not linearly independent.

9. Jan 7, 2008

### vince89

How does the singular matrix change the linear independence of the basis?

10. Jan 7, 2008

### Rainbow Child

Let me show it with equations.

Call the vectors of the 2nd set $\bar{v}^\alpha$, then

$$\bar{v}^\alpha=A^\alpha_\beta\,v^\beta$$

In order for $\bar{v}^\alpha$ to be linearly independent, it must hold that

$$\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_{\alpha}=0 \quad \forall \alpha$$

Now we have

$$\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_\alpha \, A^\alpha_\beta\,v^\beta=0 \Rightarrow \lambda_\alpha \, A^\alpha_\beta=0$$

The last equality holds because the $v^\beta$ are linearly independent. This is an $n\times n$ homogeneous system for the unknowns $\lambda_\alpha$. In order for it to have only the trivial solution, it must hold that $\det(A)\neq 0$. But $\det(A)= 0$, so there is a solution for $\lambda_\alpha$ besides the trivial one. Thus the vectors $\bar{v}^\alpha$ are dependent.
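The determinant argument above can be checked numerically. A minimal sketch (not from the original thread; the specific vectors and matrix are illustrative choices, assuming {v1, v2} is a basis of R^2):

```python
import numpy as np

# {v1, v2} is a basis of R^2 (linearly independent).
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

# A singular matrix: the second row is twice the first, so det(A) = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert np.isclose(np.linalg.det(A), 0.0)

# Images of the basis vectors under A.
w1, w2 = A @ v1, A @ v2

# Stack the images as columns; rank < 2 means they are dependent.
rank = np.linalg.matrix_rank(np.column_stack([w1, w2]))
print(rank)  # 1, so {A v1, A v2} is linearly dependent
```

Since det(A) = 0, the homogeneous system for the lambdas has a nontrivial solution, and the rank of the image set drops below n, exactly as the equations predict.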

Last edited: Jan 7, 2008
11. Jan 7, 2008

### vince89

If there is only the trivial solution, what does that imply? :) Thanks, btw.

12. Jan 7, 2008

### HallsofIvy

Staff Emeritus
How about just trying for a counterexample? What is the simplest singular linear transformation you know?

13. Jan 7, 2008

### Rainbow Child

Trivial solution means $$\lambda_\alpha=0$$, which would make the $$\bar{v}^\alpha$$ independent.

14. Jan 17, 2008

### masnevets

Hello,

I hope I am not spoiling the fun, but I think things are getting confused.

Anything could happen here. Since we are not requiring that {v_1, ..., v_n} is a basis, here are two examples. Take any set of linearly independent vectors and let A = 0. Then of course their images are not linearly independent. On the other hand, take A to be some nonzero matrix, and let v_1 be any vector not in its kernel. Then {v_1} is linearly independent, and so is {Av_1}.
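Both of these cases can be sketched numerically. A quick illustration (the specific matrices and vectors are my own choices, assuming we do not require {v_1, ..., v_n} to be a basis):

```python
import numpy as np

# Case 1: A = 0 sends any independent set to a set of zero vectors.
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A_zero = np.zeros((2, 2))
images = np.column_stack([A_zero @ v1, A_zero @ v2])
print(np.linalg.matrix_rank(images))  # 0, so the images are dependent

# Case 2: a nonzero singular A, and a single vector outside its kernel.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])   # det(A) = 0, kernel is span{e2}
u1 = np.array([1.0, 0.0])    # not in ker(A)
w1 = A @ u1
print(np.any(w1 != 0))       # True: {A u1} is nonzero, hence independent
```

So with fewer than n vectors, a singular A may or may not preserve independence; the determinant argument only forces dependence when the set is a full basis of R^n.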

15. Jan 17, 2008

### Rainbow Child

The OP was:

Since the $v_i$ are linearly independent, they form a basis, assuming, of course, that the dimension of the vector space is n.

16. Jan 17, 2008

### HallsofIvy

Staff Emeritus
All you really need is a 'counterexample'. If {v1, v2, ..., vn} is a set of independent vectors, and A is the linear transformation that takes every v into 0, what can you say about {Av1, Av2, ..., Avn}?

17. Jan 17, 2008

### mathwonk

What's your definition of singular? If it means the columns are not independent, you are done, at least assuming you know the basic theory of dimension.

18. Jan 18, 2008

### HallsofIvy

Staff Emeritus
Yes, and masnevets' point was that there is no reason to assume that! In any case, his point was an extension of what I said: Suppose {v1, v2, ...} is a set of independent vectors and A is the zero operator (Av= 0 for all v).