# Problem on linear independence and matrices

Can I ask for some help?

Suppose that {v1, v2, ..., vn} is a linearly independent set of vectors and A is a singular matrix.
Prove or disprove: the set {Av1, Av2, ..., Avn} is linearly independent.

How can you relate two different bases?

The vectors are the same. And they're not bases, they're just sets of vectors.

But the vectors of the first set are linearly independent, so they form a basis. If the second set has linearly independent members, it should also be a basis. Correct?

But doesn't the singular matrix change the property of linear independence?

So the members of the second set cannot be independent! Hmm... I don't quite get it. Can you write a rough proof? :) Thanks a lot!

Ok!
The first set forms a basis since the vectors are linearly independent (assuming the dimension of the vector space is n).

Suppose now that the members of the 2nd set are linearly independent; then they also form a basis. Two bases are related by a non-singular matrix, but in our case they are related by a singular one. Thus the members of the 2nd set are not linearly independent.

How does the singular matrix change the linear independence of the basis?

Let me show it with equations.

Call the vectors of the 2nd set $\bar{v}^\alpha$, then

$$\bar{v}^\alpha=A^\alpha_\beta\,v^\beta$$

In order for $\bar{v}^\alpha$ to be linearly independent, it must hold that

$$\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_{\alpha}=0 \quad \forall \alpha$$

Now we have

$$\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_\alpha \, A^\alpha_\beta\,v^\beta=0 \Rightarrow \lambda_\alpha \, A^\alpha_\beta=0$$

The last equality holds (for each $\beta$) because the $v^\beta$ are linearly independent. This is an $n\times n$ homogeneous system for the unknowns $\lambda_\alpha$. In order for it to have only the trivial solution, it must hold that $\det(A)\neq 0$. But $\det(A)= 0$, so there is a solution for $\lambda_\alpha$ besides the trivial one. Thus the vectors $\bar{v}^\alpha$ are dependent.
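A quick numerical sketch of this argument (the matrix and vectors below are my own example, not from the thread): a singular 2x2 matrix sends an independent pair to a dependent pair, and a nontrivial $\lambda$ can be written down explicitly.

```python
# Sketch: a singular 2x2 matrix A maps the standard basis of R^2
# to a linearly dependent set. Example matrix chosen for illustration.

def matvec(A, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector (tuple)."""
    return (A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1])

def det2(u, w):
    """Determinant of the 2x2 matrix with columns u and w.
    It is zero exactly when u and w are linearly dependent."""
    return u[0]*w[1] - u[1]*w[0]

A = [[1, 2],
     [2, 4]]            # rows are proportional, so det(A) = 0: A is singular

v1, v2 = (1, 0), (0, 1)             # independent: det2(v1, v2) = 1
w1, w2 = matvec(A, v1), matvec(A, v2)   # images: (1, 2) and (2, 4)

print(det2(v1, v2))     # 1 -> {v1, v2} is independent
print(det2(w1, w2))     # 0 -> {Av1, Av2} is dependent
# A nontrivial solution: lambda = (2, -1), since 2*Av1 - Av2 = 0.
print(tuple(2*a - b for a, b in zip(w1, w2)))   # (0, 0)
```

This is exactly the homogeneous system above: the nontrivial $\lambda$ exists because $\det(A)=0$.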

If there is a trivial solution, what does it imply? :) Thanks btw.

HallsofIvy
Homework Helper
How about just trying for a counter example? What is the simplest singular linear transformation you know?

The trivial solution is $$\lambda_\alpha=0$$, and it always exists; the vectors $$\bar{v}^\alpha$$ are independent only if it is the *only* solution.

Hello,

I hope I am not spoiling the fun, but I think things are getting confused.

Anything could happen here. Since we are not requiring that {v_1, ..., v_n} is a basis, here are two examples. Take any set of linearly independent vectors and let A = 0. Then of course their images are not linearly independent. On the other hand, take A to be some nonzero matrix, and let v_1 be any vector not in its kernel. Then {v_1} is linearly independent, and so is {Av_1}.
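To make both cases concrete, here is a small Python check (the matrices are my own choices, not from the thread): the zero matrix destroys independence, while a nonzero singular projection preserves the independence of a single vector that lies outside its kernel.

```python
# Both cases with n = 1 vector in R^2, so {v1} is independent but not a basis.

def matvec(A, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector (tuple)."""
    return (A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1])

def is_independent_single(v):
    """A one-element set {v} is linearly independent iff v != 0."""
    return v != (0, 0)

v1 = (1, 0)

Z = [[0, 0], [0, 0]]    # zero matrix: singular, kernel is everything
P = [[1, 0], [0, 0]]    # projection onto the x-axis: det(P) = 0, singular,
                        # but v1 is not in its kernel

print(is_independent_single(matvec(Z, v1)))  # False: {Zv1} = {0} is dependent
print(is_independent_single(matvec(P, v1)))  # True:  {Pv1} = {(1, 0)} is independent
```

So without the assumption that n equals the dimension of the space, either outcome is possible.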


The OP was:

Suppose that {v1,v2...vn} is a linearly independent set of vectors and A is a singular matrix.

Since the $v_i$ are linearly independent, they form a basis, assuming of course that the dimension of the vector space is n.

HallsofIvy
Homework Helper
All you really need is a 'counterexample'. If {v1, v2, ..., vn} is a set of independent vectors, and A is the linear transformation that takes every v into 0, what can you say about {Av1, Av2, ..., Avn}?

mathwonk
Homework Helper
2020 Award
What's your definition of singular? If it means the columns are not independent, you are done, at least assuming you know the basic theory of dimension.
