Problem on linear independence and matrices

Discussion Overview

The discussion revolves around the linear independence of a transformed set of vectors {Av1, Av2, ...Avn} derived from a linearly independent set {v1, v2, ...vn} when subjected to a singular matrix A. Participants explore the implications of linear transformations on vector independence, the definition of singular matrices, and provide examples to illustrate their points.

Discussion Character

  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose that since {v1, v2, ...vn} is linearly independent, the transformed set {Av1, Av2, ...Avn} must also be independent unless the singular matrix A alters this property.
  • Others argue that a singular matrix can change the property of linear independence, suggesting that the transformed set cannot be independent.
  • A participant presents a mathematical argument showing that if the transformation leads to a non-trivial solution in a homogeneous system, the transformed vectors must be dependent.
  • Counterexamples are suggested, including the case where A is the zero matrix, which would lead to the transformed set being dependent.
  • Some participants question the assumption that the original set forms a basis, noting that linear independence does not necessarily imply basis status without additional context.
  • There is a discussion about the implications of having a trivial solution in the context of linear independence.
  • Participants highlight the need for clarity on definitions, particularly regarding what constitutes a singular matrix and its effects on vector sets.

Areas of Agreement / Disagreement

Participants do not reach a consensus on whether the transformed set {Av1, Av2, ...Avn} is linearly independent. Multiple competing views remain, with some asserting dependence due to the singular matrix and others providing counterexamples that suggest independence is possible under certain conditions.

Contextual Notes

Limitations include the assumption that the original set {v1, v2, ...vn} is a basis, which is not universally accepted in the discussion. The implications of singular matrices on linear independence are also not fully resolved, with various interpretations presented.
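The two counterexample cases raised in the thread (the zero matrix, and a nonzero singular matrix applied to a vector outside its kernel) can be checked numerically. Below is a minimal NumPy sketch; the specific vectors and matrices are illustrative choices, not taken from the discussion, and `matrix_rank` of the stacked images is used as the independence test (full column rank means independent).

```python
import numpy as np

# Case 1: A = 0 is singular, and it maps every independent set to zero vectors,
# so the images are dependent.
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.zeros((2, 2))
images = np.column_stack([A @ v1, A @ v2])
print(np.linalg.matrix_rank(images))  # 0: the images are dependent

# Case 2: A is singular but nonzero; a single vector not in ker(A)
# has a nonzero image, so the one-element set {Av1} stays independent.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])            # rank 1, det(A) = 0
v1 = np.array([1.0, 0.0])             # not in the kernel of A
print(np.linalg.matrix_rank(np.column_stack([A @ v1])))  # 1: {Av1} independent
```

This mirrors the thread's conclusion: with fewer than n vectors (no basis assumption), a singular A can preserve or destroy independence depending on how the vectors sit relative to its kernel.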

vince89
Can I ask for some help?

Suppose that {v1,v2...vn} is a linearly independent set of vectors and A is a singular matrix.
Prove or disprove: The set {Av1, Av2, ...Avn} is linearly independent.
 
How can you relate two different bases?
 
The vectors are the same, and they're not bases; they're just sets of vectors.
 
But the vectors of the first set are linearly independent, so they form a basis. If the second set has linearly independent members, it should also be a basis. Correct?
 
but doesn't the singular matrix change the property of linear independence?
 
So the members of the second set can not be independent! :smile:
 
Hmm... I don't quite get it. Can you write a rough proof? :) Thanks a lot!
 
Ok!
The first set forms a basis since the vectors are linearly independent.

Suppose now that the members of the 2nd set are linearly independent, thus they also form a basis. Two bases are related by a non-singular matrix, but in our case they are related by a singular one. Thus the members of the 2nd set are not linearly independent.

What about that?
 
how does the singular matrix change the linear independence of the basis?
 
  • #10
Let me show it with equations.

Call the vectors of the 2nd set [itex]\bar{v}^\alpha[/itex], then

[tex]\bar{v}^\alpha=A^\alpha_\beta\,v^\beta[/tex]

In order for [itex]\bar{v}^\alpha[/itex] to be linearly independent, it must hold that

[tex]\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_{\alpha}=0 \quad \forall \alpha[/tex]

Now we have

[tex]\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_\alpha \, A^\alpha_\beta\,v^\beta=0 \Rightarrow \lambda_\alpha \, A^\alpha_\beta=0[/tex]

The last equality holds because the [itex]v^\beta[/itex] are linearly independent.

This is an [itex]n\times n[/itex] <i>homogeneous</i> system for the unknowns [itex]\lambda_\alpha[/itex]. In order for it to have only the trivial solution, it must hold that [itex]\det(A)\neq 0[/itex]. But [itex]\det(A)=0[/itex], so there is a solution for [itex]\lambda_\alpha[/itex] besides the trivial one. Thus the vectors [itex]\bar{v}^\alpha[/itex] are dependent.
 
Last edited:
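The determinant argument above can also be checked numerically in the case it actually covers, namely when the [itex]v_i[/itex] are a basis of an n-dimensional space: stacking the basis vectors as the columns of an invertible matrix V, the images Av_i are the columns of AV, and rank(AV) = rank(A) < n whenever A is singular. A small NumPy sketch with illustrative matrices (not from the thread):

```python
import numpy as np

V = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns (1,0) and (1,1) form a basis of R^2
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # det(A) = 0, so A is singular

AV = A @ V                   # columns of AV are the images A v_i
# Since V is invertible, rank(AV) = rank(A) < 2, so the images are dependent.
print(np.linalg.matrix_rank(AV) < 2)  # True
```

So when the original set really is a basis, a singular A always produces a dependent image set; the disagreement in the thread is only about what happens without the basis assumption.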
  • #11
If there is a trivial solution, what does it imply? :) Thanks btw.
 
  • #12
How about just trying for a counter example? What is the simplest singular linear transformation you know?
 
  • #13
A trivial solution means [itex]\lambda_\alpha=0[/itex] for all [itex]\alpha[/itex], which would make the [itex]\bar{v}^\alpha[/itex] independent.
 
  • #14
Hello,

I hope I am not spoiling the fun, but I think things are getting confused.

Anything could happen here. Since we are not requiring that {v_1, ..., v_n} is a basis, here are two examples. Take any set of linearly independent vectors and let A = 0. Then of course their images are not linearly independent. On the other hand, take A to be some nonzero matrix, and let v_1 be any vector not in its kernel. Then {v_1} is linearly independent, and so is {Av_1}.
 
  • #15
masnevets said:
Hello,

I hope I am not spoiling the fun, but I think things are getting confused.

Anything could happen here. Since we are not requiring that {v_1, ..., v_n} is a basis, here are two examples. Take any set of linearly independent vectors and let A = 0. Then of course their images are not linearly independent. On the other hand, take A to be some nonzero matrix, and let v_1 be any vector not in its kernel. Then {v_1} is linearly independent, and so is {Av_1}.

The OP was:

Suppose that {v1,v2...vn} is a linearly independent set of vectors and A is a singular matrix.

Since the [itex]v_i[/itex] are linearly independent, they form a basis. Assuming, of course, that the dimension of the vector space is n. :smile:
 
  • #16
All you really need is a 'counterexample'. If {v1, v2, ..., vn} is a set of independent vectors, and A is the linear transformation that takes every v into 0, what can you say about {Av1, Av2, ..., Avn}?
 
  • #17
What's your definition of singular? If it means the columns are not independent, you are done, at least assuming you know the basic theory of dimension.
 
  • #18
Rainbow Child said:
The OP was:



Since the [itex]v_i[/itex] are linearly independent, then they form a basis. Assuming of course, the dimension of the vector space is n. :smile:
Yes, and masnevets' point was that there is no reason to assume that! In any case, his point was an extension of what I said: Suppose {v1, v2, ...} is a set of independent vectors and A is the zero operator (Av= 0 for all v).
 
