Can we prove linear independence with just matrix and vector information?


Discussion Overview

The discussion revolves around the question of whether it is possible to prove the linear independence of two vectors, V1 and V2, using only information about an n×n matrix A and the relationships A·V1 = V1 and A·V2 = 2·V2. The scope includes theoretical aspects of linear algebra and properties of eigenvectors and eigenvalues.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested

Main Points Raised

  • One participant suggests that the given information is sufficient to prove the linear independence of V1 and V2, citing the linearity of A.
  • Another participant references a general theorem stating that eigenvectors corresponding to different eigenvalues are linearly independent, implying that the vectors in question may fit this criterion.
  • A further contribution details a method of applying A to a linear combination of the vectors to explore their independence, indicating a potential approach to the proof.

Areas of Agreement / Disagreement

Participants express differing views on whether the provided information is sufficient for proving linear independence, with some asserting it is adequate while others imply that additional considerations may be necessary.

Contextual Notes

The discussion does not resolve whether the specific conditions of the vectors and matrix are sufficient for a definitive proof of linear independence, leaving open questions about the assumptions involved.

swaldon
Is it possible to prove that two vectors are linearly independent with just the following information?:

A is an n×n matrix. V1 and V2 are non-zero vectors in R^n such that A·V1 = V1 and A·V2 = 2·V2.

Is this enough information, or is more needed to prove the linear independence of the two vectors?
 
Yes, this is sufficient. Since A is linear, if V2 were a scalar multiple of V1, say V2 = c·V1, then A·V2 = c·A·V1 = c·V1 = V2, which contradicts A·V2 = 2·V2 for a non-zero vector V2.
 
This is a special case of a more general theorem: any set of eigenvectors of a matrix (linear transformation) is linearly independent provided the eigenvectors correspond to distinct eigenvalues.
 
Specifically, suppose C·V1 + D·V2 = 0 and apply A to both sides: A(C·V1 + D·V2) = C·A·V1 + D·A·V2 = C·V1 + 2D·V2 = 0. Now subtract the first equation from that one: this leaves D·V2 = 0, and since V2 ≠ 0, D = 0. The first equation then reduces to C·V1 = 0, and since V1 ≠ 0, C = 0 as well, so V1 and V2 are linearly independent.
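The argument above can be checked numerically. Below is a minimal sketch using a hypothetical 2×2 matrix A (chosen for illustration; it is not from the thread) that has eigenvalues 1 and 2, so its eigenvectors V1 and V2 satisfy the conditions of the original question. Linear independence of the pair shows up as the matrix [V1 V2] having full rank.

```python
import numpy as np

# Hypothetical example matrix with eigenvalues 1 and 2.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Eigenvectors: A @ v1 = v1 (eigenvalue 1), A @ v2 = 2 * v2 (eigenvalue 2).
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])

# Confirm the hypotheses from the original post.
assert np.allclose(A @ v1, v1)
assert np.allclose(A @ v2, 2 * v2)

# Independence: stacking v1 and v2 as columns gives a rank-2 matrix,
# i.e. C*v1 + D*v2 = 0 only for C = D = 0.
rank = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(rank)  # 2
```

The rank check is just a numerical stand-in for the subtraction argument in the proof; the proof itself holds for any n×n matrix and any eigenvectors with distinct eigenvalues.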
 
