Proving Invertible Matrix A Creates New Basis from {X1,X2,X3..Xn}

  • Context: Undergrad
  • Thread starter: mitch_1211
  • Tags: Basis, Matrix
SUMMARY

Multiplying each element of a basis {X1, X2, X3, ..., Xn} of an n-dimensional vector space by an invertible matrix A yields a new basis {AX1, AX2, AX3, ..., AXn}, because multiplication by an invertible matrix is a vector space isomorphism. The discussion confirms that if {vn} is a basis, then {Avn} remains a basis due to the linear independence of the transformed vectors. The proof establishes that the only solution to the equation c1(Au1) + c2(Au2) + c3(Au3) = 0 is c1 = c2 = c3 = 0, which confirms the vectors are linearly independent.

PREREQUISITES
  • Understanding of linear transformations and isomorphisms
  • Knowledge of basis and linear independence concepts
  • Familiarity with matrix operations, specifically with invertible matrices
  • Basic proficiency in vector space theory
NEXT STEPS
  • Study the properties of linear transformations in vector spaces
  • Learn about the implications of matrix rank and dimension in linear algebra
  • Explore the concept of vector space isomorphisms in greater detail
  • Investigate examples of invertible matrices and their effects on various bases
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in understanding the implications of linear transformations on vector spaces.

mitch_1211
Does multiplying each element of the basis {X1, X2, X3, ..., Xn} by the invertible matrix A create a new basis {AX1, AX2, AX3, ..., AXn}? Here the Xi are matrices.

I can prove, for example, that if {v1, v2, v3} is a basis then {u1, u2, u3} is a basis, where u1 = v1, u2 = v1 + v2, u3 = v1 + v2 + v3.

I set up the equation c1(u1) + c2(u2) + c3(u3) = 0 and determine whether the only solution is c1 = c2 = c3 = 0 or whether there are others. This settles linear independence or dependence, and because there is the right number of vectors (three u's and three v's), spanning follows automatically once the vectors are linearly independent.

I'm not sure how to approach the matrix basis case...
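To make that check concrete, here is a quick numeric sketch of the same computation (hypothetical vectors in R^3, using numpy; the specific numbers are just an illustration, not part of the original argument):

```python
import numpy as np

# Hypothetical basis {v1, v2, v3} of R^3, and u1 = v1, u2 = v1 + v2, u3 = v1 + v2 + v3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 0.0])
u1, u2, u3 = v1, v1 + v2, v1 + v2 + v3

# c1*u1 + c2*u2 + c3*u3 = 0 has only the trivial solution exactly when the
# matrix with columns u1, u2, u3 is invertible (nonzero determinant).
U = np.column_stack([u1, u2, u3])
print(np.linalg.det(U))  # nonzero, so {u1, u2, u3} is linearly independent
```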
 


If A is an invertible n × n matrix and {v1, ..., vn} is a basis for an n-dimensional vector space, then, yes, {Av1, ..., Avn} is also a basis.
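A quick numeric illustration of this statement (an assumed example in R^3 with numpy, not a proof): if the columns of V form a basis and A is invertible, then the columns of AV, i.e. the vectors Av_i, still form a basis.

```python
import numpy as np

# Columns of V form a hypothetical basis of R^3; A is an invertible matrix.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0]])   # columns v1, v2, v3
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 3.0]])   # det(A) = 6, so A is invertible

print(np.linalg.det(V))      # nonzero: the v_i form a basis
print(np.linalg.det(A @ V))  # det(A) * det(V), also nonzero: the A v_i form a basis
```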
 


This is only true if the transformation A is a vector space isomorphism, i.e., an invertible linear map. Otherwise, for A: V --> V (a map from a vector space to itself) one may have Axi = Axj for some i ≠ j; or, for A: V --> W with dim V < dim W, even if Axi ≠ Axj for all i ≠ j, the set {Ax1, ..., Axn} will not be a basis for W, though it will be a basis for A(V).
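A small numeric illustration of the second caveat (an assumed example, using numpy): a full-rank map from R^2 into R^3 sends a basis of R^2 to linearly independent vectors, but two vectors cannot span R^3; they only span the image A(V).

```python
import numpy as np

# A maps R^2 into R^3 (dim V = 2 < dim W = 3) and has full column rank.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

e1 = np.array([1.0, 0.0])   # standard basis of R^2
e2 = np.array([0.0, 1.0])
images = np.column_stack([A @ e1, A @ e2])

print(np.linalg.matrix_rank(images))  # 2: the images are independent, a basis of A(V)
# But rank 2 < 3, so {A e1, A e2} is not a basis of R^3.
```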
 


c1(Au1) + c2(Au2) + c3(Au3) = 0
A(c1u1 + c2u2 + c3u3) = 0        (by linearity of A)
c1u1 + c2u2 + c3u3 = 0           (multiply on the left by A^{-1})
c1 = 0, c2 = 0, c3 = 0           (the ui are linearly independent)
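For completeness, spanning can also be checked directly rather than by counting dimensions: any target vector w is a combination of the Au_i, with coefficients obtained by solving a linear system. A small numeric sketch with made-up numbers, using numpy:

```python
import numpy as np

# Columns of U are a hypothetical basis u1, u2, u3 of R^3; A is invertible.
U = np.array([[1.0, 1.0, 2.0],
              [0.0, 1.0, 2.0],
              [2.0, 3.0, 3.0]])
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 3.0]])
w = np.array([1.0, -2.0, 5.0])   # an arbitrary target vector

# Solve c1*(A u1) + c2*(A u2) + c3*(A u3) = w, i.e. (A U) c = w.
c = np.linalg.solve(A @ U, w)
print(np.allclose((A @ U) @ c, w))  # True: w lies in the span of {A u1, A u2, A u3}
```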
 


td21 said:
c1(Au1)+c2(Au2)+c3(Au3)=0
A(c1(u1)+c2(u2)+c3(u3))=0
c1(u1)+c2(u2)+c3(u3)=0
c1=0,c2=0,c3=0

Thank you everyone for all those ideas, and thank you td21 for making me realize what a simple task this is: A is invertible, so A(c1u1 + c2u2 + c3u3) = 0 forces c1u1 + c2u2 + c3u3 = 0, and since the u's form a basis the only solution is c1 = c2 = c3 = 0.
 
