Proving Invertible Matrix A Creates New Basis from {X1,X2,X3..Xn}

  • Context: Undergrad
  • Thread starter: mitch_1211
  • Tags: Basis, Matrix
Discussion Overview

The discussion revolves around whether multiplying an invertible matrix A by a basis set {X1, X2, X3, ..., Xn} results in a new basis {AX1, AX2, AX3, ..., AXn}. The scope includes theoretical aspects of linear algebra, specifically focusing on the properties of bases and linear transformations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant questions how to approach proving that the set {AX1, AX2, AX3, ..., AXn} forms a basis when A is an invertible matrix.
  • Another participant asserts that if A is an invertible n by n matrix and {vn} is a basis for an n-dimensional vector space, then {Avn} is also a basis.
  • It is noted by a different participant that the assertion holds only if A is a linear map; otherwise, the resulting set may not form a basis for the target space.
  • Several participants engage in a mathematical exploration of linear independence, suggesting that if the transformation is valid, the only solution to the linear combination equating to zero would be the trivial solution.

Areas of Agreement / Disagreement

Participants express differing views on the conditions under which the transformed set {AX1, AX2, AX3, ..., AXn} constitutes a basis. While some agree on the validity of the transformation under certain conditions, others highlight potential exceptions based on the nature of the transformation.

Contextual Notes

There are unresolved assumptions regarding the nature of the transformation A, particularly whether it is a linear map and the implications of dimensionality between the vector spaces involved.

mitch_1211
Does multiplying each element of the basis {X1, X2, X3, ..., Xn} by the invertible matrix A create a new basis {AX1, AX2, AX3, ..., AXn}? (Here the Xn are matrices.)

I can prove, for example, that if {v1, v2, v3} is a basis then {u1, u2, u3} is a basis, where u1 = v1, u2 = v1 + v2, u3 = v1 + v2 + v3.

I set up the equation c1(u1) + c2(u2) + c3(u3) = 0 and determine whether the only solution is c1 = c2 = c3 = 0 or whether there are others. This settles linear independence or dependence, and because there are the right number of vectors (three u's for three v's), spanning follows automatically once the vectors are linearly independent.

I'm not sure how to approach the matrix basis case...
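For the concrete example in the post above, the independence check can be sketched numerically (a NumPy sketch, not from the thread; taking {v1, v2, v3} to be the standard basis of R^3 is my own illustrative choice):

```python
import numpy as np

# Standard basis of R^3 as the rows of the identity (illustrative choice).
v1, v2, v3 = np.eye(3)

# The transformed vectors: u1 = v1, u2 = v1 + v2, u3 = v1 + v2 + v3.
u1 = v1
u2 = v1 + v2
u3 = v1 + v2 + v3

# c1*u1 + c2*u2 + c3*u3 = 0 has only the trivial solution exactly when
# the matrix whose columns are u1, u2, u3 has full rank (rank 3).
U = np.column_stack([u1, u2, u3])
print(np.linalg.matrix_rank(U))  # 3, so {u1, u2, u3} is a basis
```

The same full-rank test works for any candidate set: three vectors in R^3 form a basis if and only if stacking them as columns gives rank 3.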
 


If A is an invertible n by n matrix and {vn} is a basis for an n-dimensional vector space, then, yes, {Avn} is also a basis.
 


This is only true if the transformation A is a linear map, i.e., a vector space isomorphism. Otherwise, one may have Axi = Axj for A: V --> V, i.e., a map from a vector space to itself; or, if you have A: V --> W with dim V < dim W, then even if Axi =/= Axj, the set {Ax1, ..., Axn} will not be a basis for W, though it will be a basis for A(V).
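The dim V < dim W case above can be illustrated with a small sketch (my own example, not from the thread): an injective linear map A: R^2 --> R^3 sends the standard basis of R^2 to a linearly independent set, but two vectors can only span the 2-dimensional image A(V), never all of R^3.

```python
import numpy as np

# An injective linear map A: R^2 -> R^3 (illustrative choice of matrix).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Images of the standard basis vectors of R^2 are the columns of A @ I.
Ax = A @ np.eye(2)

# The images are linearly independent (rank 2)...
print(np.linalg.matrix_rank(Ax))  # 2
# ...but two vectors cannot span the 3-dimensional space W = R^3,
# so {Ax1, Ax2} is a basis only for the image A(V), not for W.
```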
 


c1(Au1) + c2(Au2) + c3(Au3) = 0
A(c1(u1) + c2(u2) + c3(u3)) = 0
c1(u1) + c2(u2) + c3(u3) = 0    (multiplying both sides by A^-1)
c1 = 0, c2 = 0, c3 = 0    (since the ui are linearly independent)
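The derivation above can be checked numerically for a concrete invertible A (a sketch with an arbitrarily chosen A and the standard basis; any invertible matrix and any basis would do):

```python
import numpy as np

# An arbitrary invertible 3x3 matrix (det = 1 here; illustrative choice).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
assert abs(np.linalg.det(A)) > 1e-12  # A is invertible

# A basis {u1, u2, u3}: here, the standard basis of R^3 as columns.
U = np.eye(3)

# The columns of A @ U are Au1, Au2, Au3.  The equation
# c1*Au1 + c2*Au2 + c3*Au3 = 0 reads (A @ U) c = 0; since A @ U is
# invertible, c = 0 is the unique solution, so the columns have full rank.
AU = A @ U
c = np.linalg.solve(AU, np.zeros(3))  # unique solution of (A @ U) c = 0
print(c)                              # [0. 0. 0.]
print(np.linalg.matrix_rank(AU))      # 3: {Au1, Au2, Au3} is a basis
```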
 


td21 said:
c1(Au1) + c2(Au2) + c3(Au3) = 0
A(c1(u1) + c2(u2) + c3(u3)) = 0
c1(u1) + c2(u2) + c3(u3) = 0
c1 = 0, c2 = 0, c3 = 0

Thank you everyone for all those ideas, and thank you td21 for making me realize what a simple task this is: A is invertible, so Av = 0 only when v = 0; the u's form a basis and are therefore linearly independent, so the only solution is c1 = c2 = c3 = 0.
 
