Proving Basis of Av with Invertible A Matrix

In summary, the conversation is about proving that (Av1,Av2,...,Avn) is a basis for Rn, given that A is an invertible matrix and (v1,v2,...,vn) is a basis for Rn. The participants discuss different approaches to proving that the set spans and is linearly independent, and mention using the fact that A is invertible and its kernel to show that the coefficients must be zero.
  • #1
Chris Rorres
If A is an invertible matrix and vectors (v1,v2,...,vn) is a basis for Rn, prove that (Av1,Av2,...,Avn) is also a basis for Rn.
 
  • #2
Hi Chris!

Show us how far you get, and where you're stuck, and then we'll know how to help!
 
  • #3
why don't threads like this get moved to homework?
 
  • #4
Just lazy mentors!
 
  • #5
I need to prove that (Av1,Av2,...,Avn) spans Rn and is linearly independent, but this proof is so confusing to me that I don't even know where to start.
 
  • #6
You need to show that b_1 A(v_1) + ... + b_n A(v_n) = 0 implies b_1 = ... = b_n = 0, right? Well, since A is invertible, what is its kernel?
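The kernel hint above can be written out as a short proof sketch; a minimal LaTeX version of the linear-independence half of the argument:

```latex
Suppose $b_1 Av_1 + \cdots + b_n Av_n = 0$. By linearity,
\[
  A(b_1 v_1 + \cdots + b_n v_n) = 0 .
\]
Since $A$ is invertible, $\ker A = \{0\}$, so
\[
  b_1 v_1 + \cdots + b_n v_n = 0 .
\]
Because $(v_1,\dots,v_n)$ is linearly independent, $b_1 = \cdots = b_n = 0$.
Thus $(Av_1,\dots,Av_n)$ is a linearly independent set of $n$ vectors in
$\mathbb{R}^n$, and any such set is automatically a basis.
```

The last step uses the standard fact that n linearly independent vectors in Rn necessarily span Rn.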
 

1. What is the basis of A with an invertible A matrix?

The columns of an invertible n×n matrix A form a basis for Rn, because they are linearly independent and span Rn. Linear independence means that no column can be expressed as a linear combination of the other columns; no column is redundant.

2. How do you prove that A is invertible?

One way to prove that an n×n matrix A is invertible is to show that its columns are linearly independent. Set up the homogeneous system Ax = 0 and solve for x. If the only solution is the trivial solution x = 0, then the columns are linearly independent and A is invertible.
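The check described above can be sketched numerically. A minimal NumPy example (the matrix A here is a hypothetical illustration, not taken from the thread): the columns are independent exactly when the only solution of Ax = 0 is x = 0, which is equivalent to A having full rank.

```python
import numpy as np

# Hypothetical example: a 3x3 matrix whose columns we want to test.
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])

# If the only solution of A x = 0 is x = 0, the columns are linearly
# independent; equivalently, the rank of A equals n.
n = A.shape[0]
print(np.linalg.matrix_rank(A) == n)  # full rank, so A is invertible
```

If the rank were less than n, some nontrivial x would satisfy Ax = 0 and A would be singular.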

3. Can A be invertible if its columns are not linearly independent?

No. The columns of an invertible n×n matrix must form a basis for Rn, and a basis must consist of linearly independent vectors, so a matrix with linearly dependent columns cannot be invertible.

4. What is the relationship between the invertibility of A and its determinant?

The determinant of A measures how the volume of a parallelepiped changes under the transformation x ↦ Ax. A is invertible if and only if its determinant is non-zero, i.e., the transformation does not collapse the parallelepiped to zero volume.
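The determinant criterion can be illustrated with a quick NumPy check; both matrices below are hypothetical examples, one invertible and one singular:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det = 1*4 - 2*3 = -2, non-zero
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first, det = 0

print(abs(np.linalg.det(A)) > 1e-12)  # True: A is invertible
print(abs(np.linalg.det(B)) > 1e-12)  # False: B collapses volume, singular
```

Note that in floating point it is safer to compare the determinant against a small tolerance than against exactly zero.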

5. How does the invertibility of A affect the solution of a system of linear equations?

If A is invertible, then the system Ax = b has the unique solution x = A⁻¹b, which can be computed by standard matrix methods. If A is not invertible, the system has either no solution or infinitely many solutions, depending on b.
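A minimal sketch of the unique-solution case, with a hypothetical 2×2 system:

```python
import numpy as np

# Hypothetical system A x = b with invertible A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# det(A) = 2*3 - 1*1 = 5 != 0, so the solution exists and is unique.
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True: x really solves the system
```

`np.linalg.solve` raises `LinAlgError` when A is singular, which matches the "no solution or infinitely many" case above.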
