Solving Linear Algebra Problem: Proving A_t is Invertible

In summary: since A is invertible, taking the transpose of AA^-1 = A^-1A = I and using (AB)_t = B_t A_t shows that A_t is invertible with (A_t)^-1 = (A^-1)_t. The same idea of multiplying by a known inverse settles the follow-up problems: if A is invertible and AB = 0, then B = 0; and if A and B are square and AB is invertible, then A and B are each invertible.
  • #1
ak416
I'm just working through some problems trying to learn linear algebra on my own. Here's one I'm having trouble with: Let A be invertible. Prove that A_t is invertible and (A_t)^-1 = (A^-1)_t, where _t means transpose and ^-1 means inverse. I started trying to look at all the entries of a general n by n matrix, but it looks very tedious. Any trick here?
 
  • #2
It's an easy problem when you try it with linear algebra, instead of trying to do it as a computational matrix problem! What is the definition of inverse? What are the algebraic properties of transpose?
 
  • #3
Ya, actually it's pretty easy using the fact that (AB)_t = B_t A_t, because AA^-1 = A^-1A = I_n, so taking the transpose of these gives the answer.
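
Writing that step out, just to be explicit:

[tex]AA^{-1} = I \;\Rightarrow\; (AA^{-1})_t = I_t \;\Rightarrow\; (A^{-1})_t A_t = I[/tex]
[tex]A^{-1}A = I \;\Rightarrow\; (A^{-1}A)_t = I_t \;\Rightarrow\; A_t (A^{-1})_t = I[/tex]

so (A^-1)_t is both a left and a right inverse of A_t, which is exactly the claim (A_t)^-1 = (A^-1)_t.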

Here's another one: Prove that if A is invertible and AB = 0, then B = 0. I looked at it through the definition of matrix multiplication, and it's clear that A cannot be 0, but I still don't see why B has to be zero.
 
  • #4
Have you tried using the fact A is invertible?
 
  • #5
Ya, of course, but all I know is that there exists an A^-1 such that AA^-1 = A^-1A = I_n, so the sum giving each diagonal entry of AA^-1 equals 1 (when the row equals the column). So A is clearly not zero, but when you look at the entries of AB, you only know that each of those sums has to equal zero, and that can happen in many ways (with negative terms cancelling the positive ones), so I'm not sure.
 
  • #6
Well, you know that "invertible" means that [itex]A A^{-1} = A^{-1} A = I[/itex], right? That would suggest to me that you ought to do something to your equation so that [itex]A A^{-1}[/itex], [itex]A^{-1} A[/itex], or [itex]I[/itex] appears somewhere in it!
 
  • #7
Oh, OK, that was easy: AB = 0, so A^-1AB = 0, so B = 0! Am I right?
 
  • #8
Yep!

Of course, the full proof would be:

[tex]AB = 0[/tex]
[tex]A^{-1} AB = A^{-1} 0[/tex]
[tex]A^{-1} AB = 0[/tex]
[tex]I B = 0[/tex]
[tex]B = 0[/tex]

where I've made sure to do one step at a time.

I expect you already know these little details, and are aware you're skipping them for brevity, but I just want to make extra sure!
 
  • #9
Ya, thanks for that. I have one more question that's kinda bugging me. Let A and B be n x n matrices such that AB is invertible. Prove that A and B are invertible. Give an example to show that arbitrary matrices A and B need not be invertible if AB is invertible.

For the first part, I'm not sure. I know that AB(AB)^-1 = (AB)^-1(AB) = I. The only thing I can think of is switching (AB)^-1 to B^-1A^-1, but that can only be done once you know A^-1 and B^-1 exist. So what's the best approach? To try to prove that an A^-1 exists, or to guess something that might be an A^-1 and show that it works?
Also, for the second part, I don't know what they mean by arbitrary. I thought it's a fact that only n by n matrices are invertible.
 
  • #10
So what's the best approach? To try to prove that an A^-1 exists, or to guess something that might be an A^-1 and show that it works?
You won't know until you try. :smile: (I know how to turn either one of these ideas into a proof, but I don't know how much you've learned yet)

I thought it's a fact that only n by n matrices are invertible.
Yep. That's true...
 
  • #11
Well, I managed to get (AB)^-1 = I and so AB = I. If I can somehow get BA = I ... (or am I off track?)

edit: never mind, I kinda messed up my algebra.
 
  • #12
OK, I managed to do it, but I had to use the fact that for matrices, if AB = CB then A = C. Is this true?
 
  • #13
Unfortunately, no. :frown: What was your work? Maybe it can still be salvaged?
 
  • #14
Well, actually it's more of a special case: if AB = IB then A = I. I'm sure that's true, right?
 
  • #15
Nope. What if, for example, B = 0? (Or is diagonal with some entries zero)
 
  • #16
Well, what I did was: A(B(AB)^-1) = I (I'm guessing B(AB)^-1 is the inverse of A). Now, B(AB)^-1(AB) = BI = IB, and what I thought I could do was just cancel the B on the right and get B(AB)^-1 A = I (similarly for the inverse of B). So, any tips?
 
  • #17
Oh bleh! Of course this one won't be so simple! Sorry, I didn't think far enough ahead.

Did you notice that you haven't used the fact A and B are square matrices? That is a crucial assumption, so you will not be able to do the problem unless you use it in some way.

Incidentally, the work you did is still valuable: when you know that AB = I, we say that B is a right inverse of A, and that A is a left inverse of B. (So, the inverse of a matrix A is simply something that is both a right and a left inverse of A)

Left and right inverses can be important when working with nonsquare matrices. For example, you might want to solve the equation Ax=y, but A is not a square matrix! Fortunately, if A is left invertible, you could just multiply by one of its left inverses (say, B) and get x=By.
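
To spell that out: if B is a left inverse of A, so that BA = I, then

[tex]Ax = y \;\Rightarrow\; BAx = By \;\Rightarrow\; x = By[/tex]

so any solution x of Ax = y must equal By.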

(An example is when solving a system of linear equations when you have more equations than unknowns. Typically, the coefficient matrix is then left-invertible. You will probably not be taught to think of it this way, though!)


Bleh, then all the ways I know to do this problem involve something that isn't completely trivial. The most straightforward is with determinants. (If you've learned them)

While you don't have a theorem that AB = CB --> A = C, you do have the following:

If Av = Bv for every vector v, then A = B. (Similarly, if AC = BC for every matrix C, then A = B)

Also, if A is a non-invertible square matrix, then Av=0 has a nontrivial solution. You could probably use that to do this problem too.
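
For example, the determinant route would go roughly like this, using the product rule det(AB) = det(A) det(B) and the fact that a square matrix is invertible exactly when its determinant is nonzero:

[tex]\det(A)\det(B) = \det(AB) \neq 0[/tex]

so neither det(A) nor det(B) can be zero, and A and B are each invertible.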
 

1. What does "proving A_t is invertible" mean in linear algebra?

In linear algebra, proving that A_t is invertible means showing that there exists a matrix A_t^-1 such that A_t * A_t^-1 = A_t^-1 * A_t = I, where I is the identity matrix. Multiplying A_t by its inverse (on either side) gives the identity matrix, which plays the role that the number 1 plays in ordinary arithmetic. Equivalently, A_t being invertible means that the equation A_t * x = b has a unique solution x for every vector b.
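
For example, when A_t is invertible, that unique solution can be written down explicitly:

[tex]A_t x = b \;\Rightarrow\; x = (A_t)^{-1} b[/tex]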

2. Why is it important to prove that A_t is invertible in linear algebra?

Proving that A_t is invertible is important in linear algebra because it guarantees that the equation A_t * x = b has a unique solution x for every vector b. This matters when solving systems of linear equations, since it tells you that a solution exists and that it is the only one. Additionally, invertible matrices have many useful properties that come up throughout linear algebra and its applications, such as changing bases and working with eigenvalue problems.

3. What is the process for proving that A_t is invertible in linear algebra?

One process for proving that A_t is invertible runs as follows. First, show that the determinant of A_t is non-zero, since a non-zero determinant is a necessary and sufficient condition for invertibility; in this problem det(A_t) = det(A), so this follows immediately from the invertibility of A. The inverse of A_t can then be written down explicitly using the determinant and the adjugate matrix. Finally, the result can be verified by multiplying it with A_t and checking that the product is the identity matrix.
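
For reference, the adjugate formula referred to above is, for any invertible matrix M (take M = A_t here):

[tex]M^{-1} = \frac{1}{\det(M)}\,\operatorname{adj}(M)[/tex]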

4. Can A_t be invertible if it is a singular matrix?

No, A_t cannot be invertible if it is a singular matrix. A singular matrix is, by definition, one that does not have an inverse: there is no matrix that, when multiplied by A_t, gives the identity matrix. Equivalently, a singular matrix has determinant zero, whereas a non-zero determinant is necessary for invertibility. Therefore, if A_t is singular, it cannot be proven invertible.

5. Are there any other methods for proving that A_t is invertible in linear algebra?

Yes, there are alternative methods for proving that A_t is invertible in linear algebra. One is to show that the rank of A_t equals its number of columns, which is also a necessary and sufficient condition for invertibility of a square matrix. Another is to use Gaussian elimination to row-reduce A_t to the identity matrix, which likewise establishes invertibility and is, in practice, the most efficient way to actually compute the inverse. The determinant and adjugate formula, by contrast, is mainly useful for theoretical arguments and small matrices.
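
For the specific problem in this thread, the rank criterion applies directly, since transposing a matrix does not change its rank: for an invertible n x n matrix A,

[tex]\operatorname{rank}(A_t) = \operatorname{rank}(A) = n[/tex]

so A_t satisfies the rank condition and is therefore invertible.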
