Linear Algebra, Matrix Inverse Proof

  • Thread starter patata
  • #1
patata

Homework Statement


Let A, B and 2A + B be invertible n x n matrices. Show that A^-1 + 2B^-1 is also invertible, and express (A^-1 + 2B^-1)^-1 in terms of A, B and (2A + B)^-1.

The Attempt at a Solution


I'm not exactly sure how to tackle this problem. I know that for a matrix to be invertible, we have to be able to multiply it (both pre- and post-multiply) by some other matrix so that the product gives the identity. However, in this case I'm completely lost as to what to do with the sum of the matrices inside the brackets. Any hints or suggestions? Can I let C = A^-1 + 2B^-1, say, to simplify matters? But even if I do that, I'm not sure how to proceed.
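The pre/post-multiplication definition of invertibility mentioned above can be experimented with numerically. A minimal sketch, using an arbitrary example matrix (not from the problem itself):

```python
import numpy as np

# Arbitrary invertible example matrix (hypothetical, just for experimenting):
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
Ainv = np.linalg.inv(A)

# Invertibility means pre- AND post-multiplying by the inverse
# both give the identity matrix.
print(np.allclose(A @ Ainv, np.eye(2)))  # True
print(np.allclose(Ainv @ A, np.eye(2)))  # True
```

Checks like this don't prove anything, but they are a quick way to test a guessed formula for an inverse before trying to prove it algebraically.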

I've also been reading about something called the 'Binomial Inverse Theorem', but again, I'm not too sure whether I'm barking up the wrong tree, or how to apply it to this case.

Thanks for any and all help
 

Answers and Replies

  • #2
patata
As some attempted working: if I multiply A(A^-1 + 2B^-1)2B, does that give me I? Essentially, what I'm finding most confusing is how to deal with the terms in the brackets.
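The product asked about here can be checked numerically. A quick sketch with random matrices (hypothetical test data; a random square matrix is invertible with probability 1) shows that A(A^-1 + 2B^-1)2B is not the identity, since expanding gives (I + 2AB^-1)2B = 2B + 4A = 2(2A + B):

```python
import numpy as np

rng = np.random.default_rng(0)
# Random 3x3 matrices; generically invertible (hypothetical test data).
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# The product proposed in the post above:
M = A @ (np.linalg.inv(A) + 2 * np.linalg.inv(B)) @ (2 * B)

print(np.allclose(M, np.eye(3)))        # False: not the identity
print(np.allclose(M, 2 * (2 * A + B)))  # True: it equals 2(2A + B)
```

So the pre/post factors A and 2B don't produce I, but the check does hint that sandwiching A^-1 + 2B^-1 between suitable matrices produces something closely related to 2A + B.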
 
  • #3
boboYO
The distributive law holds for matrices:

C(A + B) = CA + CB

So to expand

(A + B)(C + D),

first expand the left bracket:

A(C + D) + B(C + D)

and then expand the right brackets similarly.
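The two-step expansion described above can be sanity-checked numerically. A minimal sketch with random matrices (hypothetical test data, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
# Four random 3x3 matrices (hypothetical test data).
A, B, C, D = (rng.standard_normal((3, 3)) for _ in range(4))

# Expand (A + B)(C + D): left bracket first, then the right brackets.
lhs = (A + B) @ (C + D)
rhs = A @ C + A @ D + B @ C + B @ D

print(np.allclose(lhs, rhs))  # True
```

Note the order of the factors is preserved in every term (A @ C, not C @ A); the distributive law holds for matrices, but commutativity does not.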
 
