Solving for Invertibility of (B+I) Given B=B^2

  • Thread starter transcendency
  • #1

Homework Statement


Given a matrix B satisfying B = B^2, is (B + I) invertible?

2. The attempt at a solution

Taking determinants of B = B^2 gives det(B) = det(B)^2, so det(B) = 0 or 1.

rref(rref(B) + I) is I, so (rref(B) + I) is invertible

if det(B) = 1:
let E1E2...En = B
then E1E2...En(rref(B) + I) = B + E1E2...En

I'm not sure if what I did is even useful =(
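As a sanity check on the question itself, here is a short NumPy sketch (the construction is my own, not from the thread): it builds random idempotent matrices as B = S D S^-1 with D a diagonal of 0s and 1s, and confirms det(B + I) never comes out 0.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
for _ in range(5):
    # Build a random idempotent B = S D S^{-1}, D diagonal with 0/1 entries.
    S = rng.standard_normal((n, n))
    D = np.diag(rng.integers(0, 2, n).astype(float))
    B = S @ D @ np.linalg.inv(S)
    assert np.allclose(B @ B, B)              # B is idempotent
    det_BI = np.linalg.det(B + np.eye(n))
    assert abs(det_BI) > 1e-6                 # det(B + I) is never 0
```

Since the eigenvalues of such a B are 0 or 1, det(B + I) works out to a power of 2 in every trial, which is at least consistent with (B + I) being invertible.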
 
  • #2
What can you say about det(B + I)?
 
  • #3
If det(B) = 1, then B is invertible, so B^-1 B = B^-1 B^2 gives B = I,
so det(B + I) = det(2I) != 0, and B + I is invertible.

I'm still stuck on the det(B) = 0 case..

I'm quite sure that if B = B^2, then B must be a diagonal matrix with entries being either 1 or 0, but I don't know how to prove it.
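A quick NumPy check (my own example, not from the thread) suggests this conjecture is too strong as stated: there are non-diagonal idempotent matrices, such as the oblique projection below, so at best B is *similar* to such a diagonal matrix.

```python
import numpy as np

# A non-diagonal idempotent matrix: the oblique projection [[1, 1], [0, 0]].
B = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(B @ B, B)     # idempotent, yet clearly not diagonal
det_BI = np.linalg.det(B + np.eye(2))
print(det_BI)                    # ~2, so B + I is invertible here too
```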
 
  • #4
If B = B^2 then B^2 - B = B(B - I) = 0.
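One way to use a hint like this (a sketch of my own, assuming only B^2 = B): look for an inverse of B + I of the form aB + I. Expanding (B + I)(aB + I) = aB^2 + (a + 1)B + I = (2a + 1)B + I, which equals I when a = -1/2. A NumPy spot-check on the same expansion:

```python
import numpy as np

# Candidate inverse: (B + I)(I - B/2) = B - B^2/2 + I - B/2
#                                     = B - B/2 + I - B/2 = I  (using B^2 = B).
B = np.array([[1.0, 1.0],
              [0.0, 0.0]])          # any idempotent example will do
candidate = np.eye(2) - B / 2
left = (B + np.eye(2)) @ candidate
right = candidate @ (B + np.eye(2))
assert np.allclose(left, np.eye(2)) and np.allclose(right, np.eye(2))
```

So I - B/2 is a two-sided inverse of B + I whenever B^2 = B, with no case split on det(B) needed.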
 
  • #5
I'm quite sure that if B = B^2, then B must be a diagonal matrix with entries being either 1 or 0, but I don't know how to prove it.

That follows, up to similarity, from the fact that its minimal polynomial divides X^2 - X: the minimal polynomial has distinct roots, so B is diagonalizable, and the only possible eigenvalues are 0 and 1.

Alternatively, remember that a definition of an eigenvalue is:

t is an eigenvalue of X if and only if X - tI is not invertible.
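To spell that route out numerically (a sketch with a randomly built idempotent of my own): every eigenvalue t of B satisfies t^2 = t, so t is 0 or 1; in particular t = -1 never occurs, and B + I = B - (-1)I is therefore invertible.

```python
import numpy as np

# Build an idempotent B similar to diag(1, 1, 0) and inspect its spectrum.
rng = np.random.default_rng(1)
S = rng.standard_normal((3, 3))
B = S @ np.diag([1.0, 1.0, 0.0]) @ np.linalg.inv(S)
eigs = np.sort(np.linalg.eigvals(B).real)
print(eigs.round(6))   # entries are all ~0 or ~1; -1 is never an eigenvalue
```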
 
