Proving eigenvalues and diagonalizability

  • Thread starter: Yoonique
  • Tags: Eigenvalues

Yoonique
< Mentor Note -- thread moved to HH from the technical math forums, so no HH Template is shown >

Let A be a square matrix of order n such that ##A^2 = I##.
a) Prove that if -1 is the only eigenvalue of A, then A = -I.
b) Prove that if 1 is the only eigenvalue of A, then A = I.
c) Prove that A is diagonalizable.

For parts a and b, I assumed A = -I and A = I respectively, and showed that the eigenvalues are -1 and 1. Is it valid to assume A = -I (or A = I) and then prove that the eigenvalue is -1 (or 1)?
If ##A = -I##, ##Av = \lambda v##
##-v = \lambda v##
therefore ##\lambda = -1##.
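(And for part b, the same computation with ##A = I##: ##Av = \lambda v## gives ##v = \lambda v##, so ##\lambda = 1##.)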

For part c, I have no clue how to even start. I did some research, but I haven't learned about the minimal polynomial, so I can't use it in the proof.
 
Consider the expression ##(t \cdot E_n - A) \cdot (t \cdot E_n + A)## and the zeros of the characteristic polynomial of ##A## to prove c).
a) and b) are direct consequences of c).
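(Spelling out the algebra behind that hint: since ##t \cdot E_n## commutes with ##A##, we get ##(t \cdot E_n - A) \cdot (t \cdot E_n + A) = t^2 \cdot E_n - A^2##, and by assumption ##A^2 = E_n##.)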
 
Yoonique said:
Let A be a square matrix of order n such that ##A^2 = I##.
a) Prove that if -1 is the only eigenvalue of A, then A = -I.
b) Prove that if 1 is the only eigenvalue of A, then A = I.
c) Prove that A is diagonalizable.

For parts a and b, I assumed A = -I and A = I respectively, and showed that the eigenvalues are -1 and 1. Is it valid to assume A = -I (or A = I) and then prove that the eigenvalue is -1 (or 1)?
No. You are proving the converse of what you're asked for. For a), assume that -1 is the only eigenvalue of A, then show that A must be -I. Same idea for b).
Yoonique said:
If ##A = -I##, ##Av = \lambda v##
##-v = \lambda v##
therefore ##\lambda = -1##.

For part c, I have no clue how to even start. I did some research, but I haven't learned about the minimal polynomial, so I can't use it in the proof.
 
Mark44 said:
No. You are proving the converse of what you're asked for. For a), assume that -1 is the only eigenvalue of A, then show that A must be -I. Same idea for b).
##Av = \lambda v##
##Av = -v##
##Av = -Iv##
##A = -I##
Can I deduce ##A = -I## from that?
 
fresh_42 said:
Consider the expression ##(t \cdot E_n - A) \cdot (t \cdot E_n + A)## and the zeros of the characteristic polynomial of ##A## to prove c).
a) and b) are direct consequences of c).
I haven't learned any of that in my course. Can you explain it more explicitly? I know the eigenvalues of A are -1 and 1.
 
Yoonique said:
##Av = \lambda v##
##Av = -v##
##Av = -Iv##
##A = -I##
Can I deduce ##A = -I## from that?
You've omitted what's given, and a step or two. How do you go from equation 2 above to equation 3?
 
Yoonique said:
I haven't learned any of that in my course. Can you explain it more explicitly? I know the eigenvalues of A are -1 and 1.
##A## is diagonalizable iff there is an invertible matrix ##S## such that ##S^{-1} A S = diag (λ_1, ... , λ_n)##
##diag (λ_1, ... , λ_n)## denotes a diagonal matrix with entries ##λ_1, ... , λ_n## on its diagonal and 0 elsewhere. ##E_n = diag(1, ... , 1)##

The characteristic polynomial ##char_A (t) ## of ##A## in ##t## is defined as ##det ( t \cdot E_n - A )##. Now
## char_A (t) \cdot det ( t \cdot E_n + A )##
##= det ( t \cdot E_n - A ) \cdot det ( t \cdot E_n + A )##
##= det [( t \cdot E_n - A ) \cdot ( t \cdot E_n + A )]##
##= det (t^2 \cdot E_n - A^2)##
##= det (t^2 \cdot E_n - E_n)##
##= (t^2 - 1)^n##
##= (t-1)^n (t+1)^n##

The ##λ_i## are exactly the zeros of ##char_A (t)##. The equation above shows that ##(t - 1)## and ##(t + 1)## are the only possible linear divisors of ##char_A (t)##, and therefore all ##λ_i## must be 1 or -1, which is the key step toward c); a) and b) then follow.

I admit there may be an easier way to prove this without the determinant and the characteristic polynomial.
Maybe I'm too used to them to see it. Which rules about matrices are you expected to use?
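If a concrete case helps (a toy illustration, not part of the problem): for ##n = 2## take ##A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}##. Then ##A^2 = E_2##, ##char_A (t) = t^2 - 1 = (t-1)(t+1)##, and with ##S = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}## one checks that ##S^{-1} A S = diag(1, -1)##.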
 
Mark44 said:
You've omitted what's given, and a step or two. How do you go from equation 2 above to equation 3?
I multiplied both sides by the identity matrix: IAv = -Iv, and IAv is still Av.
 
fresh_42 said:
##A## is diagonalizable iff there is an invertible matrix ##S## such that ##S^{-1} A S = diag (λ_1, ... , λ_n)##
##diag (λ_1, ... , λ_n)## denotes a diagonal matrix with entries ##λ_1, ... , λ_n## on its diagonal and 0 elsewhere. ##E_n = diag(1, ... , 1)##
I don't believe that the OP is this far along in his/her studies.
Try to keep your hints aligned with what the person you're attempting to help can handle.
fresh_42 said:
The characteristic polynomial ##char_A (t) ## of ##A## in ##t## is defined as ##det ( t \cdot E_n - A )##. Now
##det ( t \cdot E_n - A ) \cdot det ( t \cdot E_n + A )##
##= det [( t \cdot E_n - A ) \cdot ( t \cdot E_n + A )]##
##= det (t^2 \cdot E_n - A^2)##
##= det (t^2 \cdot E_n - E_n)##
##= (t^2 - 1)^n##
##= (t-1)^n (t+1)^n##
 
  • #10
Yoonique said:
I multiplied both sides by the identity matrix: IAv = -Iv, and IAv is still Av.
Yes, that's clear. What I really meant was how did you go from equation 3 to equation 4?
 
  • #11
Mark44 said:
Yes, that's clear. What I really meant was how did you go from equation 3 to equation 4?
I'm not sure whether it's right, but I compared them as if comparing coefficients.
 
  • #12
Yoonique said:
I'm not sure whether it's right, but I compared them as if comparing coefficients.
The trouble is that this isn't guaranteed to work when you're doing matrix multiplication. For example, it's possible to have AB = 0 where neither A nor B is the zero matrix.
For example:
##A = \begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix}## and ##B = \begin{bmatrix} 0 & 2 \\ 0 & 0 \end{bmatrix}##
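Checking the product: ##AB = \begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 2 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}##, even though ##A \ne 0## and ##B \ne 0##.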

So starting here: ##Av = -Iv##, do you have any other ideas? You know that v is an eigenvector, right? Is there a property of eigenvectors you can use?
 
  • #13
Mark44 said:
The trouble is that this isn't guaranteed to work when you're doing matrix multiplication. For example, it's possible to have AB = 0 where neither A nor B is the zero matrix.
For example:
##A = \begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix}## and ##B = \begin{bmatrix} 0 & 2 \\ 0 & 0 \end{bmatrix}##

So starting here: ##Av = -Iv##, do you have any other ideas? You know that v is an eigenvector, right? Is there a property of eigenvectors you can use?
The only properties I know are that they are linearly independent and they form a basis for R^n.
 
  • #14
Yoonique said:
The only properties I know are that they are linearly independent and they form a basis for R^n.
"They" - for each problem part, you have only one eigenvector.
Can any old vector be an eigenvector, or are there any restrictions?
 
  • #15
If I know that rank(I - A) + rank(I + A) = n, does that mean the dimensions of the eigenspaces of A add up to n, so that there are n linearly independent eigenvectors and A is diagonalizable?
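(My reasoning, if it's valid: by rank-nullity, ##\dim \ker(I - A) = n - rank(I - A)## and ##\dim \ker(I + A) = n - rank(I + A)##, and these kernels are exactly the eigenspaces for 1 and -1, so their dimensions would add up to ##2n - n = n##.)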
 
  • #16
Mark44 said:
"They" - for each problem part, you have only one eigenvector.
Can any old vector be an eigenvector, or are there any restrictions?
An eigenvector cannot be the zero vector?
 
  • #17
Yoonique said:
An eigenvector cannot be the zero vector?
Yes!
OK, now starting from here -- Av = -Iv, can you get another equation and conclude that A = -I?
 
  • #18
Mark44 said:
Yes!
OK, now starting from here -- Av = -Iv, can you get another equation and conclude that A = -I?
Another equation? (A+I)v=0?
 
  • #19
Yoonique said:
Another equation? (A+I)v=0?
That's what I was steering you toward. With what you know about v, and what you know about A (from post #1), why can you conclude that A + I must be the zero matrix?

Note that because of the way matrix multiplication works, it's possible to have ##B\vec{x} = \vec{0}## with ##\vec{x} \ne \vec{0}## and B not equal to the zero matrix. For example, with ##B = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}## and ##\vec{x} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}##.
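(To verify: ##B\vec{x} = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}##, so ##B\vec{x} = \vec{0}## even though neither factor is zero.)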
 
  • #20
Mark44 said:
Try to keep your hints aligned with what the person you're attempting to help can handle.
Mea culpa. You're right.
 
  • #21
Mark44 said:
That's what I was steering you toward. With what you know about v, and what you know about A (from post #1), why can you conclude that A + I must be the zero matrix?

Note that because of the way matrix multiplication works, it's possible to have ##B\vec{x} = \vec{0}## with ##\vec{x} \ne \vec{0}## and B not equal to the zero matrix. For example, with ##B = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}## and ##\vec{x} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}##.
I have no clue for this :/
 
  • #22
Yoonique said:
I have no clue for this :/
Oh wait. I think I have an idea. Does it mean that rank(A+I)=0?
 
  • #23
Yoonique said:
Oh wait. I think I have an idea. Does it mean that rank(A+I)=0?
Yes, but I'm not sure that is helpful.

You are given that ##A^2 = I##. You definitely want to use that information.
 
  • #24
Yoonique said:
Oh wait. I think I have an idea. Does it mean that rank(A+I)=0?
If you find it, tell me. I'm stuck, too. What Mark wrote was that you cannot conclude from (A+I)v = 0 that v = 0 or A + I = 0 (see his example with ##B\vec{x} = \vec{0}##). The difficulty is to show that it holds anyway. Somewhere ## A^2 = 1 ## or (A+I)(A-I) = 0 should be used.
 
  • #25
fresh_42 said:
If you find it, tell me. I'm stuck, too. What Mark wrote was that you cannot conclude from (A+I)v = 0 that v = 0 or A + I = 0 (see his example with ##B\vec{x} = \vec{0}##). The difficulty is to show that it holds anyway. Somewhere ## A^2 = 1 ## or (A+I)(A-I) = 0 should be used.
(A+I)v = (A+I)(A-I)? I don't know how it can help me conclude A + I = 0.
 
  • #26
fresh_42 said:
Somewhere ## A^2 = 1 ## or (A+I)(A-I)=0 should be used.
That would be ##A^2 = I## rather than ##A^2 = 1##.
If ##A^2 = I##, then ##|A^2| = ?##
 
  • #27
Mark44 said:
That would be ##A^2 = I## rather than ##A^2 = 1##.
If ##A^2 = I##, then ##|A^2| = ?##
I assume |A^2| means the determinant? Then ##|A|^2 = |A^2| = |I| = 1##, so the determinant of A is 1 or -1. If it means length, then it is 1.
 
  • #28
Yoonique said:
I assume |A^2| means the determinant?
Yes, that's standard notation for the matrix determinant.
Yoonique said:
Then ##|A|^2 = |A^2| = |I| = 1##, so the determinant of A is 1 or -1. If it means length, then it is 1.
The determinant doesn't measure the "length" of a matrix.
It's useful to determine whether a matrix has an inverse (##|A| \ne 0##) or doesn't have an inverse (##|A| = 0##).

However, the important fact here is that ##A^2 = I##. If the product of two square matrices is I, then the two matrices are inverses of each other. What does this fact imply about A?

BTW, if it seems like I've changed course on this, I have. I was thinking that if ##A^2 = I##, then ##|A^2| = 1##, from which it follows that ##|A| = \pm 1## which is true. I was then thinking that it must be the case that A = I or A = -I, which does not necessarily follow. A counterexample is this matrix:
##B = \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix}##
For this matrix, |B| = 1 for any value of ##\theta##, but ##B \ne I##, in general.
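(Indeed ##|B| = \cos^2 \theta + \sin^2 \theta = 1## for every ##\theta##, so knowing only the determinant can't force ##B = I##.)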
 
  • #29
Mark44 said:
Yes, that's standard notation for the matrix determinant.
The determinant doesn't measure the "length" of a matrix.
It's useful to determine whether a matrix has an inverse (##|A| \ne 0##) or doesn't have an inverse (##|A| = 0##).

However, the important fact here is that ##A^2 = I##. If the product of two square matrices is I, then the two matrices are inverses of each other. What does this fact imply about A?

BTW, if it seems like I've changed course on this, I have. I was thinking that if ##A^2 = I##, then ##|A^2| = 1##, from which it follows that ##|A| = \pm 1## which is true. I was then thinking that it must be the case that A = I or A = -I, which does not necessarily follow. A counterexample is this matrix:
##B = \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix}##
For this matrix, |B| = 1 for any value of ##\theta##, but ##B \ne I##, in general.
A is invertible?
 
  • #30
Yoonique said:
A is invertible?
Yes.
##A^2 = A \cdot A = I## (or ##E_n##, or however the identity matrix is denoted) means that ##A## is its own inverse.
Just as ## (-1) \cdot (-1) = 1##.
Let us first assume all eigenvalues are +1.

By multiplying out we see that ##A^2 - I = (A + I)(A - I) = 0##.
What does this mean for ##im(A - I) = \{ v \mid v = (A - I)w \text{ for some } w \}## and ##ker(A + I) = \{ v \mid (A + I)v = 0 \}##?
Can you calculate ##ker(A + I)##?
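A sketch of where this can lead, if I'm not overlooking something: every vector splits as ##v = \frac{1}{2}(I + A)v + \frac{1}{2}(I - A)v##. Since ##A(I + A)v = (A + A^2)v = (I + A)v##, the first summand lies in ##ker(A - I)##, and since ##A(I - A)v = (A - A^2)v = -(I - A)v##, the second lies in ##ker(A + I)##. So eigenvectors for 1 and -1 together span the whole space, which gives c).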
 