- #1


I don't know how to prove this: matrix A is nilpotent, so A^k = 0 for some k. Prove that det(A+I) = 0.

Thank you so much :-)


- Thread starter lukaszh


- #2

matt grime

Science Advisor

Homework Helper


What you state is trivially false: take A = 0, which is nilpotent, and det(0 + I) = det(I) = 1, not zero.
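This counterexample is easy to check numerically. A quick sketch using NumPy (the nonzero strictly upper triangular matrix is my own illustrative choice, not from the thread):

```python
import numpy as np

# A = 0 is nilpotent (A^1 = 0), yet det(0 + I) = det(I) = 1, not 0.
A = np.zeros((3, 3))
I = np.eye(3)
print(np.linalg.det(A + I))

# The same happens for a nonzero nilpotent matrix: a strictly upper
# triangular 3x3 matrix N satisfies N^3 = 0.
N = np.array([[0., 1., 5.],
              [0., 0., 2.],
              [0., 0., 0.]])
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
print(np.linalg.det(N + I))
```

Both determinants come out as 1, which is what the corrected problem statement below asserts.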

- #3


By the way, this shows that 0 is an eigenvalue of A. Something stronger is true: 0 is the ONLY eigenvalue of A.

I think lukaszh's problem statement should be: prove that det(A+I) = 1.

- #4


- #5


First, find all the eigenvalues of A+I. Then, use what you know about how to calculate the determinant of a linear map given its eigenvalues.
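This hint can be checked on a concrete case. A sketch with NumPy (the particular nilpotent matrix is a hypothetical example of mine): all eigenvalues of A + I turn out to be 1, and their product matches the determinant.

```python
import numpy as np

# Hypothetical example: a strictly upper triangular matrix is nilpotent.
A = np.array([[0., 2., -1.],
              [0., 0.,  3.],
              [0., 0.,  0.]])
I = np.eye(3)
assert np.allclose(np.linalg.matrix_power(A, 3), 0)

eigs = np.linalg.eigvals(A + I)
print(np.sort(eigs))       # every eigenvalue of A + I equals 1
print(np.linalg.det(A + I))  # the determinant is their product
```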

- #6


So I know that, assuming A is diagonalizable,

[tex]A=S\Lambda S^{-1}[/tex]

The eigenvalues of A are [tex]\{\lambda_1,\lambda_2,\cdots,\lambda_n\}[/tex] for an [tex]n\times n[/tex] matrix. If I add the identity to both sides, then

[tex]A+I=S\Lambda S^{-1}+I=S\Lambda S^{-1}+SS^{-1}=S(\Lambda+I)S^{-1}[/tex]

Its determinant is

[tex]\mathrm{det}(A+I)=\mathrm{det}\left(S(\Lambda+I)S^{-1}\right)=\mathrm{det}(\Lambda+I)=\prod_{j=1}^{n}(\lambda_j+1)[/tex]

I don't know how to use the fact that [tex]A^k=0[/tex] :-(

- #7



[tex]\lambda[/tex] is an eigenvalue of A + I if and only if

[tex](A + I)x = \lambda x[/tex]

for some nonzero vector x. This is true if and only if

[tex]Ax = (\lambda - 1)x[/tex]

which is true if and only if [tex]\lambda - 1[/tex] is an eigenvalue of A.

I claim that [tex]A^k = 0[/tex] implies that all of the eigenvalues of A are zero. If you can prove this claim then it implies the result you want.
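For completeness, here is a sketch of that claim in the thread's notation. Suppose [tex]x \neq 0[/tex] and [tex]Ax = \lambda x[/tex]. Applying A repeatedly gives

[tex]A^kx=\lambda^kx[/tex]

Since [tex]A^k=0[/tex], this forces [tex]\lambda^kx=0[/tex], and because [tex]x\neq 0[/tex] we get [tex]\lambda^k=0[/tex], hence [tex]\lambda=0[/tex]. By the equivalence above, every eigenvalue of A + I is then 0 + 1 = 1, and since the determinant is the product of the eigenvalues (with multiplicity), [tex]\mathrm{det}(A+I)=1[/tex].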

- #8


Beautiful, thank you.
