# EigenValues & EigenVectors proofs

1. Jan 7, 2008

### Bertrandkis

Question 1:
Prove that if $\lambda$ is an eigenvalue of $A$, then $1/\lambda$ is an eigenvalue of $A^{T}$.

Question 2
Prove that a square matrix $A$ and its transpose $A^{T}$ have the same eigenvalues.

Question 3:
Show that $|\det(A)|$ is the product of the absolute values of the eigenvalues of $A$.

Question 4:
Let $A$ and $B$ be nonsingular $n \times n$ matrices. Show that $AB^{-1}$ and $B^{-1}A$ have the same eigenvalues.

2. Jan 7, 2008

### Jim Kata

I'll do the first one for you. I couldn't read exactly what you asked in the first one. Were you trying to prove that if the eigenvalues of $$A$$ are $$\lambda _i$$, then the eigenvalues of $$A^{ - 1}$$ are $$\frac{1}{{\lambda _i }}$$? I'll prove that. By the definition of an inverse matrix, $$AA^{-1}=A^{-1}A=1$$. Now, every nonsingular matrix ($$\det A \ne 0$$) can be diagonalized by its eigenvectors, $$A=PDP^{-1}$$, where

$$D = \left( {\begin{array}{*{20}c} {\lambda _1 } & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\lambda _n } \\ \end{array} } \right)$$

From this it should be obvious that $$A^{-1}=PD^{-1}P^{-1}$$. So $$A^{-1}A=PD^{-1}DP^{-1}=1$$ and $$AA^{-1}=PDD^{-1}P^{-1}=1$$. So $$DD^{-1}=1$$ and $$D^{-1}D=1$$. Multiply $$D$$ on the left and right side by

$$B = \left( {\begin{array}{*{20}c} {\frac{1} {{\lambda _1 }}} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\frac{1} {{\lambda _n }}} \\ \end{array} } \right)$$

and you will get $$1$$ so

$$D^{-1} = \left( {\begin{array}{*{20}c} {\frac{1} {{\lambda _1 }}} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\frac{1} {{\lambda _n }}} \\ \end{array} } \right)$$ and this proves that the eigenvalues of $$A^{-1}$$ are $$\frac{1}{{\lambda _i }}$$.
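As a numerical sanity check of the reciprocal-eigenvalue claim, here is a minimal Python sketch using an arbitrary diagonalizable 2×2 example (the matrix and the helper names `eig2`/`inv2` are made up for illustration); eigenvalues come from the quadratic formula applied to the characteristic polynomial.

```python
import math

def eig2(m):
    """Eigenvalues of a 2x2 matrix [[a, b], [c, d]], found as roots of the
    characteristic polynomial x^2 - tr(m) x + det(m) = 0 (assumes real roots)."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4.0, 1.0], [2.0, 3.0]]      # arbitrary example; eigenvalues 2 and 5
print(eig2(A))                    # eigenvalues of A
print(eig2(inv2(A)))              # eigenvalues of A^{-1}: the reciprocals, up to rounding
```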

3. Jan 7, 2008

### Defennder

For question 2 you'll need to use these properties:
$$(A-B)^T = A^T - B^T$$
$$\det A^T = \det A$$
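Putting those two properties together gives a short sketch of the argument for question 2 (also using $I^T = I$):

$$\det(A^T - \lambda I) = \det(A^T - \lambda I^T) = \det\left((A - \lambda I)^T\right) = \det(A - \lambda I),$$

so $A$ and $A^T$ have the same characteristic polynomial and hence the same eigenvalues.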

Question 3 is interesting. Never seen anything like it. No idea how to do it at present.

For question 4, you start with $$B^{-1}Ax = \lambda x$$
Then you multiply by $$A$$ on the left on both sides of the equation. What do you notice?

4. Jan 7, 2008

### HallsofIvy

Staff Emeritus
First, this is clearly homework and so I'm going to move it to the "Calculus and Beyond" homework section. Second, no work was shown at all, and Jim Kata should not have given the complete solution. Fortunately, the proof he gave was simply incorrect! In particular, it is NOT true that "every nonsingular matrix can be diagonalized by its eigenvectors". There exist many nonsingular matrices that cannot be diagonalized.

In fact, the first problem should include the condition "$\lambda \ne 0$" to be true (that is implied in the use of $1/\lambda$, but it should have been stated).
There is a very simple proof that if $Ax= \lambda x$ (and $\lambda \ne 0$) then $A^T x= (1/\lambda) x$ that does not use "diagonal matrices" etc., but just the fact that $A^TA= I$.
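Assuming $A^TA = I$ (i.e., that $A$ is orthogonal, which as the thread notes below is an extra hypothesis beyond invertibility), the chain is short:

$$Ax = \lambda x \;\Rightarrow\; A^T A x = \lambda\, A^T x \;\Rightarrow\; x = \lambda\, A^T x \;\Rightarrow\; A^T x = \frac{1}{\lambda}\, x.$$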

Last edited: Jan 7, 2008
5. Jan 7, 2008

### Bertrandkis

Special thanks to Defennder and Jim Kata for your contributions. HallsofIvy, I am not asking people to do my homework for me. The first 3 exercises are from some notes I downloaded from the internet; the fourth question is from the book "Introductory Linear Algebra with Applications" by Bernard Kolman, and it's not homework as you think.

I did not post my solution for the sake of keeping the post short, and as you can see from my post, I am struggling to format the mathematical notation.

I would appreciate it if you could point me toward theorems or give clues that will lead to the solution. In your 2 replies to my posts, it sounds like you are rebuking me, and that is not needed.

6. Jan 7, 2008

### malawi_glenn

Jim Kata and Bertrandkis: read and follow the rules; it is simple, just as HallsofIvy said.

7. Jan 7, 2008

### CompuChip

Is the fourth question even true?
In any case it is true that
$$B^{-1} A v = A B^{-1} v + [B^{-1}, A] v = \lambda v + [B^{-1}, A] v$$
if all the matrix products exist, where $[A, B] = AB - BA$ denotes the commutator. So at first glance I'd only expect the statement to be true if $A$ and $B^{-1}$ commute, or if every eigenvector of $B^{-1}A$ is also an eigenvector of the commutator.
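A direct numerical check suggests no commutation is actually needed: for 2×2 matrices the characteristic polynomial is determined by the trace and determinant, and both agree for $AB^{-1}$ and $B^{-1}A$. A minimal Python sketch (the matrices are arbitrary, non-commuting examples, and the helper names are made up):

```python
def matmul2(m, n):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def charpoly2(m):
    """(trace, det) of a 2x2 matrix: these determine x^2 - tr*x + det."""
    (a, b), (c, d) = m
    return (a + d, a * d - b * c)

A = [[1.0, 2.0], [3.0, 4.0]]   # arbitrary example matrices
B = [[2.0, 1.0], [1.0, 1.0]]   # det(B) = 1, so B is invertible
Binv = inv2(B)
print(charpoly2(matmul2(A, Binv)))   # same (trace, det) for both products,
print(charpoly2(matmul2(Binv, A)))   # hence the same characteristic polynomial
```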

8. Jan 7, 2008

### Defennder

What's a commutator? Doesn't $$AB^{-1}(Ax) = \lambda(Ax)$$ show they have the same eigenvalues?

9. Jan 8, 2008

### Jim Kata

In doing that, aren't you assuming $${\mathbf{A}} \in O\left( n \right)$$? I didn't see any mention of that in the question; I think it's just saying $${\mathbf{A}}\in GL\left( n \right)$$. I guess I was wrong in assuming that all nonsingular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues, I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.

10. Jan 8, 2008

### morphism

For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?

For 4, we can note that $$AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}$$. Using this, one can prove that $$AB^{-1}$$ and $$B^{-1}A$$ have the same characteristic polynomial.
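Spelling out the last step of that hint (a sketch, using the facts that $\det$ is multiplicative and $\det(B)\det(B^{-1}) = 1$):

$$\det(AB^{-1} - kI) = \det(B)\,\det(B^{-1}A - kI)\,\det(B^{-1}) = \det(B^{-1}A - kI),$$

so the two characteristic polynomials coincide, and hence so do the eigenvalues.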

11. Jan 8, 2008

### HallsofIvy

Staff Emeritus
True, I misread the problem and then confused $A^T$ with $A^{-1}$!

As for your last statement, "I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors", no, it's not. Just having eigenvectors is not enough. In order to be diagonalizable, a matrix must have a complete set of eigenvectors: that is, a basis for the space consisting entirely of eigenvectors. A diagonal matrix has its diagonal entries as eigenvalues, each corresponding to a basis vector $(1, 0, 0, \dots)$, $(0, 1, 0, \dots)$, etc. If a matrix has a repeated eigenvalue, it may or may not have two (or more) independent eigenvectors corresponding to that eigenvalue.

A simple example is
$$\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right]$$

It has a "double" eigenvalue, 1, with eigenvector (1, 0) (and multiples of that). Since there is no other independent eigenvector, it is not diagonalizable. Obviously, if we could write it as a diagonal matrix, it would have to be
$$\left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right]$$
and that is obviously not similar to
$$\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right]$$
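A tiny Python sketch of why that example has only a one-dimensional eigenspace (the helper name is made up): for A = [[1, 1], [0, 1]], the action is A(x, y) = (x + y, y), so Av = v forces y = 0.

```python
def is_unit_eigvec(x, y):
    """True iff (x, y) is a nonzero eigenvector of A = [[1, 1], [0, 1]]
    for its only eigenvalue, 1. Since A(x, y) = (x + y, y), the condition
    A v = v reduces to y = 0."""
    return (x + y, y) == (x, y) and (x, y) != (0, 0)

print(is_unit_eigvec(1, 0))   # True: (1, 0) spans the whole eigenspace
print(is_unit_eigvec(0, 1))   # False: no second independent eigenvector exists
```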

12. Jan 9, 2008

### Bertrandkis

Morphism, your explanation for question 4:
$$AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}$$ has confused me.
As far as I know, to prove that $AB^{-1}$ and $B^{-1}A$ have the same eigenvalues, one has to show that they have the same characteristic polynomial, viz.
$$\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A).$$ This is where I get stuck.

I tried to follow the suggestion by Defennder, starting from $$B^{-1}Ax = \lambda x$$ and
multiplying by $A$ on the left on both sides of the equation:
$$A B^{-1}Ax = A \lambda x,$$ then $$A B^{-1}Ax = \lambda A x.$$
This yields $$(A B^{-1})Ax = \lambda A x$$, and it can be concluded that
$$A B^{-1} = \lambda$$. This is not the desired conclusion.

Is there anyone out there who can give a black and white proof of this?

13. Jan 9, 2008

### Defennder

You've got it right up to here.

No, this doesn't follow. $$AB^{-1}$$ is an $n \times n$ matrix, whereas $$\lambda$$ is a constant, an eigenvalue. In general, you cannot simply cancel $Ax$ on both sides of a matrix equation; this is matrix algebra, not ordinary algebraic manipulation. Furthermore, note that your equation $$A B^{-1}Ax =\lambda A x$$ shows that $$AB^{-1}$$ has $$\lambda$$ as an eigenvalue, with $Ax$, rather than $x$, as an associated eigenvector.
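To make that point fully explicit (a sketch; note $Ax \ne 0$ because $A$ is nonsingular and $x \ne 0$):

$$B^{-1}Ax = \lambda x \;\Rightarrow\; (AB^{-1})(Ax) = A(B^{-1}Ax) = A(\lambda x) = \lambda\,(Ax),$$

so every eigenvalue $\lambda$ of $B^{-1}A$ is also an eigenvalue of $AB^{-1}$, with eigenvector $Ax$; the symmetric argument gives the converse.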

14. Jan 9, 2008

### Bertrandkis

Thanks Defennnder, Now I am with you. Thank you for your help.

15. Jan 9, 2008

### morphism

Use the fact that det(AB)=det(A)det(B).

16. Jan 9, 2008

### Defennder

How do you start to prove that? Evaluating determinants by cofactor expansion for $n \times n$ matrices becomes very messy.

17. Jan 9, 2008

### Dick

The characteristic polynomial is det(A-xI). Put x=0. You don't have to cofactor anything.

18. Jan 9, 2008

### Defennder

Isn't the characteristic polynomial $$\det(\lambda I -A)$$? And how is it possible to put $x=0$ (I'm assuming you mean $x$ as a constant, not a column vector) if 0 is not an eigenvalue of $A$, which holds if $A$ is invertible?

19. Jan 10, 2008

### HallsofIvy

Staff Emeritus
Are you serious? Dick wrote $\det(A- xI)$ and you wrote $\det(\lambda I - A)$. Those are the same thing, up to a factor of $(-1)^n$, with $x= \lambda$. The characteristic polynomial for any matrix is $\det(A- xI)$, which clearly has constant term $\det(A)$. The eigenvalues, $\lambda_1$, $\lambda_2$, etc., are the solutions to the equation $\det(A- xI)= 0$.

20. Jan 10, 2008

### Defennder

Well, actually that's not what I meant, but anyway I think I get morphism's point now. I just can't quite figure out why the absolute value of the constant term of the characteristic polynomial of $A$ equals the product of the absolute values of the eigenvalues of $A$.
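For the record, a sketch of the missing step (working over $\mathbb{C}$, counting eigenvalues with algebraic multiplicity, so the characteristic polynomial factors completely):

$$\det(A - xI) = \prod_{i=1}^{n}(\lambda_i - x) \;\Rightarrow\; \det(A) = \prod_{i=1}^{n}\lambda_i \;\Rightarrow\; |\det(A)| = \prod_{i=1}^{n}|\lambda_i|.$$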