# Eigenvalue and Eigenvector Proofs

Question 1:
Prove that if $\lambda$ is an eigenvalue of $A$, then $1/\lambda$ is an eigenvalue of $A^T$.

Question 2:
Prove that square matrices $A$ and $A^T$ have the same eigenvalues.

Question 3:
Show that $|\det(A)|$ is the product of the absolute values of the eigenvalues of $A$.

Question 4:
Let $A$ and $B$ be nonsingular $n \times n$ matrices. Show that $AB^{-1}$ and $B^{-1}A$ have the same eigenvalues.

I'll do the first one for you. I couldn't read exactly what you asked in the first one. Were you trying to prove that if the eigenvalues of $$A$$ are $$\lambda _i$$, then the eigenvalues of $$A^{ - 1}$$ are $$\frac{1}{{\lambda _i }}$$? I'll prove that. By the definition of an inverse matrix, $$AA^{-1}=A^{-1}A=1$$. Now, every nonsingular matrix ($$\det A \ne 0$$) can be diagonalized by its eigenvectors, $$A=PDP^{-1}$$, where

$$D = \left( {\begin{array}{*{20}c} {\lambda _1 } & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\lambda _n } \\ \end{array} } \right)$$

From this it should be obvious that $$A^{-1}=PD^{-1}P^{-1}$$. So $$A^{-1}A=PD^{-1}DP^{-1}=1$$ and $$AA^{-1}=PDD^{-1}P^{-1}=1$$, so $$DD^{-1}=1$$ and $$D^{-1}D=1$$. Multiply $$D$$ on the left and on the right by

$$B = \left( {\begin{array}{*{20}c} {\frac{1} {{\lambda _1 }}} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\frac{1} {{\lambda _n }}} \\ \end{array} } \right)$$

and you will get $$1$$ so

$$D^{-1} = \left( {\begin{array}{*{20}c} {\frac{1} {{\lambda _1 }}} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & {\frac{1} {{\lambda _n }}} \\ \end{array} } \right)$$ and this proves that the eigenvalues of $$A^{-1}$$ are $$\frac{1}{{\lambda _i }}$$.
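As a quick numerical sanity check of this claim, here is a sketch with NumPy, using an arbitrarily chosen symmetric (hence diagonalizable) nonsingular matrix:

```python
import numpy as np

# Arbitrary symmetric, positive-definite 3x3 matrix: nonsingular and diagonalizable
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eig_A = np.linalg.eigvals(A)
eig_Ainv = np.linalg.eigvals(np.linalg.inv(A))

# The eigenvalues of A^{-1} should be exactly the reciprocals of those of A
assert np.allclose(sorted(eig_Ainv), sorted(1.0 / eig_A))
```

The sorting is needed only because `eigvals` returns the eigenvalues in no guaranteed order.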

Defennder
Homework Helper
For question 2 you'll need to use these properties:
$$(A-B)^T = A^T - B^T$$
$$\det A^T = \det A$$
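Putting those two properties together gives $$\det(A^T - \lambda I) = \det((A - \lambda I)^T) = \det(A - \lambda I)$$, so the two characteristic polynomials coincide. A small NumPy check, with an arbitrary non-symmetric matrix chosen for illustration:

```python
import numpy as np

# Arbitrary non-symmetric real matrix
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 2.0]])

# A and A^T share a characteristic polynomial, hence the same eigenvalues
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))
assert np.allclose(eig_A, eig_AT)
```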

Question 3 is interesting. Never seen anything like it. No idea how to do it at present.

For question 4, you start with $$B^{-1}Ax = \lambda x$$
Then you multiply by $$A$$ on the left on both sides of the equation. What do you notice?

HallsofIvy
Homework Helper
First, this is clearly homework, so I'm going to move it to the "Calculus and Beyond" homework section. Second, no work was shown at all, and Jim Kata should not have given the complete solution. Fortunately, the proof he gave was simply incorrect! In particular, it is NOT true that "every nonsingular matrix can be diagonalized by its eigenvectors". There exist many nonsingular matrices that cannot be diagonalized.

In fact, the first problem should include the condition "$\lambda \ne 0$" to be true (that is implied in the use of $1/\lambda$, but it should have been said).
There is a very simple proof that if $Ax= \lambda x$ (and $\lambda \ne 0$) then $A^T x= (1/\lambda) x$ that does not use "diagonal matrices" etc., but just the fact that $A^TA= I$.

Special thanks to Defennnder and Jim Kata for your contributions. HallsofIvy, I am not asking for people to do my homework for me. The first three exercises are from some notes I downloaded from the internet; the fourth question is from the book "Introductory Linear Algebra with Applications" by Bernard Kolman, and it is not homework as you think.

I did not post my solution for the sake of keeping the post short, and as you can see from my post, I am struggling to format the mathematical notation.

I would appreciate it if you could point me toward theorems or give clues that will lead to the solution. In your two replies to my posts, it sounds like you are rebuking me, and that is not needed.

malawi_glenn
Homework Helper
Jim Kata and Bertrandkis: read and follow the rules. It is simple, just as HallsofIvy said.

CompuChip
Homework Helper
Is the fourth question even true?
In any case it is true that
$$B^{-1} A v = A B^{-1} v + [B^{-1}, A] v = \lambda v + [B^{-1}, A] v$$
if all the matrix products exist, where $$[A, B] = AB - BA$$ denotes the commutator. So at first glance I'd only expect the statement to be true if $$A$$ and $$B^{-1}$$ commute, or if every eigenvector of $$B^{-1}A$$ is also an eigenvector of the commutator.
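The identity itself is pure algebra: since $$[B^{-1}, A] = B^{-1}A - AB^{-1}$$, the two sides agree term by term. A minimal NumPy check with arbitrarily chosen matrices:

```python
import numpy as np

# Arbitrary matrices; B has det(B) = 1, so it is invertible
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Binv = np.linalg.inv(B)
v = np.array([1.0, 1.0])

comm = Binv @ A - A @ Binv  # the commutator [B^{-1}, A]

# B^{-1}A v = AB^{-1} v + [B^{-1}, A] v holds by definition of the commutator
assert np.allclose(Binv @ A @ v, A @ Binv @ v + comm @ v)
```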

Defennder
Homework Helper
What's a commutator? Doesn't $$AB^{-1}(Ax) = \lambda(Ax)$$ show they have the same eigenvalues?

There is a very simple proof that if $Ax= \lambda x$ (and $\lambda \ne 0$) then $A^T x= (1/\lambda) x$ that does not use "diagonal matrices" etc. but just the fact that $A^TA= I$.
In doing so, aren't you assuming $${\mathbf{A}} \in O\left( n \right)$$? I didn't see any mention of that in the question; I think it's just saying $${\mathbf{A}}\in GL\left( n \right)$$. I guess I was wrong in assuming that all nonsingular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues, I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.

morphism
Homework Helper
For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?

For 4, we can note that $$AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}$$. Using this, one can prove that $$AB^{-1}$$ and $$B^{-1}A$$ have the same characteristic polynomial.
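That identity says $$AB^{-1} - kI$$ and $$B^{-1}A - kI$$ are similar, and similar matrices have equal determinants. A sketch comparing the two characteristic polynomials at a few sample values of $$k$$, with arbitrarily chosen nonsingular matrices:

```python
import numpy as np

# Arbitrary nonsingular 2x2 matrices (det(A) = -1, det(B) = 1)
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Binv = np.linalg.inv(B)

# det(AB^{-1} - kI) = det(B) det(B^{-1}A - kI) det(B^{-1}) = det(B^{-1}A - kI)
for k in (-1.0, 0.5, 2.0):
    lhs = np.linalg.det(A @ Binv - k * np.eye(2))
    rhs = np.linalg.det(Binv @ A - k * np.eye(2))
    assert np.isclose(lhs, rhs)
```

Since two degree-$n$ polynomials agreeing at $n+1$ points are equal, checking enough sample points pins down the polynomial identity.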

HallsofIvy
Homework Helper
In doing so, aren't you assuming $${\mathbf{A}} \in O\left( n \right)$$? I didn't see any mention of that in the question; I think it's just saying $${\mathbf{A}}\in GL\left( n \right)$$. I guess I was wrong in assuming that all nonsingular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues, I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.
True, I misread the problem and then confused $A^T$ with $A^{-1}$!

As for your last statement, "I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors", no, it's not. Just having eigenvectors is not enough. In order to be diagonalizable, a matrix must have a complete set of eigenvectors, that is, a basis for the space consisting entirely of eigenvectors. A diagonal matrix has its diagonal entries as eigenvalues, each corresponding to one of the basis vectors (1, 0, 0, ...), (0, 1, 0, ...), etc. If a matrix has a repeated eigenvalue, it may or may not have two (or more) independent eigenvectors corresponding to that eigenvalue.

A simple example is
$$\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right]$$

It has a "double" eigenvalue, 1, with eigenvector (1, 0) (and multiples of it). Since there is no other independent eigenvector, it is not diagonalizable. Obviously, if we could write it as a diagonal matrix, it would have to be
$$\left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right]$$
and that is obviously not similar to
$$\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right]$$
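A quick numerical confirmation of this example (a NumPy sketch): the matrix has eigenvalue 1 with algebraic multiplicity 2, but $A - I$ has rank 1, so the eigenspace is only one-dimensional and no basis of eigenvectors exists.

```python
import numpy as np

# HallsofIvy's example: a 2x2 Jordan block
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Double eigenvalue 1 (the diagonal of an upper-triangular matrix) ...
assert np.allclose(np.linalg.eigvals(A), [1.0, 1.0])

# ... but A - I has rank 1, so the eigenspace for 1 is one-dimensional:
# there is no second independent eigenvector, hence A is not diagonalizable
assert np.linalg.matrix_rank(A - np.eye(2)) == 1
```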

Morphism, your explanation for question 4,
$$AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}$$,
has confused me.
As far as I know, to prove that $$AB^{-1}$$ and $$B^{-1}A$$ have the same eigenvalues, one has to show that they have the same characteristic polynomial, viz.
$$\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A)$$. This is where I get stuck.

I tried to follow the suggestion by Defennnder: starting from $$B^{-1}Ax = \lambda x$$,
multiply by $$A$$ on the left on both sides of the equation:
$$A B^{-1}Ax = A \lambda x$$, then $$A B^{-1}Ax = \lambda A x$$.
This yields $$(A B^{-1})Ax = \lambda A x$$, and it can be concluded that
$$A B^{-1} = \lambda$$. This is not the desired conclusion.

Is there anyone out there who can give a black-and-white proof of this?

Defennder
Homework Helper
I tried to follow the suggestion by Defennnder: starting from $$B^{-1}Ax = \lambda x$$,
multiply by $$A$$ on the left on both sides of the equation:
$$A B^{-1}Ax = A \lambda x$$, then $$A B^{-1}Ax = \lambda A x$$.
You've got it right up to here.

Bertrandkis said:
this yields $$(A B^{-1})Ax = \lambda A x$$ and it can be concluded that
$$A B^{-1} = \lambda$$. This is not the desired conclusion.
No, this doesn't follow. $$AB^{-1}$$ is an $$n \times n$$ matrix, whereas $$\lambda$$ is a constant, an eigenvalue. In general, you cannot simply cancel $$Ax$$ on both sides of a matrix equation; this is matrix algebra, not scalar algebra. Furthermore, note that your equation $$A B^{-1}Ax = \lambda A x$$ shows that $$AB^{-1}$$ has $$\lambda$$ as an eigenvalue, with $$Ax$$, rather than $$x$$, as an associated eigenvector.
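This step can be checked numerically; a sketch with arbitrarily chosen nonsingular matrices:

```python
import numpy as np

# Arbitrary nonsingular matrices (det(A) = -1, det(B) = 1)
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Binv = np.linalg.inv(B)

# Take an eigenpair (lam, x) of B^{-1}A ...
lams, xs = np.linalg.eig(Binv @ A)
lam, x = lams[0], xs[:, 0]

# ... then Ax is an eigenvector of AB^{-1} for the very same eigenvalue lam
y = A @ x
assert np.allclose(A @ Binv @ y, lam * y)
```

Note that $y = Ax$ is nonzero because $A$ is nonsingular and $x \ne 0$, so it really is an eigenvector.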

Thanks, Defennnder, now I'm with you. Thank you for your help.

morphism
Homework Helper
Morphism, your explanation for question 4,
$$AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}$$,
has confused me.
As far as I know, to prove that $$AB^{-1}$$ and $$B^{-1}A$$ have the same eigenvalues, one has to show that they have the same characteristic polynomial, viz.
$$\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A)$$. This is where I get stuck.
Use the fact that $$\det(AB)=\det(A)\det(B)$$.

Defennder
Homework Helper
For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?
How do you start proving that? Evaluating determinants by cofactor expansion for $$n \times n$$ matrices becomes very messy.

Dick
Homework Helper
How do you start proving that? Evaluating determinants by cofactor expansion for $$n \times n$$ matrices becomes very messy.
The characteristic polynomial is $$\det(A-xI)$$. Put $$x=0$$. You don't have to cofactor-expand anything.
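Putting Dick's observation together with morphism's hint answers question 3: the constant term of the characteristic polynomial is $$\det(A)$$, and its roots are the eigenvalues. A NumPy sketch with an arbitrary matrix:

```python
import numpy as np

# Arbitrary real matrix (it happens to have one real and two complex eigenvalues)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 2.0]])

# det(A - xI) at x = 0 is det(A), the constant term of the characteristic
# polynomial; the roots are the eigenvalues, so |det(A)| equals the product
# of the absolute values (moduli) of the eigenvalues.
eigs = np.linalg.eigvals(A)
assert np.isclose(abs(np.linalg.det(A)), np.prod(np.abs(eigs)))
```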

Defennder
Homework Helper
Isn't the characteristic polynomial $$\det(\lambda I -A)$$? And how is it possible to put $$x=0$$ (I'm assuming you mean $$x$$ as a constant, not a column vector) if 0 is not an eigenvalue of $$A$$, which holds if $$A$$ is invertible?

HallsofIvy
Homework Helper
Are you serious? Dick wrote $\det(A- xI)$ and you wrote $\det(\lambda I - A)$. Those are the same polynomial up to an overall sign, with $x= \lambda$, and have the same roots. The characteristic polynomial of any matrix is $\det(A- xI)$, which clearly has constant term $\det(A)$. The eigenvalues $\lambda_1$, $\lambda_2$, etc. are the solutions of the equation $\det(A- xI)= 0$.

Defennder
Homework Helper
Well, actually that's not what I meant, but anyway, I think I get morphism's point now. I just can't quite figure out why the absolute value of the constant term of the characteristic polynomial of A equals the product of the absolute values of the eigenvalues of A.

The roots of the characteristic polynomial are the eigenvalues. The constant term of a polynomial is, up to sign, the product of all its roots. The only thing I'm not clear on is how it's clear that det(A) is the constant term.

HallsofIvy
Homework Helper
??? No one has said that. In fact, it can't be true. You are again trying to set a matrix equal to a number!

Someone got a post in before mine; I was responding to "the absolute value of the constant term of the characteristic polynomial of A implies that it is the absolute value of the eigenvalues of A."

Now that you have said "absolute value of the product of the eigenvalues of A", it makes sense.

Someone said that it's true, sure. I just don't know off the top of my head how to prove it. But given my recent track record, I could just be overlooking something painfully obvious.

Defennder
Homework Helper
I think I've got it now. If you write out the matrix $$A-\lambda I$$ and do the cofactor expansion to evaluate its determinant, it becomes apparent that when $$\lambda=0$$ you are just evaluating $$\det(A)$$. I can't think of a rigorous way to prove it, though.
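For the record, the rigorous version is short. Over the complex numbers the characteristic polynomial splits into linear factors whose roots are the eigenvalues $\lambda_1, \dots, \lambda_n$ (counted with multiplicity):

```latex
\det(A - xI) = \prod_{i=1}^{n} (\lambda_i - x)
```

Setting $x = 0$ gives $\det A = \prod_{i=1}^{n} \lambda_i$, and taking absolute values yields $|\det A| = \prod_{i=1}^{n} |\lambda_i|$, with no cofactor expansion needed.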

I think this thread can be closed. Thank you to all who replied. How can I close it?