EigenValues & EigenVectors proofs

  • #1
Bertrandkis
Question 1:
Prove that if λ is an eigenvalue of [A], then 1/λ is an eigenvalue of [A]^{-1}


Question 2
Prove that a square matrix [A] and its transpose [A]^T have the same eigenvalues.


Question 3:
Show that |det(A)| is the product of the absolute values of the eigenvalues of [A].

Question 4:
Let A and B be nonsingular n×n matrices. Show that AB^-1 and B^-1A have the same eigenvalues.
 
  • #2
I'll do the first one for you. I couldn't read exactly what you asked in the first one. Were you trying to prove that if the eigenvalues of [tex]A[/tex] are [tex]\lambda _i[/tex], then the eigenvalues of [tex]A^{ - 1}[/tex] are [tex]\frac{1}{{\lambda _i }}[/tex]? I'll prove that. By the definition of an inverse matrix, [tex]AA^{-1}=A^{-1}A=1[/tex]. Now, for every non singular matrix, [tex]\det A \ne 0[/tex], you can diagonalize it with its eigenvectors, [tex]A=PDP^{-1}[/tex], where

[tex]
D = \left( {\begin{array}{*{20}c}
{\lambda _1 } & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & {\lambda _n } \\

\end{array} } \right)
[/tex]


From this it should be obvious that [tex]A^{-1}=PD^{-1}P^{-1}[/tex]. So [tex]A^{-1}A=PD^{-1}DP^{-1}=1[/tex] and [tex]AA^{-1}=PDD^{-1}P^{-1}=1[/tex]. So [tex]DD^{-1}=1[/tex] and [tex]D^{-1}D=1[/tex]. Multiply [tex]D[/tex] on the left and right side by

[tex]
B = \left( {\begin{array}{*{20}c}
{\frac{1}
{{\lambda _1 }}} & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & {\frac{1}
{{\lambda _n }}} \\

\end{array} } \right)
[/tex]

and you will get [tex]1[/tex] so

[tex]
D^{-1} = \left( {\begin{array}{*{20}c}
{\frac{1}
{{\lambda _1 }}} & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & {\frac{1}
{{\lambda _n }}} \\

\end{array} } \right)
[/tex] and this proves that the eigenvalues of [tex]A^{-1}[/tex] are [tex]\frac{1}{{\lambda _i }}[/tex]
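This conclusion can be sanity-checked numerically. Below is a minimal sketch for the 2×2 case; the helper names `eig2` and `inv2`, and the example matrix, are made up for illustration, and the eigenvalues are found by solving the quadratic characteristic polynomial directly.

```python
# Check: the eigenvalues of A^{-1} are the reciprocals of those of A,
# illustrated on a 2x2 matrix stored row-major as (a, b, c, d).

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the quadratic formula on
    the characteristic polynomial x^2 - (a+d)x + (ad - bc)."""
    tr, det = a + d, a * d - b * c
    disc = (tr * tr - 4 * det) ** 0.5
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

def inv2(a, b, c, d):
    """Inverse of [[a, b], [c, d]], assuming det != 0."""
    det = a * d - b * c
    return (d / det, -b / det, -c / det, a / det)

A = (2, 1, 1, 2)             # eigenvalues 1 and 3
eigs_A = eig2(*A)
eigs_Ainv = eig2(*inv2(*A))  # expected: 1/3 and 1
print(eigs_A, eigs_Ainv)
```

For this symmetric example the reciprocals come out directly; the real caveat, raised later in the thread, is that the diagonalization argument only applies when A actually has a full set of eigenvectors.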
 
  • #3
For question 2 you'll need to use these properties:
[tex](A-B)^T = A^T - B^T [/tex]
[tex]det A^T = det A[/tex]

Question 3 is interesting. Never seen anything like it. No idea how to do it at present.

For question 4, you start with [tex]B^{-1}Ax = \lambda x \\[/tex]
Then you multiply by [tex]A[/tex] on the left on both sides of the equation. What do you notice?
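The determinant identity behind question 2 can also be checked concretely: det(A^T − λI) = det((A − λI)^T) = det(A − λI), so A and A^T have the same characteristic polynomial. A minimal 2×2 sketch (matrix chosen arbitrarily; for a 2×2 matrix the polynomial is determined by the trace and determinant):

```python
# For a 2x2 matrix the characteristic polynomial is x^2 - trace*x + det,
# so "same (trace, det)" means "same characteristic polynomial".

def charpoly2(a, b, c, d):
    """(trace, det) of [[a, b], [c, d]] -- the 2x2 char-poly coefficients."""
    return (a + d, a * d - b * c)

A  = (1, 2, 3, 4)
At = (1, 3, 2, 4)   # transpose: swap the off-diagonal entries
print(charpoly2(*A), charpoly2(*At))  # identical pairs
```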
 
  • #4
First, this is clearly homework, so I'm going to move it to the "Calculus and Beyond" homework section. Second, no work was shown at all, and Jim Kata should not have given the complete solution. Fortunately, the proof he gave was simply incorrect! In particular, it is NOT true that for "every non singular matrix, A, you can diagonalize it with its eigenvectors". There exist many non-singular matrices that cannot be diagonalized.

In fact, the first problem should include the condition "[itex]\lambda \ne 0[/itex]" to be true (that is implied by the use of [itex]1/\lambda[/itex], but it should have been said).
There is a very simple proof that if [itex]Ax= \lambda x[/itex] (and [itex]\lambda \ne 0[/itex]) then [itex]A^T x= (1/\lambda) x[/itex] that does not use "diagonal matrices" etc. but just the fact that [itex]A^TA= I[/itex].
 
Last edited by a moderator:
  • #5
Special thanks to Defennnder and Jim Kata for your contributions. HallsofIvy, I am not asking for people to do my homework for me. The first 3 exercises are from some notes I downloaded from the internet; the fourth question is from the book "Introductory Linear Algebra with Applications" by Bernard Kolman, and it's not homework as you think.

I did not post my solution for the sake of keeping the post short and, as you can see from my post, I am struggling to format the mathematical notation.

I would appreciate it if you could point me toward theorems or give clues that will lead to the solution. In your two replies to my posts, it sounds like you are rebuking me, and that is not needed.
 
  • #6
Jim Kata and Bertrandkis: read and follow the rules; it is simple, just as HallsofIvy said.
 
  • #7
Is the fourth question even true?
In any case it is true that
[tex] B^{-1} A v = A B^{-1} v + [B^{-1}, A] v = \lambda v + [B^{-1}, A] v[/tex]
if all the matrix products exist, where [A, B] = AB - BA denotes the commutator, so at first glance I'd only expect the statement to be true if A and B^{-1} commute, or if every eigenvector of B^{-1}A is also an eigenvector of the commutator.
 
  • #8
What's a commutator? Doesn't [tex]AB^{-1}(Ax) = \lambda(Ax) [/tex] show they have the same eigenvalues?
 
  • #9
HallsofIvy said:
There is a very simple proof that if [itex]Ax= \lambda x[/itex] (and [itex]\lambda \ne 0[/itex]) then [itex]A^T x= (1/\lambda) x[/itex] that does not use "diagonal matrices" etc. but just the fact that [itex]A^TA= I[/itex].

In doing such aren't you assuming [tex]{\mathbf{A}} \in O\left( n \right)[/tex]. I didn't see any mention of that in the question. I think it's just saying [tex]{\mathbf{A}}\in Gl\left( n \right)[/tex]. I guess I was wrong in assuming that all non singular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues in the question I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.
 
  • #10
For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?

For 4, we can note that [tex]AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}[/tex]. Using this, one can prove that [tex]AB^{-1}[/tex] and [tex]B^{-1}A[/tex] have the same characteristic polynomial.
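That identity says AB^{-1} − kI and B^{-1}A − kI are similar, so their determinants agree for every k, i.e. the characteristic polynomials coincide. A quick numerical check on arbitrary 2×2 matrices (the matrices and helper names below are made up for illustration):

```python
def matmul2(X, Y):
    """Product of two 2x2 matrices stored row-major as (a, b, c, d)."""
    a, b, c, d = X
    e, f, g, h = Y
    return (a*e + b*g, a*f + b*h, c*e + d*g, c*f + d*h)

def charpoly2(M):
    """(trace, det): the coefficients determining a 2x2 char poly."""
    a, b, c, d = M
    return (a + d, a*d - b*c)

A    = (2, 1, 1, 3)
Binv = (1, -1, 0, 1)   # inverse of B = [[1, 1], [0, 1]]
print(charpoly2(matmul2(A, Binv)), charpoly2(matmul2(Binv, A)))
```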
 
  • #11
Jim Kata said:
In doing such aren't you assuming [tex]{\mathbf{A}} \in O\left( n \right)[/tex]. I didn't see any mention of that in the question. I think it's just saying [tex]{\mathbf{A}}\in Gl\left( n \right)[/tex]. I guess I was wrong in assuming that all non singular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues in the question I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.
True, I misread the problem and then confused A^T with A^{-1}!

As for your last statement, "I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors", no, it's not. Just having eigenvectors is not enough. In order to be diagonalizable, a matrix must have a complete set of eigenvectors: that is, a basis for the space consisting entirely of eigenvectors. A diagonal matrix has its diagonal entries as eigenvalues, each corresponding to one of the basis vectors (1, 0, 0, ...), (0, 1, 0, ...), etc. If a matrix has a repeated eigenvalue, it may or may not have two (or more) independent eigenvectors corresponding to that eigenvalue.

A simple example is
[tex]\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right][/tex]

It has a "double" eigenvalue, 1, corresponding to eigenvector (1, 0) (and multiples of that). Since there is no other independent eigenvector, it is not diagonalizable. Obviously, if we could write it as a diagonal matrix, it would have to be
[tex]\left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right][/tex]
and that is obviously not similar to
[tex]\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right][/tex]
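The eigenvector deficiency can be seen directly: for this matrix, A − I = [[0, 1], [0, 0]], and (A − I)v = 0 forces the second component of v to be zero, so the eigenspace of the double eigenvalue 1 is only one-dimensional. A small sketch (the helper is made up for illustration):

```python
# Eigenvectors of A = [[1, 1], [0, 1]] for eigenvalue 1:
# (A - I) v = 0  <=>  v2 = 0, so all eigenvectors are multiples of (1, 0).

def is_eigenvector(v, lam):
    """Check A v == lam * v for the fixed matrix A = [[1, 1], [0, 1]]."""
    v1, v2 = v
    Av = (v1 + v2, v2)
    return Av == (lam * v1, lam * v2)

print(is_eigenvector((1, 0), 1))   # (1, 0) works
print(is_eigenvector((0, 1), 1))   # no second independent eigenvector
```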
 
  • #12
Morphism, your explanation for question 4:
[tex]AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}[/tex] has confused me.
As far as I know, to prove that AB^-1 and B^-1A have the same eigenvalues one has to show that they have the same characteristic polynomial viz
[tex]\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A)[/tex]. This is where I get stuck.

I tried to follow the suggestion by defennnder, starting from [tex]B^{-1}Ax = \lambda x \\[/tex]
multiply by A on the left on both sides of the equation,
[tex]A B^{-1}Ax =A \lambda x \\[/tex] then [tex]A B^{-1}Ax =\lambda A x \\[/tex]
this yields [tex](A B^{-1})Ax =\lambda A x \\[/tex] and it can be concluded that
[tex]A B^{-1}=\lambda \\[/tex]. This is not the desired conclusion.

Is there anyone out there who can give a black and white proof of this?
 
  • #13
Bertrandkis said:
I tried to follow the suggestion by defennnder, starting from [tex]B^{-1}Ax = \lambda x \\[/tex]
multiply by A on the left on both sides of the equation,
[tex]A B^{-1}Ax =A \lambda x \\[/tex] then [tex]A B^{-1}Ax =\lambda A x \\[/tex]
You've got it right up to here.


Bertrandkis said:
this yields [tex](A B^{-1})Ax =\lambda A x \\[/tex] and it can be concluded that
[tex]A B^{-1}=\lambda \\[/tex]. This is not the desired conclusion.
No, this doesn't follow. [tex]AB^{-1}[/tex] is an n×n matrix, whereas [tex]\lambda[/tex] is a constant, an eigenvalue. In general, you cannot simply cancel Ax on both sides of the matrix equation; this is matrix algebra, not scalar algebra. Furthermore, note that your equation [tex]A B^{-1}Ax =\lambda A x[/tex] shows that [tex]AB^{-1}[/tex] has [tex]\lambda[/tex] as an eigenvalue, with Ax, rather than x, as an associated eigenvector.
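Defennnder's point, that λ carries over while the eigenvector becomes Ax, can be verified on a small example (the matrices below are made up for illustration):

```python
def matvec2(M, v):
    """Apply a 2x2 matrix (a, b, c, d) to a vector (v1, v2)."""
    a, b, c, d = M
    return (a*v[0] + b*v[1], c*v[0] + d*v[1])

A    = (2, 0, 0, 3)
Binv = (1, -1, 0, 1)   # inverse of B = [[1, 1], [0, 1]]

x = (1, 0)             # eigenvector of B^-1 A with eigenvalue 2
BinvA_x = matvec2(Binv, matvec2(A, x))
assert BinvA_x == (2 * x[0], 2 * x[1])   # B^-1 A x = 2 x

Ax = matvec2(A, x)     # the claimed eigenvector of A B^-1
ABinv_Ax = matvec2(A, matvec2(Binv, Ax))
print(Ax, ABinv_Ax)    # A B^-1 (Ax) = 2 (Ax)
```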
 
  • #14
Thanks Defennnder, Now I am with you. Thank you for your help.
 
  • #15
Bertrandkis said:
Morphism, your explanation for question 4:
[tex]AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}[/tex] has confused me.
As far as I know, to prove that AB^-1 and B^-1A have the same eigenvalues one has to show that they have the same characteristic polynomial viz
[tex]\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A)[/tex]. This is where I get stuck.
Use the fact that det(AB)=det(A)det(B).
 
  • #16
morphism said:
For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?
How do you start to prove that? Evaluating determinants by cofactor expansion for n×n matrices becomes very messy.
 
  • #17
Defennnder said:
How do you start to prove that? Evaluating determinants by cofactor expansion for n×n matrices becomes very messy.

The characteristic polynomial is det(A-xI). Put x=0. You don't have to cofactor anything.
 
  • #18
Isn't the characteristic polynomial [tex]det(\lambda I -A)[/tex]? And how is it possible to put x=0 (I'm assuming you mean x as a scalar, not a column vector) if 0 is not an eigenvalue of A, which holds whenever A is invertible?
 
  • #19
Are you serious? Dick wrote det(A- xI) and you wrote det(A- [itex]\lambda[/itex] I). Those are exactly the same thing with x= [itex]\lambda[/itex]. The characteristic polynomial for any matrix is det(A- xI) which clearly has "constant term" det(A). The eigenvalues, [itex]\lambda_1[/itex], [itex]\lambda_2[/itex], etc. are the solutions to the equation det(A- xI)= 0.
 
  • #20
Well actually that's not what I meant, but anyway I think I got morphism's point now. I just can't quite figure out why the absolute value of the constant value of characteristic polynomial of A implies that it is the absolute value of the eigenvalues of A.
 
  • #21
The roots of the characteristic polynomial are the eigenvalues. The constant term of a polynomial is the product of all the roots. The only thing I'm not clear on is how it's clear that det(A) is the constant term.
 
  • #22
? No one has said that. In fact, it can't be true. You are again trying to set a matrix equal to a number!

Someone got a post in before mine - I was responding to "the absolute value of the constant value of characteristic polynomial of A implies that it is the absolute value of the eigenvalues of A."

Now that you have said "absolute value of the product of the eigenvalues of A", it makes sense.
 
Last edited by a moderator:
  • #23
Someone said that it's true, sure. I just don't know off the top of my head how to prove it. But given my recent track record, I could just be overlooking something painfully obvious.
 
  • #24
I think I got it now. If you write out the matrix [tex]A-\lambda I[/tex] and do the cofactor expansion to evaluate its determinant, then it becomes apparent that when lambda = 0, you'll just be evaluating the det of A. I can't think of a rigorous way to prove it, though.
 
  • #25
I think this thread can be closed. Thank you to all who replied. How can I close it?
 
  • #26
It can't be closed, only locked. And there's no reason to lock it. Only certain users can affix a [SOLVED] tag to the thread title.
 
  • #27
Dick gave a perfectly fine proof of the fact that det(A) is the constant term of the characteristic polynomial.
 
  • #28
I know that this is a ridiculous amount of time after the questions was asked, but I thought I would help with the first question.
As we all know, with eigenvalues, [tex]Ax = \lambda x[/tex].
The easiest way I have found to show that [tex]A^{-1}x = \frac{1}{\lambda} x[/tex]
is to use matrix multiplication to get [tex]A^{-1}[/tex] on the right side and then get to your [tex]1/\lambda[/tex] from there.
 
  • #29
Sure that works. Even though most of the people that have contributed to this messed up thread are dead, I'm sure they would have thanked you. :). Just kidding. Thanks.
 
  • #30
Well I thought since I just did that problem in my matrix theory homework that anyone else who googled it like I did would appreciate the hint versus just having the answer given to them. Well maybe most kids won't but I would have! :)
 
  • #31
morphism said:
For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?

Defennder said:
How do you start to prove that? Evaluating determinants by co-factor expansions for nxn matrices become very messy.
As I said the first time you posted this, you don't have to do any evaluation or computation. If [itex]\lambda_1[/itex], [itex]\lambda_2[/itex], ..., [itex]\lambda_n[/itex] are the eigenvalues of A, then [itex]det(\lambda I- A) = (\lambda-\lambda_1)(\lambda- \lambda_2)\cdot\cdot\cdot (\lambda- \lambda_n)[/itex]. Multiplying that out, the constant term is [itex](-1)^n\lambda_1\lambda_2\cdot\cdot\cdot\lambda_n[/itex]. If you have set up the characteristic equation as [itex]det(A- \lambda I)= 0[/itex], as I learned it, taking [itex]\lambda= 0[/itex] gives just the determinant; if, as some people learn, you set it up as [itex]det(\lambda I- A)= 0[/itex], it gives [itex](-1)^n[/itex] times the determinant.

In either case, it is correct to say that the absolute value of the determinant is equal to the absolute value of the constant term of the characteristic equation.
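This holds even when the eigenvalues are complex. As an illustrative sketch (matrix chosen arbitrarily), a rotation-scaling matrix [[a, −b], [b, a]] has eigenvalues a ± bi, and the product of their absolute values equals its determinant a² + b²:

```python
# |det A| equals the product of the |eigenvalue|s, checked on a matrix
# with a complex-conjugate eigenvalue pair: A = [[3, -4], [4, 3]].

a, b = 3, 4
det_A = a * a + b * b                       # det [[a, -b], [b, a]]
lam1, lam2 = complex(a, b), complex(a, -b)  # eigenvalues a + bi, a - bi
product = abs(lam1) * abs(lam2)
print(det_A, product)                       # both equal 25
```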
 

Related to EigenValues & EigenVectors proofs

1. What is the definition of an eigenvalue?

An eigenvalue of a matrix (or linear transformation) is a scalar λ for which there exists a nonzero vector v with Av = λv. Applying the matrix to such an eigenvector returns the same vector scaled by λ, without changing its direction.

2. How do you find the eigenvalues of a matrix?

To find the eigenvalues of a matrix, you need to solve the characteristic equation det(A-λI) = 0, where A is the matrix and λ is the eigenvalue. This equation will result in a polynomial, and the roots of this polynomial are the eigenvalues of the matrix.
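For a 2×2 matrix this recipe reduces to a quadratic. A minimal worked sketch (matrix chosen arbitrarily for illustration):

```python
# det(A - lam*I) = 0 for A = [[4, 1], [2, 3]]:
# (4 - lam)(3 - lam) - 1*2 = lam^2 - 7*lam + 10 = (lam - 2)(lam - 5)

trace, det = 4 + 3, 4 * 3 - 1 * 2
disc = (trace ** 2 - 4 * det) ** 0.5
roots = sorted([(trace - disc) / 2, (trace + disc) / 2])
print(roots)   # [2.0, 5.0]
```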

3. What is the significance of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important in many areas of mathematics and science, including linear algebra, differential equations, and physics. They are used to describe the behavior of linear transformations and to solve systems of differential equations. In addition, they are used in data analysis and machine learning for dimensionality reduction and feature extraction.

4. How do you prove that a vector is an eigenvector of a matrix?

To prove that a vector is an eigenvector of a matrix, you need to show that it satisfies the equation Av = λv, where A is the matrix, v is the eigenvector, and λ is the corresponding eigenvalue. This can be done by multiplying the vector by the matrix and comparing it to the product of the vector and the eigenvalue.

5. Can a matrix have more than one eigenvalue?

Yes, a matrix can have multiple eigenvalues. In fact, counted with multiplicity, an n×n matrix has exactly n eigenvalues over the complex numbers. However, some matrices have repeated eigenvalues, which means they have fewer distinct eigenvalues than their dimension.
