Proving A & A^T Share Same Eigenvalue

  • #1
evilpostingmong

Homework Statement



Show that A and A^T share the same eigenvalue.

Homework Equations


The Attempt at a Solution


let v be the eigenvector
Av=Icv
since A^Tv=I^Tcv
and I^T=I,
A^Tv=Icv
so A^Tv=Icv=Av
so A and A^T must have the same eigenvalue.
 
  • #2
evilpostingmong said:
Av=Icv
since A^Tv=I^Tcv

How did you go from the first to the second step?
 
  • #3
Cyosis said:
How did you go from the first to the second step?

If Av=Icv, then the transpose of Icv is Icv, since the transpose of I is I.
Now (Icv)^T=(Av)^T since Icv=Av
since neither c nor v are matrices, they get pushed aside to
get I^Tcv=A^Tv
 
  • #4
Yes, but a column vector is an m×1 matrix. When you transpose an m×1 matrix you get a 1×m matrix, called a row vector. My point is that v and v transposed aren't exactly the same.
 
  • #5
Cyosis said:
Yes, but a column vector is an m×1 matrix. When you transpose an m×1 matrix you get a 1×m matrix, called a row vector. My point is that v and v transposed aren't exactly the same.

Oh okay. So (Av)^T=(Ivc)^T=(Ic)^T(v)^T=c(I)^T(v)^T=cI(v)^T, which has the same eigenvalue
as A, so A^T and A both have the same eigenvalue.
 
  • #6
I don't quite see how you drew your conclusion. [itex](Av)^T=v^T A^T[/itex]. Just try to compute [itex]I v^T[/itex] with a small vector, you will notice they are incompatible.
 
  • #7
Cyosis said:
I don't quite see how you drew your conclusion. [itex](Av)^T=v^T A^T[/itex]. Just try to compute [itex]I v^T[/itex] with a small vector, you will notice they are incompatible.

ok, how about this: don't factor out v^T; keep it this way:
since Av=cIv, (Av)^T=(cIv)^T, and
factor out c to get c(Iv)^T, which can be transposed.
wait, never mind...I'm fixing this.
 
  • #8
Sure, but how do you conclude from that that A^T has eigenvalue c?
 
  • #9
Oh that's right. Let me reprove this. Let c be at position j,j on A. Transposing
the matrix lets c remain at j,j. So multiplying this by
the eigenvector v gives the column matrix Icv. This gives
the same result as not transposing A in the first place, since c will
still remain at j,j no matter what, and as a result it would wind up at position j,1 on the
column matrix. Now, since Av=A^Tv,
c must be in both matrices A and A^T.
Take note that we know that eigenvalues are found on the diagonals of
invertible matrices, which is why j,j was chosen.
 
  • #10
The matrix

[tex]
\left( \begin{matrix} 1&1 \\ 1&1 \end{matrix} \right)
[/tex]

has eigenvalues 2 and 0, neither of which appears on the diagonal.

I would personally use the inner product to prove this.
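As a quick sanity check on that counterexample, the eigenvalues of a 2×2 matrix can be read off its characteristic polynomial λ² − (trace)λ + det. A minimal Python sketch (the helper name `eig2x2` is made up for illustration, and real eigenvalues are assumed):

```python
import math

# Eigenvalues of a 2x2 matrix [[a, b], [c, d]] as the roots of the
# characteristic polynomial lambda^2 - (a+d)*lambda + (a*d - b*c).
def eig2x2(a, b, c, d):
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

# Cyosis's matrix: all entries 1.
print(eig2x2(1, 1, 1, 1))  # eigenvalues 0.0 and 2.0, neither on the diagonal
```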
 
  • #11
You could also think about using det(A)=det(A^T).
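Spelled out, that suggestion works because transposing does not change a determinant, so A and A^T have the same characteristic polynomial (a sketch, assuming det(M)=det(M^T) is available):

[tex]
\det(A^T - \lambda I) = \det\left((A - \lambda I)^T\right) = \det(A - \lambda I)
[/tex]

so every root λ of one characteristic polynomial is a root of the other.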
 
  • #12
Oh, one thing: I am using the Axler book as my main book, and he covers eigenvectors
before inner products, and everything before determinants, so I can't apply those,
but I do appreciate your suggestions. Given Cyosis's matrix, though, it makes sense that
matrices whose entries are all equal can have eigenvalues besides 0; I messed
up there because I didn't consider that.
So I'll split my proof into 3 cases. Case 1: A is an invertible matrix. Case 2: A is a matrix
where all entries are the same. Case 3: A is noninvertible and not all of its entries are the
same. Is this a good idea? I would use determinants or inner products if I knew what they were. If there is no other way, I won't do this proof until
later on, when I know more about them. But thank you for the suggestions.

Take note that when I feel I need to practice a bit more, I use other sources, but they vary in topic sequence.
 
  • #13
I find it hard to believe you are studying eigenvectors/eigenvalues yet have not been exposed to determinants. This would mean you are not capable of calculating eigenvalues of matrices yet, but your posts suggest otherwise. Could you show me perhaps how to calculate the eigenvalues of the matrix I posted earlier?
 
  • #14
You could also work around not having determinants yet. If c is an eigenvalue, then X=A-cI is a singular matrix. Can you prove X is singular iff X^T is singular? How would that prove what you want to prove?
 
  • #15
Dick said:
You could also work around not having determinants yet. If c is an eigenvalue, then X=A-cI is a singular matrix. Can you prove X is singular iff X^T is singular? How would that prove what you want to prove?

I can see three justifications for X to be singular. First, if A=cI, then X is the zero
matrix, Xv=0 for every vector v, and X is singular for being
the zero matrix.

Second, if A-cI is nonzero but singular, then as long as the number of rows of
the column matrix (representing a vector) equals the number of columns
in A and cI, Av would equal cIv, so Av-cIv=0.

Third, if A-cI were nonsingular, then (A-cI)v would not be zero,
so we would have Av=/=cIv, and in this case v is not an eigenvector of A.

I know this isn't the actual proof, just want to see if I understand the
significance of X being singular.
 
  • #16
Singular doesn't mean A-cI equals 0. It means there is a nonzero vector v such that (A-cI)v=0. Which is what your second and third arguments correctly say. You are overcomplicating this already. The question you should be thinking about, instead of trying to find multiple ways to prove the obvious, is: why, if X=A-cI is singular, is X^T also singular?
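The thread avoids determinants, but as a purely numerical sanity check one can verify on a small example that X = A - cI and X^T are singular together. A Python sketch (`det2` and `transpose2` are ad-hoc helpers, not from any library):

```python
# For the 2x2 matrix A = [[1, 1], [1, 1]] with eigenvalues 0 and 2,
# check that X = A - cI and its transpose are both singular, using the
# 2x2 determinant ad - bc as the singularity test.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def transpose2(m):
    return [[m[0][0], m[1][0]], [m[0][1], m[1][1]]]

A = [[1, 1], [1, 1]]
for c in (0, 2):
    X = [[A[0][0] - c, A[0][1]], [A[1][0], A[1][1] - c]]
    print(c, det2(X), det2(transpose2(X)))  # both determinants are 0
```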
 
  • #17
Dick said:
Singular doesn't mean A-cI equals 0. It means there is a nonzero vector v such that (A-cI)v=0. Which is what your second and third arguments correctly say. You are overcomplicating this already. The question you should be thinking about, instead of trying to find multiple ways to prove the obvious, is: why, if X=A-cI is singular, is X^T also singular?

X=A-cI is a jxn matrix with j=/=n.

Consider the column with elements a_j,1 to a_j,n.
If j>n then (at least) this column (let's call it column A) becomes a zero column after row reducing,
which results in a matrix with fewer columns than X, so X is singular.
Now transposing causes column A to become a row,
with column A being the one at the bottom. Row reducing X^T
gives a matrix with at least one fewer row (column A becomes a 0 row) than X^T, so X^T is singular.
This is a result of A being one of (or the only one of) the extra columns, so it becomes one of
(or the only one of) the extra rows that get "eliminated" after row reducing.
 
  • #18
You are thinking in the right direction. You proved row rank=column rank, right? So in the nxn case if rank is <n then both matrices are singular. You really don't have to think about any other case. If the matrix isn't square then the concept of an 'eigenvector' doesn't exist. Why not?
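Dick's square-case claim can also be spot-checked numerically: for a deliberately non-symmetric 2×2 matrix, A and A^T have the same trace and determinant, hence the same characteristic polynomial λ² − (tr)λ + det. A Python sketch with an arbitrary example matrix:

```python
# A non-symmetric 2x2 example: A and its transpose share trace and
# determinant, so their characteristic polynomials (and eigenvalues) coincide.
def trace2(m):
    return m[0][0] + m[1][1]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A  = [[2, 7], [0, 3]]
At = [[2, 0], [7, 3]]  # transpose of A
print(trace2(A) == trace2(At), det2(A) == det2(At))  # True True
```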
 
  • #19
Dick said:
You are thinking in the right direction. You proved row rank=column rank, right? So in the nxn case if rank is <n then both matrices are singular. You really don't have to think about any other case. If the matrix isn't square then the concept of an 'eigenvector' doesn't exist. Why not?
because I doesn't exist.
 
  • #20
Dick said:
Your attempt at levity is highly appreciated. Ha. Ha. But it's actually pretty good. Just as "I doesn't exist" is not grammatical, the idea that two vectors with different dimensions could be equal is also not grammatical.

:biggrin: I be thanks you
 
  • #21
evilpostingmong said:
because I doesn't exist.

Your attempt at levity is highly appreciated. Ha. Ha. But it's actually pretty good. Just as "I doesn't exist" is not grammatical, the idea that two vectors with different dimensions could be equal is also not grammatical.
 
  • #22
Too fast. You leapfrogged me before I realized I should quote you.
 
  • #23
Dick said:
Your attempt at levity is highly appreciated. Ha. Ha. But it's actually pretty good. Just as "I doesn't exist" is not grammatical, the idea that two vectors with different dimensions could be equal is also not grammatical.

What? How did the posts switch??:eek:
 
  • #24
I didn't quote you in the previous post. Realized it was important, deleted the post. Went back quoted you in a new post. By that time you had already quoted my post. Oh, you know how these time travel things go.
 
  • #25
Dick said:
I didn't quote you in the previous post. Realized it was important, deleted the post. Went back quoted you in a new post. By that time you had already quoted my post. Oh, you know how these time travel things go.

You should post a thread in this forum explaining your time machine.
 
  • #26
That belongs in a more speculative forum. I gather you are studying this stuff on your own, right? Not attending a class?
 
  • #27
Yes, I'm not a math major but a CS major. I think math is interesting,
but as far as classes are concerned, I went up to Calculus II.
I am studying math because I think it's cool to know, for some reason.
In fact, for my major, Calc II is as high a math course as is needed.
I like to do proofs because they help reinforce understanding of the
concepts, rather than just solving for x.
 
  • #28
More power to you then. Keep it up. Do they still use Knuth in CS courses? He's a mathematical challenge. I'm somehow guessing not. But you might enjoy dealing with his challenges. They are 'proofy'.
 
  • #29
Dick said:
More power to you then. Keep it up. Do they still use Knuth in CS courses? He's a mathematical challenge. I'm somehow guessing not. But you might enjoy dealing with his challenges. They are 'proofy'.

Thank you!:biggrin: I've never heard of Knuth, but tbh the only CS courses I've
taken were a couple of programming courses. Do you teach math btw?
 
  • #30
Donald Knuth. "The Art of Computer Programming". And it is art. He wrote TeX. Shame, and more shame on CS if you don't know him. No, I don't teach anything. I failed to get into academia, so I sit around all day writing engineering software and looking for interesting problems to work on. Go figure. Check out Knuth, you'll like it with your inclinations.
 
  • #31
Dick said:
Donald Knuth. "The Art of Computer Programming". And it is art. He wrote TeX. Shame, and more shame on CS if you don't know him. No, I don't teach anything. I failed to get into academia, so I sit around all day writing engineering software and looking for interesting problems to work on. Go figure. Check out Knuth, you'll like it with your inclinations.

I'll give him a try, thanks a bunch! From your posts, I'm surprised you
failed to get into academia.
 
  • #32
evilpostingmong said:
I'll give him a try, thanks a bunch! From your posts, I'm surprised you
failed to get into academia.

You try it. See how you do. Let me know.
 

