Using diagonalization, prove the matrix equals its square


Homework Help Overview

The discussion revolves around proving that a 2x2 matrix A, with eigenvalues 0 and 1, satisfies the equation A² = A using diagonalization. Participants are exploring the implications of the matrix's eigenvalues and the existence of a diagonalizing matrix.

Discussion Character

  • Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants discuss the necessity of finding a matrix X that diagonalizes A, questioning whether the specific form of X is required. There is also exploration of the relationship between distinct eigenvalues and the existence of linearly independent eigenvectors.

Discussion Status

Some participants have provided guidance on the existence of the diagonalizing matrix and the independence of eigenvectors corresponding to distinct eigenvalues. However, there remains confusion regarding the proof of linear independence and the interpretation of independence in the context of eigenvalues.

Contextual Notes

Participants are grappling with the definitions and implications of linear independence in relation to eigenvectors and eigenvalues, particularly in the context of the matrix's diagonalization.

PirateFan308

Homework Statement


Suppose that A is a 2x2 matrix with eigenvalues 0 and 1. Using diagonalization, show that [itex]A^2 = A[/itex].


The Attempt at a Solution


Let [tex]A=\begin{pmatrix}a&b\\c&d\end{pmatrix}[/tex]

Av=λv where [tex]v=\begin{pmatrix}x\\y\end{pmatrix}[/tex] and v≠0
If λ=0 then [tex]ax+by=0[/tex] and [tex]cx+dy=0[/tex]
If λ=1 then [tex]ax+by=x[/tex] and [tex]cx+dy=y[/tex]

so Av-λv=0, then Av-λIv=0, then (A-λI)v=0. Since v≠0, the matrix A-λI must be singular, so det(A-λI)=0
so for λ=0, [tex]A-0I=\begin{pmatrix}a&b\\c&d\end{pmatrix}[/tex] and [tex]ax+by=0[/tex] and [tex]cx+dy=0[/tex]
For λ=1, [tex]A-I=\begin{pmatrix}a-1&b\\c&d-1\end{pmatrix}[/tex] and [tex](a-1)x+by=0[/tex] and [tex]cx+(d-1)y=0[/tex]

We must find two linearly independent eigenvectors so that we can form X, whose first column is the first eigenvector and whose second column is the second.

[tex]X^{-1}AX= \begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex]

If this is true, then [tex]X^{-1}A^{2}X=(X^{-1}AX)(X^{-1}AX)=\begin{pmatrix}0&0\\0&1\end{pmatrix}^{2}=\begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex]

The problem is, I'm not quite sure how to prove any of this
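
Once the existence of such an X is established, the computation itself is short. A minimal sketch, assuming X is invertible and [tex]X^{-1}AX=D[/tex] where [tex]D=\begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex]:

[tex]A=XDX^{-1},\qquad A^{2}=XDX^{-1}\,XDX^{-1}=XD^{2}X^{-1}=XDX^{-1}=A,[/tex]

using [itex]D^{2}=D[/itex], which holds because the diagonal entries satisfy [itex]0^{2}=0[/itex] and [itex]1^{2}=1[/itex].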
 
You don't need to specifically find your matrix X. Only knowing that the matrix X exists would suffice. So the only thing you need is that there exists a matrix X such that

[tex]XAX^{-1}[/tex]

is diagonal. You don't need the specific form of X.
 
micromass said:
You don't need to specifically find your matrix X. Only knowing that the matrix X exists would suffice. So the only thing you need is that there exists a matrix X such that

[tex]XAX^{-1}[/tex]

is diagonal. You don't need the specific form of X.

In order for [itex]X^{-1}[/itex] to exist, X must be invertible (and hence square), and since X must multiply with the 2x2 matrix A, X must also be 2x2.

Since there are two distinct eigenvalues, there must be two linearly independent eigenvectors. These two eigenvectors can form the columns of X.

I'm still a bit confused on how to prove exactly why there must be two linearly independent eigenvectors.
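
A minimal sketch of why independent eigenvectors give an invertible X: suppose [itex]Av=0[/itex] and [itex]Aw=w[/itex] with v and w independent, and let [itex]X=\begin{pmatrix}v&w\end{pmatrix}[/itex] have v and w as its columns. Then

[tex]AX=\begin{pmatrix}Av&Aw\end{pmatrix}=\begin{pmatrix}0&w\end{pmatrix}=X\begin{pmatrix}0&0\\0&1\end{pmatrix},[/tex]

and since the columns of X are independent, [itex]\det X\ne 0[/itex], so [itex]X^{-1}[/itex] exists and [tex]X^{-1}AX=\begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex].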
 
PirateFan308 said:
In order for [itex]X^{-1}[/itex] to exist, X must be invertible (and hence square), and since X must multiply with the 2x2 matrix A, X must also be 2x2.

Since there are two distinct eigenvalues, there must be two linearly independent eigenvectors. These two eigenvectors can form the columns of X.

I'm still a bit confused on how to prove exactly why there must be two linearly independent eigenvectors.

You can prove this directly. If v and w are eigenvectors belonging to distinct eigenvalues, then v and w are independent. Thus if [itex]Av=\lambda v[/itex] and if [itex]Aw=\mu w[/itex] and if [itex]\lambda =\mu[/itex], then v and w are independent.

Try to prove this. Prove it by contradiction. Assume that v and w are dependent. What do you know then??
 
micromass said:
You can prove this directly. If v and w are eigenvectors belonging to distinct eigenvalues, then v and w are independent. Thus if [itex]Av=\lambda v[/itex] and if [itex]Aw=\mu w[/itex] and if [itex]\lambda =\mu[/itex], then v and w are independent.
You mean "if [itex]\lambda\ne \mu[/itex]".

Try to prove this. Prove it by contradiction. Assume that v and w are dependent. What do you know then??
 
micromass said:
Try to prove this. Prove it by contradiction. Assume that v and w are dependent. What do you know then??

If v and w are dependent, then w=cv which could be put into [itex]Av=\lambda v[/itex] and [itex]Aw=\mu cv[/itex] so [itex]\lambda =\mu c[/itex] and they are dependent. But 0 and 1 are independent, so this is a contradiction.
 
PirateFan308 said:
If v and w are dependent, then w=cv which could be put into [itex]Av=\lambda v[/itex] and [itex]Aw=\mu cv[/itex] so [itex]\lambda =\mu c[/itex] and they are dependent. But 0 and 1 are independent, so this is a contradiction.

What do you mean, 0 and 1 are independent?? 0 and 1 are numbers, not vectors. Saying that they are independent makes no sense.
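
For completeness, a minimal sketch of the contradiction argument being suggested: suppose [itex]Av=\lambda v[/itex] and [itex]Aw=\mu w[/itex] with [itex]\lambda\ne\mu[/itex] and [itex]v,w\ne 0[/itex], and assume v and w are dependent, say [itex]w=cv[/itex] with [itex]c\ne 0[/itex]. Then

[tex]\mu cv=\mu w=Aw=A(cv)=cAv=c\lambda v,[/tex]

so [itex](\mu-\lambda)cv=0[/itex]. Since [itex]c\ne 0[/itex] and [itex]v\ne 0[/itex], this forces [itex]\lambda=\mu[/itex], contradicting [itex]\lambda\ne\mu[/itex]. Here the eigenvalues 0 and 1 are distinct, so the corresponding eigenvectors are independent and can form the columns of X.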
 
