Why do eigenvectors stay the same when a matrix is squared?

In summary, the conversation discusses whether a matrix A and its square A² have the same eigenvectors. The original poster accepts the standard algebraic manipulation but questions whether it actually proves that A and A² share eigenvectors. The replies work through the proof and conclude that every eigenvector of A is also an eigenvector of A², while a counterexample shows that the converse is not always true: A² can have eigenvectors that A does not.
  • #1
Aldnoahz
I am new to linear algebra, but I have been trying to figure out this question. Everybody seems to take for granted that if a matrix A has eigenvector x, then A² has the same eigenvectors?

I know that people just operate on the equation ##Ax=\lambda x##, saying that ##A^2x=A(Ax)=A(\lambda x)## and therefore ##A^2x = \lambda^2 x##. However, in my opinion, this does not prove that A² and A have the same eigenvectors; it only shows why λ gets squared, on the assumption that the matrices already share the same eigenvectors.

If someone can prove that A² and A have the same eigenvectors by starting from the equations ##A^2y=\alpha y## and ##Ax=\lambda x## and proceeding to show ##y=x##, I will be very much convinced that these two matrices have the same eigenvectors.

Or are there any other convincing proofs to show this result?
 
  • #2
You have already said everything that can be said here; I don't know why you don't regard it as a proof.
There is no way of proving ##A^2y=\alpha y \; \wedge \; Ax= \lambda x \; \Rightarrow \; x=y## because it is not true.
E.g. ##x## and ##y## can simply be two different (linear independent) eigenvectors of ##A##.

What you can prove is ##Ay=\alpha y \; \Rightarrow \; A^2y=\alpha^2 y##, which is exactly the computation you posted.
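Written out in full, the chain of equalities uses only ##Ay=\alpha y## and the linearity of ##A##:

$$A^2 y = A(Ay) = A(\alpha y) = \alpha (Ay) = \alpha(\alpha y) = \alpha^2 y.$$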
 
  • #3
Aldnoahz said:
therefore ##A^2x = \lambda^2 x##.
Let ##B = A^2## and ##\alpha = \lambda^2##. Then ##Bx = \alpha x##. That is the definition of ##x## being an eigenvector of ##B##.
 
  • #4
Yeah I think I had some problems with my logic. Now I understand. Thank you.
 
  • #5
I am not sure what you have concluded, but it is not true that A^2 has the same eigenvectors as A, since it can have more. E.g. take D, the derivative acting on polynomials of degree ≤ one. Then D^2 = 0 and thus has x as an eigenvector (with eigenvalue 0), since D^2x = 0, but D does not, since Dx = 1 is not a scalar multiple of x. Of course an eigenvector of A is also an eigenvector of A^2, "trivially", as proved above, but the converse is false.
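A quick numerical check of this example: in the basis {1, x} for polynomials of degree ≤ 1, the derivative is represented by the nilpotent matrix D below, and the polynomial x has coordinates (0, 1). (This matrix representation is a standard one, written out here for illustration; it is not given in the thread.)

```python
import numpy as np

# Derivative on polynomials a + b*x in the basis {1, x}:
# d/dx (a + b*x) = b, i.e. coordinates (a, b) -> (b, 0).
D = np.array([[0.0, 1.0],
              [0.0, 0.0]])

x_poly = np.array([0.0, 1.0])   # coordinates of the polynomial "x"

print(D @ D)             # the zero matrix: D^2 = 0
print(D @ D @ x_poly)    # [0, 0] = 0 * x_poly, so x IS an eigenvector of D^2
print(D @ x_poly)        # [1, 0], not a multiple of x_poly,
                         # so x is NOT an eigenvector of D
```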
 

1. Why do eigenvectors stay the same when a matrix is squared?

When a matrix is squared, the resulting matrix represents the transformation applied twice. An eigenvector of a matrix is a direction that the transformation only stretches or compresses, without changing direction. Applying the transformation a second time stretches along that same direction again, so every eigenvector of A is still an eigenvector of A², now with the eigenvalue squared. (As the thread notes, A² may in addition gain eigenvectors that A does not have.)
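A small numerical illustration of this (the matrix A below is an arbitrary example chosen here, not one from the thread):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvectors [1, 1] and [1, -1]

vals, vecs = np.linalg.eig(A)
v, lam = vecs[:, 0], vals[0]

print(A @ v, lam * v)           # A v  = λ v
print(A @ A @ v, lam**2 * v)    # A² v = λ² v: same eigenvector, squared eigenvalue
```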

2. How do eigenvectors relate to matrix squaring?

Eigenvectors are useful when squaring a matrix because they describe how the resulting matrix behaves. Since an eigenvector's direction is unchanged by the transformation, knowing the eigenvectors and eigenvalues of A immediately tells us how A² (and higher powers of A) act: the same directions are stretched, now by the squared eigenvalues.

3. Can a matrix have different eigenvectors when squared?

Yes, in one direction: A² can have eigenvectors that A does not have. Squaring never destroys eigenvectors, since every eigenvector of A remains an eigenvector of A², but it can create new ones. The derivative example in the thread, where D² = 0, is one instance of this.
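Another illustration, assuming we work over the real numbers (the 90° rotation matrix below is an example chosen here, not taken from the thread): a rotation by 90° turns every nonzero vector, so it has no real eigenvectors, yet its square is −I, for which every nonzero vector is an eigenvector with eigenvalue −1.

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # rotation by 90 degrees

print(np.linalg.eigvals(R))    # ±1j: no real eigenvalues, so no real eigenvectors
print(R @ R)                   # -I, so every nonzero vector is an eigenvector
                               # of R² with eigenvalue -1
```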

4. Do all matrices have eigenvectors that stay the same when squared?

For any square matrix, every eigenvector it has does stay an eigenvector when the matrix is squared, with the eigenvalue squared. What differs from matrix to matrix is whether A² picks up additional eigenvectors that A lacks, and whether the matrix has any real eigenvectors to begin with (a rotation, for example, has none). The eigenvalues themselves do change under squaring, unless they equal 0 or 1.

5. What is the significance of eigenvectors staying the same after matrix squaring?

The fact that eigenvectors are preserved under squaring (and under taking any power) is significant because it simplifies the analysis of repeated transformations: once the eigenvectors and eigenvalues of A are known, the behavior of A², A³, and so on follows immediately along those directions. This property has important applications in fields such as physics, engineering, and computer science.
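As a standard illustration of this simplification (a well-known fact, not something discussed in the thread itself): if A is diagonalizable, its powers are obtained by raising the eigenvalues to the corresponding power, while the eigenvectors stay the same:

$$A = PDP^{-1} \quad\Longrightarrow\quad A^k = PD^kP^{-1},$$

where the columns of ##P## are eigenvectors of ##A## and ##D## is the diagonal matrix of eigenvalues.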
