What are the eigenvalues of P(A)?

In summary, the conversation discusses how to prove that P(λ) is an eigenvalue of P(A) whenever P is a polynomial, A is a square matrix, and λ is an eigenvalue of A. The suggested method factors P(A) - μI into factors of the form (A - r_iI), so that P(A) - μI is invertible if and only if every (A - r_iI) is invertible; choosing μ = P(λ) makes λ one of the roots r_i, so P(A) - μI is not invertible and P(λ) is therefore an eigenvalue of P(A).
  • #1
chocolatefrog

Homework Statement

Let [itex]\alpha_0, \alpha_1, \dots, \alpha_d \in \mathbb{R}[/itex] and let [itex]\lambda[/itex] be an eigenvalue of [itex]A \in \mathbb{R}^{n \times n}[/itex]. Show that [itex]\alpha_0 + \alpha_1\lambda + \alpha_2\lambda^2 + \dots + \alpha_d\lambda^d[/itex] is an eigenvalue of [itex]\alpha_0 I + \alpha_1 A + \alpha_2 A^2 + \dots + \alpha_d A^d \in \mathbb{R}^{n \times n}[/itex].


2. Relevant equations

If [itex]\lambda[/itex] is an eigenvalue of [itex]A[/itex], then [itex]|A - \lambda I| = 0[/itex]. Also, [itex]\lambda^n[/itex] is an eigenvalue of [itex]A^n[/itex].

3. The attempt at a solution

So we basically have to somehow prove the following equation (after rearranging):

[tex]|\alpha_1(A - \lambda I) + \alpha_2(A^2 - \lambda^2 I) + \dots + \alpha_d(A^d - \lambda^d I)| = 0[/tex]

I can't seem to get my head around this one. I almost used the triangle inequality to prove it before I realized that these are determinants we are dealing with, not absolute values. :/
 
  • #2
So let [itex]P(z)[/itex] be a polynomial. We wish to prove that [itex]P(\lambda)[/itex] is an eigenvalue of [itex]P(A)[/itex].

For each [itex]\mu\in \mathbb{C}[/itex], we can write

[tex]P(z)-\mu=r_0(z-r_1)\cdots(z-r_d)[/tex]

where [itex]r_0[/itex] is the leading coefficient of [itex]P[/itex] and the roots [itex]r_1,\dots,r_d[/itex] depend on [itex]\mu[/itex].

Thus

[tex]P(A)-\mu I = r_0(A-r_1 I)\cdots(A-r_d I)[/tex]

So [itex]P(A)-\mu I[/itex] is invertible iff all of the factors [itex](A-r_i I)[/itex] are invertible. What does that imply for the eigenvalues?
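Not a substitute for the argument above, but as a quick numerical sanity check of the claim being proved, here is a short NumPy sketch; the matrix A and the polynomial coefficients are arbitrary illustrative choices:

[code]
import numpy as np

# Arbitrary example: a 3x3 matrix and polynomial coefficients a_0, ..., a_d.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, -1.0]])
coeffs = [1.0, -2.0, 0.5]   # P(z) = 1 - 2z + 0.5 z^2

# P(A) = a_0 I + a_1 A + a_2 A^2 + ...
P_A = sum(c * np.linalg.matrix_power(A, k) for k, c in enumerate(coeffs))

eig_A = np.linalg.eigvals(A)
eig_PA = np.linalg.eigvals(P_A)

# Each P(lambda), for lambda an eigenvalue of A, should appear among the
# eigenvalues of P(A) (up to floating-point error).
P_of_eig = sum(c * eig_A**k for k, c in enumerate(coeffs))
for mu in P_of_eig:
    assert np.min(np.abs(eig_PA - mu)) < 1e-9
print("P(eigenvalues of A):", np.sort(P_of_eig))
print("eigenvalues of P(A):", np.sort(eig_PA))
[/code]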
 

What is an eigenvalue?

An eigenvalue of a square matrix A is a scalar λ for which there exists a nonzero vector v (an eigenvector) satisfying Av = λv; the matrix maps that vector to a scalar multiple of itself. Eigenvalues are unchanged by similarity transformations, so they are a characteristic quantity of the matrix itself rather than of any particular choice of basis.

How do you find the eigenvalues of a matrix?

To find the eigenvalues of a matrix, you must solve the characteristic equation det(A - λI) = 0, where A is the matrix and λ is the eigenvalue. This will give you a polynomial equation, the roots of which are the eigenvalues of the matrix.
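As a concrete illustration (the matrix here is just an arbitrary example), a library such as NumPy will find the roots of the characteristic polynomial for you:

[code]
import numpy as np

# det(A - lambda*I) = lambda^2 - 5*lambda + 4 = (lambda - 1)(lambda - 4),
# so the eigenvalues should be 1 and 4.
A = np.array([[2.0, 1.0],
              [2.0, 3.0]])
print(np.linalg.eigvals(A))   # expect 1.0 and 4.0, in some order
[/code]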

What is the significance of eigenvalues in matrix operations?

Eigenvalues are important because they help us understand the behavior and properties of a matrix. They can determine whether a matrix is invertible, diagonalizable, or positive definite. Eigenvalues are also used in various applications, such as data analysis and machine learning.
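For instance, invertibility can be read off the eigenvalues: a matrix is singular exactly when 0 is one of its eigenvalues. A small illustrative check (the matrix is an arbitrary rank-1 example):

[code]
import numpy as np

# A rank-1 matrix is singular, so 0 must appear among its eigenvalues.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.eigvals(S))               # here: 0 and 5
print(np.isclose(np.linalg.det(S), 0.0))  # True: S is not invertible
[/code]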

Can a matrix have complex eigenvalues?

Yes, a matrix can have complex eigenvalues. This happens whenever the characteristic equation has complex roots, which can occur even when all of the matrix entries are real. For a real matrix, complex eigenvalues come in conjugate pairs, and their corresponding eigenvectors are also complex.
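A standard illustration is a real rotation matrix, which has no real eigenvectors; the sketch below (using a 90-degree rotation as the example) shows the conjugate pair:

[code]
import numpy as np

# 90-degree rotation in the plane: characteristic equation lambda^2 + 1 = 0,
# so the eigenvalues are the conjugate pair i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(R))   # approximately [0.+1.j, 0.-1.j]
[/code]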

What is the relationship between eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are closely related: if v is an eigenvector of A with eigenvalue λ, then Av = λv. In other words, eigenvectors are the non-zero vectors that the matrix maps to scalar multiples of themselves, and the eigenvalue is the corresponding scale factor. Together, eigenvalues and eigenvectors provide important information about the matrix and its behavior.
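To make the relationship concrete, here is a minimal sketch (with an arbitrary symmetric example matrix) that checks Av = λv for each eigenpair returned by NumPy:

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eig(A)   # columns of vecs are the eigenvectors
for lam, v in zip(vals, vecs.T):
    # Each eigenpair satisfies A @ v = lam * v.
    assert np.allclose(A @ v, lam * v)
print(vals)                     # eigenvalues 3 and 1 (order may vary)
[/code]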
