Eigenvalue/Eigenvector Proof for Matrix Polynomial p(A)

  • Thread starter: cookie91
  • Tags: Proof

Homework Help Overview

The discussion revolves around proving a property of eigenvalues and eigenvectors for matrix polynomials. The original poster is tasked with showing that if \((\lambda, x)\) is an eigenpair of the matrix \(A\), then \((p(\lambda), x)\) is an eigenpair of the matrix polynomial \(p(A)\), where \(p(x) = \sum_{i=0}^{k} a_i x^i\).

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to relate the eigenvalue equation \(Ax = \lambda x\) to the polynomial expression \(p(A)\). They consider multiplying the polynomial form by the eigenvector and express \(p(A)x\) in terms of \(A^i\) applied to \(x\). Some participants suggest examining the results of this multiplication to clarify the relationship.
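In symbols, the suggested multiplication amounts to the following standard computation (a sketch only, using nothing beyond the relation \(Ax = \lambda x\) and induction on the power \(i\)):

```latex
A^i x = A^{i-1}(Ax) = \lambda\, A^{i-1} x = \cdots = \lambda^i x,
\qquad\text{hence}\qquad
p(A)\,x = \sum_{i=0}^{k} a_i A^i x
        = \sum_{i=0}^{k} a_i \lambda^i x
        = \Bigl(\sum_{i=0}^{k} a_i \lambda^i\Bigr) x
        = p(\lambda)\, x.
```

Since \(x \neq 0\) (it is an eigenvector of \(A\)), this is exactly the statement that \((p(\lambda), x)\) is an eigenpair of \(p(A)\).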

Discussion Status

The discussion is ongoing, with participants exploring different approaches to the proof. The original poster expresses uncertainty about whether their reasoning effectively demonstrates the required relationship between \(x\) and \(\lambda\) in the context of the matrix \(A\). Guidance has been offered regarding the multiplication of the polynomial by the eigenvector.

Contextual Notes

The original poster is grappling with the implications of their calculations and how to formally establish that the eigenvector \(x\) and eigenvalue \(\lambda\) pertain specifically to the matrix \(A\). There is a focus on ensuring the correctness of the eigenpair relationship in the context of the polynomial transformation.

cookie91

Homework Statement



Let [tex]p(x) = \sum_{i=0}^{k} a_i x^i[/tex].

The matrix polynomial for A is defined as [tex]p(A) = \sum_{i=0}^{k} a_i A^i[/tex].

Show that if ([tex]\lambda[/tex], x) is an eigenpair of A, then (p([tex]\lambda[/tex]), x) is an eigenpair of p(A).

Homework Equations





The Attempt at a Solution



I pretty much have no idea where to start. I thought I could use Ax = lambda x like you would if you were proving that lambda^2 is an eigenvalue of A^2, etc, but I'm not sure how to get the p(A) bit?
 
Try multiplying the matrix polynomial form by the eigenvector and see what you get.
 
here's some latex to help, click on it to see code

[tex]\sum_{i=0}^k a_i x^i[/tex]
 
Thanks, I'm still not sure if I'm going about this the right way

so I've got

Use Ax = [tex]\lambda[/tex]x

for p(A), multiply both sides of matrix polynomial by eigenvector x

p(A)x =
[tex] \sum_{i=0}^k a_i A^i [/tex] x

from the polynomial in the question, p(x) =
[tex] \sum_{i=0}^k a_i x^i [/tex]

evaluating p at x = [tex]\lambda[/tex]:

p([tex]\lambda[/tex]) =
[tex] \sum_{i=0}^k a_i \lambda^i [/tex]

p(A)x = [tex] \sum_{i=0}^k a_i \lambda^i [/tex] x

p(A)x = p([tex]\lambda[/tex])x as required

but I'm not sure if I've really shown anything. How do I show that it's the same x and [tex]\lambda[/tex] that belong to the matrix A?
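As a quick numerical sanity check (not a proof), the claimed eigenpair relation can be tested with NumPy; the matrix and polynomial coefficients below are arbitrary examples chosen for illustration:

```python
import numpy as np

# Arbitrary example: a symmetric matrix (so eigenpairs are real) and
# polynomial coefficients a_0, a_1, a_2 for p(t) = a_0 + a_1 t + a_2 t^2.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
coeffs = [1.0, -2.0, 0.5]

# An eigenpair (lambda, x) of A.
lam_all, vecs = np.linalg.eigh(A)
lam, x = lam_all[0], vecs[:, 0]

# Build p(A) = sum_i a_i A^i and the scalar p(lambda) = sum_i a_i lambda^i.
pA = sum(a * np.linalg.matrix_power(A, i) for i, a in enumerate(coeffs))
p_lam = sum(a * lam**i for i, a in enumerate(coeffs))

# Check that (p(lambda), x) is an eigenpair of p(A).
print(np.allclose(pA @ x, p_lam * x))  # True
```

This only checks one example numerically; the inductive argument A^i x = [tex]\lambda^i[/tex] x is still what establishes the result in general.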
 
