Math: Linear Algebra Related Question

  • Thread starter: surferdude89
  • Tags: Linear

Homework Help Overview

The discussion revolves around a linear algebra problem concerning an \( n \times n \) matrix \( \mathbf{A} \) and its eigenvalues. The original poster seeks clarification on whether a polynomial matrix \( p(\mathbf{A}) \) can be shown to have eigenvalues that are derived from the eigenvalues of \( \mathbf{A} \) while maintaining the same eigenvectors.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the relationship between the eigenvalues of \( \mathbf{A} \) and the polynomial matrix \( p(\mathbf{A}) \). There is an attempt to simplify \( p(A)X \) using the properties of eigenvectors. Some participants question the relevance of non-distinct eigenvalues and whether a full set of eigenvectors is necessary for the argument to hold.

Discussion Status

The discussion is ongoing, with participants exploring the implications of the properties of eigenvalues and eigenvectors. Some guidance has been provided regarding the relationship between eigenvalues and polynomial matrices, but there is no explicit consensus on the necessity of having a full set of eigenvectors or the implications of non-distinct eigenvalues.

Contextual Notes

There is mention of potential constraints regarding the completeness of eigenvectors and the implications of having non-distinct eigenvalues. Participants are also navigating the formatting of mathematical expressions in LaTeX.

surferdude89

Homework Statement



Hello All,

This question is a problem I ran across and am working on for practice, but I am having a rough time getting started because I do not understand the context of the problem. Some help in understanding the question would be greatly appreciated.


Suppose $\mathbf{A}$ is an $n \times n$ matrix with (not necessarily distinct) eigenvalues $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$. Can it be shown that the *polynomial matrix*

$p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I} $


has the eigenvalues


$p(\lambda_{j}) = k_{m}{\lambda_{j}}^{m}+k_{m-1}{\lambda_{j}}^{m-1}+\ldots+k_{1}\lambda_{j}+k_{0}$


where $j = 1,2,\ldots,n$ and the same eigenvectors as $\mathbf{A}$.

Thank You.


Homework Equations





The Attempt at a Solution



If $X$ is an eigenvector of $A$, say $AX=\lambda X$, then we can use that to simplify $p(A)X$ into $(\text{some scalar value})*X$, and that scalar in front of the $X$ is then an eigenvalue of $p(A)$, corresponding to the eigenvector $X$.
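Writing that step out explicitly (my own sketch of the simplification described above, using the notation of the problem): since $AX = \lambda X$ implies $\mathbf{A}^{i}X = \lambda^{i}X$ for each power $i$,

```latex
\begin{align*}
p(\mathbf{A})X &= \left(k_{m}\mathbf{A}^{m} + k_{m-1}\mathbf{A}^{m-1} + \ldots + k_{1}\mathbf{A} + k_{0}\mathbf{I}\right)X \\
               &= k_{m}\lambda^{m}X + k_{m-1}\lambda^{m-1}X + \ldots + k_{1}\lambda X + k_{0}X \\
               &= p(\lambda)\,X,
\end{align*}
```

so $p(\lambda)$ would be the scalar in front of $X$, i.e. an eigenvalue of $p(\mathbf{A})$ with the same eigenvector $X$.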

This is written in LaTeX format. I am not sure this site supports it. Let's see.
 
i had to write this out to read it; click on the LaTeX to see the code.

so A is an $n \times n$ matrix, with potentially non-distinct eigenvalues $\lambda_i$.

can it be shown that the polynomial matrix
$p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I}$
 
surferdude89 said:
If $X$ is an eigenvector of $A$, say $AX=\lambda X$, then we can use that to simplify $p(A)X$ into $(\text{some scalar value})*X$, and that scalar in front of the $X$ is then an eigenvalue of $p(A)$, corresponding to the eigenvector $X$.

that sounds reasonable to me, and it shows $X$ is an eigenvector of both $A$ and $p(A)$.

the only bit that gets me is why non-distinct eigenvalues are mentioned... i think it still works, but you need to assume you have a full set of eigenvectors.

though even with a non-full set of eigenvectors, $p(A)$ would still have the eigenvalue $p(\lambda)$ for every eigenvector of $A$; you may just need $p(A)$ to be defective in the same way as the original matrix, so that it doesn't pick up any other eigenvalues... not too sure on this..
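A concrete numerical illustration of the defective case being discussed (my own example, not from the thread): a Jordan block has a repeated eigenvalue and no full set of eigenvectors, yet the eigenvalues of $p(A)$ still come out as $p(\lambda)$, as can be seen from the triangular form.

```python
import numpy as np

# A defective matrix: a 2x2 Jordan block with the repeated eigenvalue 2
# and only a one-dimensional eigenspace.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# p(t) = t^2 - 3t + 1, so p(2) = 4 - 6 + 1 = -1
pA = A @ A - 3 * A + np.eye(2)

print(np.linalg.eigvals(pA))  # both eigenvalues equal p(2) = -1
```

Note that $p(A)$ here is again a defective triangular matrix, so the repeated eigenvalue $p(2)$ is its only eigenvalue, matching the intuition above.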
 
On this board, use [ itex ] to begin and [ /itex ] to end LaTeX (without the spaces).
What you posted was:

Suppose \mathbf{A} is an n \times n matrix with (not necessarily distinct) eigenvalues \lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}. Can it be shown that the *polynomial matrix*

p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I}


has the eigenvalues


p(\lambda_{j}) = k_{m}{\lambda_{j}}^{m}+k_{m-1}{\lambda_{j}}^{m-1}+\ldots+k_{1}\lambda_{j}+k_{0}


where j = 1,2,\ldots,n and the same eigenvectors as \mathbf{A}.

Yes, that's true. It is easy to show, by induction, say, that if v is an eigenvector of A with eigenvalue \lambda, then A^n v = \lambda^n v. It can be shown by direct computation that if v is an eigenvector of both A and B with eigenvalues \lambda_A and \lambda_B, respectively, then v is an eigenvector of pA + qB with eigenvalue p\lambda_A + q\lambda_B, where p and q are any scalars.

I recommend that you try proving those two things yourself.
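As a numerical sanity check of the claim (my own sketch, not part of the proof being asked for), one can compare the eigenvalues of $p(A)$ against $p(\lambda_j)$ for a random matrix:

```python
import numpy as np

def poly_matrix(A, coeffs):
    """Evaluate p(A) = k_m A^m + ... + k_1 A + k_0 I via Horner's rule.
    coeffs lists [k_m, k_{m-1}, ..., k_1, k_0], highest degree first."""
    n = A.shape[0]
    result = np.zeros((n, n))
    for k in coeffs:
        result = result @ A + k * np.eye(n)
    return result

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
coeffs = [2.0, -1.0, 0.5, 3.0]  # p(t) = 2t^3 - t^2 + 0.5t + 3

eig_A = np.linalg.eigvals(A)
eig_pA = np.linalg.eigvals(poly_matrix(A, coeffs))
p_of_eigs = np.polyval(coeffs, eig_A)

# Each p(lambda_j) should match some eigenvalue of p(A)
# (matched by distance, since the two arrays may be ordered differently).
print(all(np.min(np.abs(eig_pA - v)) < 1e-8 for v in p_of_eigs))
```

This only checks the eigenvalue part of the statement; the shared-eigenvector part is exactly what the two suggested exercises establish.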
 
