Math: Linear Algebra Related Question

In summary, the conversation discusses the concept of polynomial matrices and whether they have the same eigenvalues and eigenvectors as the original matrix. It is shown that if a vector is an eigenvector of both matrices, then it is also an eigenvector of a linear combination of those matrices. The conversation also mentions the use of LaTeX on the board and provides resources for further understanding.
  • #1
surferdude89

Homework Statement



Hello All,

This question is a problem I ran across and am working on for practice, but I am having a rough time getting started because I do not understand the context of the problem. Some help in understanding the question would be greatly appreciated.


Suppose $\mathbf{A}$ is an $n \times n$ matrix with (not necessarily distinct) eigenvalues $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$. Can it be shown that the *polynomial matrix*

$p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I} $


has the eigenvalues


$p(\lambda_{j}) = k_{m}{\lambda_{j}}^{m}+k_{m-1}{\lambda_{j}}^{m-1}+\ldots+k_{1}\lambda_{j}+k_{0}$


where $j = 1,2,\ldots,n$ and the same eigenvectors as $\mathbf{A}$.

Thank You.


Homework Equations





The Attempt at a Solution



If $X$ is an eigenvector of $A$, say $AX=\lambda X$, then we can use that to simplify $p(A)X$ into $(\text{some scalar value})*X$, and that scalar in front of the $X$ is then an eigenvalue of $p(A)$, corresponding to the eigenvector $X$.
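Written out, the computation I have in mind would go roughly like this (assuming $AX = \lambda X$ and that each power satisfies $A^{r}X = \lambda^{r}X$, which still needs to be justified):

$p(\mathbf{A})X = k_{m}\mathbf{A}^{m}X + k_{m-1}\mathbf{A}^{m-1}X + \ldots + k_{1}\mathbf{A}X + k_{0}X = \left(k_{m}\lambda^{m} + k_{m-1}\lambda^{m-1} + \ldots + k_{1}\lambda + k_{0}\right)X = p(\lambda)X,$

so the scalar in front of $X$ is exactly $p(\lambda)$.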

This is written in LaTeX. I am not sure this site supports it; let's see.
 
  • #2
I had to write this out to read it; click on the LaTeX to see the code.

So A is an [itex]n \times n[/itex] matrix with potentially non-distinct eigenvalues [itex] \lambda_i [/itex].

Can it be shown that the polynomial matrix
[tex] p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I}[/tex]
 
  • #3
surferdude89 said:
If $X$ is an eigenvector of $A$, say $AX=\lambda X$, then we can use that to simplify $p(A)X$ into $(\text{some scalar value})*X$, and that scalar in front of the $X$ is then an eigenvalue of $p(A)$, corresponding to the eigenvector $X$.

That sounds reasonable to me, and it shows that x is an eigenvector of both A and p(A).

The only bit that gets me is why non-distinct eigenvalues are mentioned... I think it still works, but you need to assume you have a full set of eigenvectors.

Though with a non-full set of eigenvectors, the argument still gives the same eigenvalue for any eigenvector A does have; you may just need p(A) to be defective in the same way as the original matrix, so that it doesn't pick up any other eigenvalues... not too sure on this.
 
  • #5
On this board, use [ itex ] to begin and [ /itex ] to end LaTeX (without the spaces).
What you posted was:

Suppose [itex]\mathbf{A}[/itex] is an [itex]n \times n[/itex] matrix with (not necessarily distinct) eigenvalues [itex]\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}[/itex]. Can it be shown that the *polynomial matrix*

[itex]p( \mathbf{A} ) = k_{m} \mathbf{A}^{m}+k_{m-1} \mathbf{A}^{m-1}+\ldots+k_{1} \mathbf{A} +k_{0} \mathbf{I} [/itex]


has the eigenvalues


[itex]p(\lambda_{j}) = k_{m}{\lambda_{j}}^{m}+k_{m-1}{\lambda_{j}}^{m-1}+\ldots+k_{1}\lambda_{j}+k_{0}[/itex]


where [itex]j = 1,2,\ldots,n[/itex] and the same eigenvectors as [itex]\mathbf{A}[/itex].

Yes, that's true. It is easy to show, by induction, say, that if v is an eigenvector of A with eigenvalue [itex]\lambda[/itex], then [itex]A^n v = \lambda^n v[/itex]. It can be shown by direct computation that if v is an eigenvector of both A and B with eigenvalues [itex]\lambda_A[/itex] and [itex]\lambda_B[/itex], respectively, then v is an eigenvector of pA + qB with eigenvalue [itex]p\lambda_A + q\lambda_B[/itex], where p and q are any scalars.
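In symbols, the two facts to check are

[tex] Av = \lambda v \;\Longrightarrow\; A^{n}v = \lambda^{n}v [/tex]

and

[tex] Av = \lambda_A v,\quad Bv = \lambda_B v \;\Longrightarrow\; (pA + qB)v = \left(p\lambda_A + q\lambda_B\right)v. [/tex]

Applying them term by term to [itex]k_{m}\mathbf{A}^{m} + \ldots + k_{1}\mathbf{A} + k_{0}\mathbf{I}[/itex] gives the stated result.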

I recommend that you try proving those two things yourself.
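If you want a quick numerical sanity check before writing the proof, something along these lines works (a rough sketch using numpy; the 4×4 random matrix and the polynomial [itex]p(x) = 2x^3 - x + 5[/itex] are arbitrary choices):

[code]
import numpy as np

# Compare the eigenvalues of p(A) with p evaluated at the eigenvalues of A,
# for an arbitrarily chosen polynomial p and a random matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

def p_of_matrix(M):
    # p(A) = 2 A^3 - A + 5 I
    return 2 * np.linalg.matrix_power(M, 3) - M + 5 * np.eye(M.shape[0])

def p_of_scalar(x):
    # The same polynomial applied to a scalar: p(x) = 2 x^3 - x + 5
    return 2 * x**3 - x + 5

lam = np.linalg.eigvals(A)               # eigenvalues of A
mu = np.linalg.eigvals(p_of_matrix(A))   # eigenvalues of p(A)

# Each p(lambda_j) should match some eigenvalue of p(A) up to rounding error.
dist = np.abs(p_of_scalar(lam)[:, None] - mu[None, :])
print(dist.min(axis=1).max())            # expected to be tiny
[/code]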
 

1. What is linear algebra?

Linear algebra is a branch of mathematics that deals with linear equations and their representations in vector spaces. It involves the study of linear transformations and their properties, as well as operations on matrices and vectors.

2. What are the basic concepts of linear algebra?

The basic concepts of linear algebra include vector spaces, linear transformations, matrices, determinants, and eigenvalues and eigenvectors. These concepts are used to solve systems of linear equations, perform operations on vectors and matrices, and analyze geometric transformations.

3. How is linear algebra used in real life?

Linear algebra has numerous applications in real life, such as in computer graphics, data analysis, engineering, physics, economics, and statistics. It is also used in machine learning and artificial intelligence algorithms, as well as in solving optimization problems.

4. What are the benefits of learning linear algebra?

Learning linear algebra can improve problem-solving skills, analytical thinking, and mathematical reasoning. It also provides a foundation for more advanced mathematical topics and has many practical applications in various fields.

5. What are some resources for learning linear algebra?

There are many resources available for learning linear algebra, including textbooks, online courses, video lectures, and practice problems. Some popular resources include "Linear Algebra" by Gilbert Strang, Khan Academy's linear algebra course, and MIT's OpenCourseWare lectures on linear algebra.
