Help with Understanding Eigenvalues, Eigenvectors, and Mapping with Lambda and P

terryfields
This is exam revision, not homework, so feel free to help.

Let ##\lambda## be an eigenvalue of ##T##, and let ##p## be a polynomial with coefficients in ##F##. Define the linear mapping ##S = p(T)## and show that ##p(\lambda)## is an eigenvalue of ##S##.

I know that an eigenvector of ##T## is an element ##v \neq 0## such that ##T(v) = \lambda v## for some ##\lambda## in the field, and that the scalar ##\lambda## here is the eigenvalue. However, I don't understand this question at all.

So ##\lambda## is our eigenvalue, meaning there must be a corresponding eigenvector ##v \neq 0##, but what is ##p## and what does this mapping show? Please help.
 
First, if ##A## is the matrix and ##\lambda## an eigenvalue, then ##A \vec x = \lambda \vec x## for some nonzero vector ##\vec x##. Note these facts:

$$\begin{align*}
A^n \vec x &= A^{n-1} (A \vec x) \\
&= A^{n-1} (\lambda \vec x) \\
&= \lambda A^{n-2} (A \vec x) \\
&= \cdots \\
&= \lambda^n \vec x
\end{align*}$$

so ##\lambda^n## is an eigenvalue of ##A^n##, with the same eigenvector.
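As a quick numerical sanity check (my own illustrative example, not from the thread), one can verify with NumPy that an eigenvector of ##A## is also an eigenvector of ##A^n## with eigenvalue ##\lambda^n##:

```python
import numpy as np

# A hypothetical 2x2 symmetric matrix with known eigenvalues 3 and 1;
# the eigenvector for eigenvalue 3 is [1, 1].
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
x = np.array([1.0, 1.0])

# Check A x = lambda x
assert np.allclose(A @ x, lam * x)

# Check A^n x = lambda^n x for a few powers
for n in range(1, 6):
    An = np.linalg.matrix_power(A, n)
    assert np.allclose(An @ x, lam**n * x)
print("A^n x = lambda^n x holds for n = 1..5")
```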

Next, if ##c## is any constant, then

$$(cA^n) \vec x = c\left(A^n \vec x\right) = c \lambda^n \vec x$$

so ##c\lambda^n## is an eigenvalue of ##cA^n##.

Finally, if ##a, b## are constants, and ##m, n## are nonnegative integers, consider the two-term polynomial ##p(s) = as^m + bs^n##. The polynomial ##p(A)## is

$$p(A) = a A^m + b A^n$$

which is a matrix the same size as ##A##. The product ##p(A) \vec x## is

$$\begin{align*}
(a A^m + b A^n) \vec x &= (a A^m) \vec x + (b A^n) \vec x \\
&= \left(a \lambda^m\right) \vec x + \left(b \lambda^n\right) \vec x \\
&= \left(a \lambda^m + b \lambda^n \right) \vec x \\
&= p(\lambda) \, \vec x
\end{align*}$$

That case does not have a constant term in the polynomial. If you have

$$p(s) = as^m + bs^n + d$$

where ##d## is a constant, the appropriate modification is

$$p(A) = aA^m + bA^n + d I$$

where ##I## is the identity matrix the same size as ##A##. Again, it is easy to show that

$$\begin{align*}
p(A) \, \vec x &= \left(a A^m + b A^n + dI\right) \vec x \\
&= \left(a \lambda^m + b \lambda^n + d\right) \vec x \\
&= p(\lambda) \, \vec x
\end{align*}$$

so again ##p(\lambda)## is an eigenvalue of ##p(A)##.

The case for a general polynomial requires a little more notation, but the steps are the same.

The idea: if ##A## is the matrix for a linear operator, so is ##p(A)## for any polynomial ##p##, and the eigenvalues behave "as we expect them to".
 