# Distinct eigenvalues problem

1. May 15, 2009

### gtfitzpatrick

The real matrix $$A = \begin{pmatrix}\alpha & \beta \\ 1 & 0 \end{pmatrix}$$ has distinct eigenvalues $$\lambda_1$$ and $$\lambda_2$$.
If $$P = \begin{pmatrix}\lambda_1 & \lambda_2 \\ 1 & 0 \end{pmatrix}$$,

prove that $$P^{-1}AP = D = \mathrm{diag}(\lambda_1, \lambda_2)$$.

Deduce that, for every positive integer m, $$A^m = PD^mP^{-1}$$.

So I just tried to multiply the whole lot out ($$P^{-1}$$ is easy to find: just swap the diagonal entries and change the signs of the off-diagonal ones), and I got
$$\begin{pmatrix}\lambda_1(\alpha - \lambda_2) + \beta & \lambda_2(\alpha - \lambda_2) + \beta \\ \lambda_1(-\alpha + \lambda_1) - \beta & \lambda_2(-\alpha + \lambda_1) - \beta \end{pmatrix}$$

Am I on the right road with this, or should I be approaching it differently?
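For the second part of the question, once $$P^{-1}AP = D$$ is established, the power formula follows by telescoping (a standard step, sketched here):

$$A^m = (PDP^{-1})^m = PDP^{-1}\,PDP^{-1}\cdots PDP^{-1} = PD^mP^{-1},$$

since each interior factor $$P^{-1}P$$ cancels to the identity, and $$D^m = \mathrm{diag}(\lambda_1^m, \lambda_2^m)$$ because powers of a diagonal matrix are taken entrywise.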

2. May 15, 2009

### HallsofIvy

Re: Eigenvalues

Just doing the calculation should show it, but I don't get what you got when I do the calculation.

If
$$P= \begin{bmatrix}\lambda_1 & \lambda_2 \\ 1 & 0\end{bmatrix}$$
then
$$P^{-1}= \begin{bmatrix}0 & 1 \\ \frac{1}{\lambda_2} & -\frac{\lambda_1}{\lambda_2}\end{bmatrix}$$

Is that what you got?

You will also want to use the fact that the characteristic equation for A is $x^2- \alpha x- \beta= 0$ so $\lambda_1^2- \alpha \lambda_1- \beta= 0$ and $\lambda_2^2- \alpha \lambda_2- \beta= 0$.
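For reference, that characteristic equation comes straight from the determinant:

$$\det(A - \lambda I) = \det\begin{pmatrix}\alpha - \lambda & \beta \\ 1 & -\lambda\end{pmatrix} = -\lambda(\alpha - \lambda) - \beta = \lambda^2 - \alpha\lambda - \beta,$$

so every eigenvalue of $$A$$ satisfies $$\lambda^2 - \alpha\lambda - \beta = 0$$.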

3. May 15, 2009

### Horse

Re: Eigenvalues

Perhaps, something like:

$$\lambda_{1} = \frac{\alpha + \sqrt{\alpha^{2}+4\beta}}{2}$$

$$\lambda_{2} = \frac{\alpha - \sqrt{\alpha^{2}+4\beta}}{2}$$

Not sure, though, how it will solve the problem.
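As a numerical sanity check (a sketch, not part of the thread's pen-and-paper method), these root formulas and the diagonalization can be verified with plain Python; the values $$\alpha = 1$$, $$\beta = 2$$ are arbitrary illustrative choices, and the check uses the corrected $$P$$ with second row $$(1,\,1)$$ that appears later in the thread:

```python
import math

# Arbitrary illustrative values (not from the thread) giving distinct real eigenvalues.
alpha, beta = 1.0, 2.0

# Roots of the characteristic equation  lambda^2 - alpha*lambda - beta = 0.
disc = math.sqrt(alpha**2 + 4 * beta)
l1 = (alpha + disc) / 2   # root with the + sign
l2 = (alpha - disc) / 2   # root with the - sign

# Both roots satisfy the characteristic equation of A = [[alpha, beta], [1, 0]].
assert abs(l1**2 - alpha * l1 - beta) < 1e-12
assert abs(l2**2 - alpha * l2 - beta) < 1e-12

def matmul(X, Y):
    """2x2 matrix product, entry (i, j) = sum_k X[i][k] * Y[k][j]."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[alpha, beta], [1.0, 0.0]]
P = [[l1, l2], [1.0, 1.0]]          # corrected P: second row (1, 1)
det = l1 - l2                        # det P, nonzero since the roots are distinct
Pinv = [[1.0 / det, -l2 / det],      # swap diagonal, negate off-diagonal, divide by det
        [-1.0 / det, l1 / det]]

# P^{-1} A P should come out as diag(l1, l2).
D = matmul(Pinv, matmul(A, P))
assert abs(D[0][0] - l1) < 1e-12 and abs(D[1][1] - l2) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
print("P^-1 A P = diag(l1, l2) verified")
```

With these values the roots are 2 and -1, and the product collapses to the diagonal matrix as expected.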

4. May 15, 2009

### Random Variable

Re: Eigenvalues

Couldn't you prove it by showing that the columns of P are the eigenvectors of A?

5. May 15, 2009

### HallsofIvy

Re: Eigenvalues

Yes, if you already have that theorem. But the obvious way to do it is to just do the product $P^{-1}AP$.
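A sketch of that eigenvector route, assuming the corrected $$P$$ with second row $$(1,\,1)$$ that appears later in the thread: each column $$\begin{pmatrix}\lambda_i \\ 1\end{pmatrix}$$ of $$P$$ is an eigenvector of $$A$$, since

$$A\begin{pmatrix}\lambda_i \\ 1\end{pmatrix} = \begin{pmatrix}\alpha\lambda_i + \beta \\ \lambda_i\end{pmatrix} = \begin{pmatrix}\lambda_i^2 \\ \lambda_i\end{pmatrix} = \lambda_i\begin{pmatrix}\lambda_i \\ 1\end{pmatrix},$$

using $$\lambda_i^2 = \alpha\lambda_i + \beta$$ from the characteristic equation. Hence $$AP = PD$$, and since $$\lambda_1 \neq \lambda_2$$ makes $$P$$ invertible, $$P^{-1}AP = D$$.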

6. May 15, 2009

### gtfitzpatrick

Re: Eigenvalues

Thanks for all the replies. I had it wrong when getting $$P^{-1}$$: I forgot the factor $$1/\det P$$!!!

So for $$P^{-1}AP$$ I get $$\begin{bmatrix}\lambda_1 & \lambda_1 \\ \alpha-\lambda_1^2/\lambda_2+\alpha/\lambda_1 & \alpha - \lambda_1^2/\lambda_2\end{bmatrix}$$

I thought diag($$\lambda_1$$,$$\lambda_2$$) = $$D= \begin{bmatrix}\lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}$$

I still think I'm doing something wrong.
Confused!

7. May 18, 2009

### gtfitzpatrick

Re: Eigenvalues

After repeatedly trying and not getting anywhere, I've realised I had the question wrong: P should read $$\begin{pmatrix}\lambda_1 & \lambda_2 \\ 1 & 1 \end{pmatrix}$$

So I'm off to try this new version.

But if I wanted to prove it the way you said, Random Variable, how would I go about that?
Take
$$\lambda_1^2 - \alpha \lambda_1 - \beta = 0$$ and $$\lambda_2^2 - \alpha \lambda_2 - \beta = 0$$ and let them correspond to the columns of P?

8. May 19, 2009

### gtfitzpatrick

Re: Eigenvalues

Right, so now I get $$\frac{1}{\lambda_1-\lambda_2}\begin{bmatrix}\lambda_1(\alpha - \lambda_2) + \beta & \lambda_2(\alpha - \lambda_2) + \beta \\ \lambda_1(-\alpha + \lambda_1) - \beta & \lambda_2(-\alpha + \lambda_1) - \beta\end{bmatrix}$$

Not sure where to go from here...
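One way to finish, sketched here: substitute HallsofIvy's identities $$\beta = \lambda_1^2 - \alpha\lambda_1 = \lambda_2^2 - \alpha\lambda_2$$ into each entry, and they all collapse:

$$\lambda_1(\alpha - \lambda_2) + \beta = \alpha\lambda_1 - \lambda_1\lambda_2 + \lambda_1^2 - \alpha\lambda_1 = \lambda_1(\lambda_1 - \lambda_2)$$

$$\lambda_2(\alpha - \lambda_2) + \beta = \alpha\lambda_2 - \lambda_2^2 + \lambda_2^2 - \alpha\lambda_2 = 0$$

$$\lambda_1(-\alpha + \lambda_1) - \beta = \lambda_1^2 - \alpha\lambda_1 - \beta = 0$$

$$\lambda_2(-\alpha + \lambda_1) - \beta = \lambda_1\lambda_2 - \alpha\lambda_2 - \lambda_2^2 + \alpha\lambda_2 = \lambda_2(\lambda_1 - \lambda_2)$$

So after dividing by $$\lambda_1 - \lambda_2$$ (nonzero, since the eigenvalues are distinct), the product is $$\mathrm{diag}(\lambda_1, \lambda_2)$$, as required.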