# Cosine of a 2 by 2 matrix

1. Nov 24, 2012

### snesnerd

I have to find the cosine of the following:

| 1 0 |
| 2 2 |

This is a 2 by 2 matrix. I have been reading my linear algebra book for a while now, and there's no information at all on how to do something like this. I'm not sure how to find the cosine of a matrix, and I could not really find any pages online either.

2. Nov 24, 2012

### haruspex

The cosine of a matrix is defined by extending the power series definition of the cosine to take a matrix as its argument. You can do the same with exponentials, of course.
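(Editorial note: a small numerical sketch of this definition, not part of the original post. It sums the scalar Taylor series $\cos x = \sum_k (-1)^k x^{2k}/(2k)!$ with matrix powers in place of $x^k$; the function name `cos_matrix` and the truncation at 20 terms are illustrative choices.)

```python
import math
import numpy as np

def cos_matrix(A, terms=20):
    """cos(A) = sum_{k>=0} (-1)^k A^(2k) / (2k)!  -- the scalar series with matrix powers."""
    result = np.zeros_like(A, dtype=float)
    power = np.eye(A.shape[0])            # A^0 = I
    for k in range(terms):
        result += (-1) ** k / math.factorial(2 * k) * power
        power = power @ A @ A             # step from A^(2k) to A^(2k+2)
    return result

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])
print(cos_matrix(A))
```

Since the eigenvalues of this A are small, the series converges quickly and 20 terms is far more than enough.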

3. Nov 24, 2012

### lurflurf

There are many ways one of which is to factor your matrix

$$\begin{bmatrix} 1 & 0\\ 2 & 2 \end{bmatrix}=\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}\begin{bmatrix} 1 & 0\\ 0 & 2 \end{bmatrix}\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}$$

The cosine is then easy to find

$$\cos \begin{bmatrix} 1 & 0\\ 2 & 2 \end{bmatrix}=\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}\begin{bmatrix} \cos(1) & 0\\ 0 & \cos(2) \end{bmatrix}\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}$$
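(Editorial note: a quick numerical check of this diagonalization approach, not part of the original post. It is worth observing that lurflurf's $P = \begin{bmatrix}-1&0\\2&1\end{bmatrix}$ satisfies $P^2 = I$, so $P^{-1} = P$, which is why the same matrix appears on both sides of the factorization. The sketch below uses whatever eigenvector basis `numpy` returns, so its `P` differs from lurflurf's, but the resulting cosine is the same.)

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])

# Diagonalize: A = P D P^(-1), so cos(A) = P cos(D) P^(-1),
# where cos(D) just applies the cosine to each eigenvalue on the diagonal.
eigvals, P = np.linalg.eig(A)
cosA = P @ np.diag(np.cos(eigvals)) @ np.linalg.inv(P)
print(cosA)
```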

4. Nov 24, 2012

### snesnerd

How did you factor the matrix that way?

5. Nov 24, 2012

### Dick

lurflurf found the eigenvalues and eigenvectors and diagonalized the matrix.

6. Nov 24, 2012

### Fredrik

Staff Emeritus
Note that you need to use what haruspex said to understand why lurflurf's calculation makes sense.

7. Nov 25, 2012

### Ray Vickson

Besides what has already been suggested, here is another way: for an nxn matrix A whose eigenvalues $r_1, r_2, \ldots, r_n$ are all distinct, there exist matrices $E_1, E_2, \ldots, E_n$ such that for any analytic function f(x) we have
$$f(A) = E_1 f(r_1) + E_2 f(r_2) + \cdots + E_n f(r_n).$$ In your case you have two eigenvalues (1 and 2), so $f(A) = E_1 f(1) + E_2 f(2)$. You can easily determine $E_1$ and $E_2$ by applying this to the two functions $f(x) = 1 = x^0$ (giving $f(A) = I$, the identity matrix) and $f(x) = x$ (giving $f(A) = A$). That is, we have
$$I = E_1 \, 1^0 + E_2 \, 2^0 = E_1 + E_2,\\ A = E_1 \, 1 + E_2 \, 2 = E_1 + 2E_2,$$ and solving gives
$$E_1 = \pmatrix{1&0\\-2&0}, \: E_2 = \pmatrix{0&0\\2&1}.$$ Thus,
$$\cos(A) = E_1 \, \cos(1) + E_2 \, \cos(2).$$

Note that since $f(A) = E_1 f(1) + E_2 f(2)$ holds for any analytic $f$, you get lots of other results for free, like
$$A^{100} = E_1 1^{100} + E_2 2^{100}\\ A^n = E_1 + E_2 2^n \\ e^{At} = E_1 e^{1t} + E_2 e^{2t}, \: \text{ etc.}$$
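(Editorial note: a short numerical check of these projectors, not part of the original post. Subtracting the first equation $I = E_1 + E_2$ from the second $A = E_1 + 2E_2$ gives $E_2 = A - I$ directly.)

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])
I = np.eye(2)

# Solve I = E1 + E2 and A = E1 + 2*E2 for the projectors:
E2 = A - I      # second equation minus the first
E1 = I - E2     # E1 = [[1,0],[-2,0]], E2 = [[0,0],[2,1]], matching the post

cosA = np.cos(1.0) * E1 + np.cos(2.0) * E2
print(cosA)
```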

RGV

8. Nov 25, 2012

### snesnerd

Hmm, everything seems to make sense except one thing: I do not get lurflurf's result. My eigenvalues are 1 and 2, and when I solve I get:

|-1| when lambda is 1
| 2|

|0| when lambda is 2
|0|

So I get:

| -1 0 | which is what Ray Vickson got.
| 2 0 |

However, when I solve D = P^-1 A P, I just get a matrix with zeros for D, which is not the correct answer. Other than that, all the other reasoning makes sense to me.

9. Nov 25, 2012

### Dick

The zero vector is never an eigenvector. There are nonzero eigenvectors corresponding to the eigenvalue 2.
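(Editorial note: a quick check of this point, not part of the original post. For $\lambda = 2$, the system $(A - 2I)v = 0$ reads $\begin{bmatrix}-1&0\\2&0\end{bmatrix}v = 0$, which only forces $v_1 = 0$; any $v = (0, t)$ with $t \neq 0$ is an eigenvector. With that as the second column of $P$, the similarity transform does give $D = \mathrm{diag}(1, 2)$.)

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])

# (A - 2I)v = 0 forces v1 = 0, so v = (0, 1) is a nonzero eigenvector for lambda = 2:
v = np.array([0.0, 1.0])
print(A @ v)        # equals 2*v

# Using lurflurf's eigenvector matrix, P^(-1) A P is indeed diag(1, 2):
P = np.array([[-1.0, 0.0],
              [ 2.0, 1.0]])
D = np.linalg.inv(P) @ A @ P
print(D)
```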