What is the Cosine of a 2 by 2 Matrix?

  • Thread starter: snesnerd
  • Tags: Cosine Matrix

Homework Help Overview

The discussion revolves around finding the cosine of the 2 by 2 matrix

| 1 0 |
| 2 2 |

Participants express uncertainty about the methods available for calculating the cosine of a matrix, referencing linear algebra concepts.

Discussion Character

  • Exploratory, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the extension of the power series definition of cosine to matrices. Some suggest factoring the matrix and diagonalization as potential approaches. Questions arise regarding the methods used to factor the matrix and the validity of eigenvalues and eigenvectors in this context.

Discussion Status

The discussion is active, with various methods being explored. Some participants have provided insights into matrix diagonalization and the relationship between eigenvalues and the cosine function. However, there is no explicit consensus on the correct approach, and participants continue to question and clarify their understanding of the calculations involved.

Contextual Notes

There is mention of distinct eigenvalues and the need for nonzero eigenvectors, indicating that assumptions about the matrix's properties are being scrutinized. Participants also note difficulties in applying certain methods, such as the diagonalization process, which may be contributing to confusion.

snesnerd
I have to find the cosine of the following:

| 1 0 |
| 2 2 |

This is a 2 by 2 matrix. I have been reading my linear algebra book for a while now, and there's no information at all on how to do something like this. I'm not sure how to find the cosine of a matrix, and I could not really find any pages online either.
 
The cosine of a matrix is defined by extending the power series definition of cosine to take a matrix as its argument. You can do the same with the exponential, of course.
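As a concrete check of this definition (a numerical sketch using NumPy, not part of the thread), one can truncate the series ##\cos(A) = I - A^2/2! + A^4/4! - \cdots## directly:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])

def cos_matrix_series(A, terms=20):
    """Truncate cos(A) = sum_{k>=0} (-1)^k A^(2k) / (2k)!."""
    result = np.zeros_like(A)
    term = np.eye(len(A))              # k = 0 term: A^0 / 0! = I
    for k in range(terms):
        result += term
        # next term: multiply by -A^2 and divide by (2k+1)(2k+2)
        term = -term @ A @ A / ((2 * k + 1) * (2 * k + 2))
    return result

print(cos_matrix_series(A))
```

The eigenvalues of this matrix are small, so the series converges in a handful of terms; `scipy.linalg.cosm` computes the same thing.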
 
There are many ways; one of them is to factor your matrix:

[tex]\begin{bmatrix} 1 & 0\\ 2 & 2 \end{bmatrix}=\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}\begin{bmatrix} 1 & 0\\ 0 & 2 \end{bmatrix}\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}[/tex]

The cosine is then easy to find:

[tex]\cos \begin{bmatrix} 1 & 0\\ 2 & 2 \end{bmatrix}=\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}\begin{bmatrix} \cos(1) & 0\\ 0 & \cos(2) \end{bmatrix}\begin{bmatrix} -1 & 0\\ 2 & 1 \end{bmatrix}[/tex]
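This factorization can be verified numerically (a sketch assuming NumPy, not part of the thread). A curiosity of this example: the eigenvector matrix P is its own inverse, which is why the same matrix appears on both sides of the product:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])
P = np.array([[-1.0, 0.0],          # columns are eigenvectors for 1 and 2
              [ 2.0, 1.0]])
D = np.diag([1.0, 2.0])             # eigenvalues on the diagonal

# P is an involution (P @ P = I), so P^{-1} = P and A = P D P.
assert np.allclose(P @ P, np.eye(2))
assert np.allclose(A, P @ D @ P)

# Apply cos to the diagonal entries only, then undo the change of basis.
cos_A = P @ np.diag(np.cos(np.diag(D))) @ P
print(cos_A)
```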
 
How did you factor the matrix that way?
 
snesnerd said:
How did you factor the matrix that way?

lurflurf found the eigenvalues and eigenvectors and diagonalized the matrix.
 
Note that you need to use what haruspex said to understand why lurflurf's calculation makes sense.
 
snesnerd said:
I have to find the cosine of the following:

| 1 0 |
| 2 2 |

This is a 2 by 2 matrix. I have been reading my linear algebra book for a while now, and there's no information at all on how to do something like this. I'm not sure how to find the cosine of a matrix, and I could not really find any pages online either.

Besides what has already been suggested, here is another way: for an ##n \times n## matrix A whose eigenvalues ##r_1, r_2, \ldots, r_n## are all distinct, there exist matrices ##E_1, E_2, \ldots, E_n## such that for any analytic function f(x) we have
[tex]f(A) = E_1 f(r_1) + E_2 f(r_2) + \cdots + E_n f(r_n),[/tex] In your case you have two eigenvalues (1 and 2), so f(A) = E1*f(1) + E2*f(2). You can easily determine E1 and E2 by applying this to the two functions f(x) = 1 = x^0 (giving f(A) = I, the identity matrix) and f(x) = x (giving f(A) = A). That is we have
[tex]I = E_1 \, 1^0 + E_2 \, 2^0 = E_1 + E_2,\\
A = E_1 \, 1 + E_2 \, 2 = E_1 + 2E_2,[/tex] and solving gives
[tex]E_1 = \pmatrix{1&0\\-2&0}, \: E_2 = \pmatrix{0&0\\2&1}.[/tex] Thus,
[tex]\cos(A) = E_1 \, \cos(1) + E_2 \, \cos(2).[/tex]

Note that we have f(A) = E1*f(1) + E2*f(2), so, for example, we have lots of other results, like
[tex]A^{100} = E_1 1^{100} + E_2 2^{100}\\
A^n = E_1 + E_2 2^n \\
e^{At} = E_1 e^{1t} + E_2 e^{2t}, \: \text{ etc.}[/tex]

RGV
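The projector method above is easy to carry out numerically (a sketch assuming NumPy, not part of the thread; the names E1 and E2 follow the notation of the post):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])
I = np.eye(2)

# From I = E1 + E2 and A = E1 + 2*E2: subtracting gives E2 = A - I.
E2 = A - I
E1 = I - E2

cos_A = E1 * np.cos(1.0) + E2 * np.cos(2.0)   # f(A) = E1 f(1) + E2 f(2)
print(cos_A)

# The same projectors evaluate any analytic f, e.g. A^3 = E1 + E2 * 2**3.
assert np.allclose(E1 + E2 * 8.0, np.linalg.matrix_power(A, 3))
```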
 
Hmm, everything seems to make sense except one thing. I do not get lurflurf's results. My eigenvalues are 1 and 2, and when I solve I get:

|-1| when lambda is 1
| 2|

|0| when lambda is 2
|0|

So I get:

| -1 0 | which is what Ray Vickson got.
| 2 0 |

However, when I solve D = P^-1 A P, I just get a matrix with zeros for D, which is not the correct answer. Other than that, all the other reasoning makes sense to me.
 
snesnerd said:
Hmm, everything seems to make sense except one thing. I do not get lurflurf's results. My eigenvalues are 1 and 2, and when I solve I get:

|-1| when lambda is 1
| 2|

|0| when lambda is 2
|0|

So I get:

| -1 0 | which is what Ray Vickson got.
| 2 0 |

However, when I solve D = P^-1 A P, I just get a matrix with zeros for D, which is not the correct answer. Other than that, all the other reasoning makes sense to me.

The zero vector is never an eigenvector. There are nonzero eigenvectors corresponding to the eigenvalue 2.
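To see why (a sketch assuming NumPy, not part of the thread): solving (A − 2I)v = 0 only forces the first component of v to vanish, leaving the second component free.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 2.0]])

# (A - 2I) = [[-1, 0], [2, 0]], so (A - 2I)v = 0 forces v1 = 0 while
# v2 stays free: v = (0, 1) is a nonzero eigenvector for the eigenvalue 2.
v = np.array([0.0, 1.0])
print(A @ v)        # equals 2 * v
```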
 
