What are the complex eigenvalues and eigenvectors of a 2x2 rotation matrix?

In summary, the 2x2 matrix representing a rotation of the xy-plane has no real eigenvalues except at the special angles θ = nπ for integer n. This reflects the geometrical fact that no vector in the plane is carried into its own direction by such a rotation, unlike in three dimensions, where vectors along the rotation axis are eigenvectors. The matrix does have complex eigenvalues and eigenvectors, which can be found by solving the characteristic equation and substituting back into the eigenvalue equation. A matrix S can then be constructed to diagonalise the original matrix T, and performing the similarity transformation explicitly yields a diagonal matrix with the eigenvalues on the diagonal.
  • #1
spaghetti3451

Homework Statement



The 2x2 matrix representing a rotation of the xy-plane is

[itex]T = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}[/itex]

Show that (except for certain special angles - what are they?) this matrix has no real eigenvalues. (This reflects the geometrical fact that no vector in the plane is carried into itself under such a rotation; contrast rotations in three dimensions.) This matrix does, however, have complex eigenvalues and eigenvectors. Find them. Construct a matrix S which diagonalises T. Perform the similarity transformation [itex]STS^{-1}[/itex] explicitly, and show that it reduces T to diagonal form.

Homework Equations



The Attempt at a Solution



The characteristic equation is [itex](\cos\theta - \lambda)^{2} + \sin^{2}\theta = 0[/itex], that is,
[itex]\lambda^{2} - 2\lambda\cos\theta + 1 = 0[/itex]. The discriminant of this quadratic is [itex]4\cos^{2}\theta - 4 = -4\sin^{2}\theta[/itex], which is negative unless [itex]\sin\theta = 0[/itex]. This means that the matrix has real eigenvalues iff θ = nπ for integer n.
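As a sanity check on the algebra (my own addition, not part of the original attempt), the characteristic polynomial and its roots can be reproduced with SymPy:

[code]
# Sanity check of the characteristic polynomial det(T - lambda*I) with SymPy.
import sympy as sp

theta = sp.symbols('theta', real=True)
lam = sp.symbols('lambda')

T = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
               [sp.sin(theta),  sp.cos(theta)]])

char_poly = sp.simplify((T - lam * sp.eye(2)).det())
print(char_poly)              # simplifies to lambda**2 - 2*lambda*cos(theta) + 1
print(sp.solve(char_poly, lam))
# roots come out as cos(theta) +/- sqrt(cos(theta)**2 - 1),
# i.e. cos(theta) +/- i*sin(theta) = exp(+/- i*theta)
[/code]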


The geometrical interpretation is that in the xy-plane, no nonzero vector is mapped onto a multiple of itself unless there is no rotation (θ = 0), the rotation is half a revolution (θ = ±180°), a full revolution (θ = ±360°), and so on - precisely the angles θ = nπ. The vector (0,0) has a special status in that it is always mapped onto itself, but by definition (0,0) is not an eigenvector.

In three dimensions the situation is different. The rotation is about a specified axis passing through the origin (the axis must pass through the origin, otherwise the transformation is not linear), and every vector lying along that axis is mapped onto itself, regardless of the angle. Since (0,0,0) is excluded by definition, every nonzero vector on the axis is an eigenvector (with eigenvalue 1). This is in contrast to a rotation of the plane, where no such invariant direction exists.



The eigenvalues are [itex]\lambda = e^{\pm i\theta}[/itex]. Substituting [itex]\lambda = e^{i\theta}[/itex] and eigenvector (x, y) into the eigenvalue equation gives two equations which reduce to y = -ix; substituting [itex]\lambda = e^{-i\theta}[/itex] likewise gives y = ix. So the eigenvalues are [itex]e^{i\theta}[/itex] and [itex]e^{-i\theta}[/itex], with corresponding eigenvectors (1, -i) and (1, i).
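Here is a quick numerical spot-check of these eigenpairs (a sketch of my own, with an arbitrarily chosen angle of 0.7 rad):

[code]
# Check T v = lambda v numerically for both claimed eigenpairs.
import numpy as np

theta = 0.7  # any non-special angle
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

for lam, v in [(np.exp(1j * theta),  np.array([1, -1j])),
               (np.exp(-1j * theta), np.array([1,  1j]))]:
    print(np.allclose(T @ v, lam * v))   # True for both pairs
[/code]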



The matrix [itex]S^{-1}[/itex] is given by
[itex]S^{-1} = \begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}[/itex]
From this, the matrix S is found to be
[itex]S = \begin{pmatrix} 1/2 & i/2 \\ 1/2 & -i/2 \end{pmatrix}[/itex]

Performing the similarity transformation [itex]STS^{-1}[/itex] explicitly results in the matrix
[itex]STS^{-1} = \begin{pmatrix} e^{i\theta} & 0 \\ 0 & e^{-i\theta} \end{pmatrix}[/itex].
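The diagonalisation can also be verified numerically; the following sketch (my addition, reusing the S⁻¹ above and the same arbitrary angle) confirms that STS⁻¹ is diagonal with e^{±iθ} on the diagonal:

[code]
# Verify S T S^{-1} = diag(e^{i*theta}, e^{-i*theta}) numerically.
import numpy as np

theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
S_inv = np.array([[1,   1],
                  [-1j, 1j]])
S = np.linalg.inv(S_inv)   # equals [[1/2, i/2], [1/2, -i/2]]

D = S @ T @ S_inv
print(np.round(D, 10))     # off-diagonal entries vanish (to rounding)
print(np.allclose(np.diag(D), [np.exp(1j * theta), np.exp(-1j * theta)]))   # True
[/code]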



I would greatly appreciate any comments on the solution.
 
  • #2
hi failexam! :wink:

yes that's fine :smile:, except …

i] in 3D, there's an extra row and column, with a 1 and 0s, so there's an extra eigenvalue of 1 (and the question only asks about a plane anyway :wink:)

ii] you could divide by detS, so that both S and S⁻¹ have 1/√2s, which would make the S and S⁻¹ pair look more symmetric (but you don't have to, since the question doesn't ask for it)
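In case it helps, here is a small sketch of one reading of that hint (my own addition): normalising the eigenvector columns to unit length puts the 1/√2 factors into both matrices, and then S⁻¹ is unitary, so S is just its conjugate transpose:

[code]
# Normalised version of S^{-1}: columns are the unit-length eigenvectors.
import numpy as np

S_inv = np.array([[1,   1],
                  [-1j, 1j]]) / np.sqrt(2)   # each column now has norm 1
S = S_inv.conj().T                           # unitary, so the inverse is the dagger
print(np.allclose(S @ S_inv, np.eye(2)))     # True
[/code]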
 

What is an eigenvalue matrix problem?

An eigenvalue matrix problem is a mathematical problem that involves finding the eigenvalues and eigenvectors of a given square matrix. It is commonly used in fields such as linear algebra, physics, and engineering to analyze and solve systems of equations.

How do you find the eigenvalues of a matrix?

To find the eigenvalues of a matrix A, form the characteristic polynomial det(A - λI) and solve for the values of λ that make it equal to zero. These roots are the eigenvalues of the matrix.
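As a minimal illustration (my own made-up 2x2 matrix, unrelated to the rotation matrix in the thread), NumPy computes exactly these roots of det(A - λI) = 0:

[code]
# Eigenvalues of a small example matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.eigvals(A))   # roots of det(A - lambda*I) = 0; here 3 and 1
[/code]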

What is the significance of eigenvalues in matrix problems?

Eigenvalues are important in matrix problems because each one is the factor by which the transformation scales its corresponding eigenvector. Working in a basis of eigenvectors decouples a system into independent one-dimensional pieces, which makes calculations and solutions of complex systems of equations easier and more efficient.

How do you find the eigenvectors of a matrix?

To find the eigenvectors of a matrix, you must first solve for the eigenvalues. Then, substitute each eigenvalue λ into the equation (A - λI)v = 0 and solve for the corresponding eigenvector v using Gaussian elimination or other methods.
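Continuing the same made-up example, NumPy's eig returns the eigenvalues together with the eigenvectors (as the columns of V), and each pair can be checked against Av = λv:

[code]
# Eigenvalues and eigenvectors of the same example matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, V = np.linalg.eig(A)
for lam, v in zip(eigvals, V.T):
    print(lam, v, np.allclose(A @ v, lam * v))   # each pair satisfies A v = lam v
[/code]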

What are some real-world applications of eigenvalue matrix problems?

Eigenvalue matrix problems have many practical applications, such as in computer graphics, quantum mechanics, and population dynamics. They are also used in data analysis and machine learning algorithms to identify patterns and relationships in large datasets.
