Determining Unitarily Equivalent Matrices

  • Thread starter: Shackleford
  • Tags: Equivalent

Homework Help Overview

The discussion revolves around determining whether two 3×3 matrices are unitarily equivalent, focusing on the relevant definitions and on properties of eigenvalues and eigenvectors. Participants examine what the matrices' characteristics imply about equivalence.

Discussion Character

  • Conceptual clarification, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the definitions of unitary equivalence, including the roles of eigenvalues and eigenvectors. There are inquiries about efficient methods for determining equivalence and the relationship between eigenvalues and unitary equivalence.

Discussion Status

The discussion is active, with participants offering differing views on the necessity of eigenvectors for establishing unitary equivalence. Some suggest that having the same eigenvalues is sufficient, while others emphasize the importance of corresponding eigenvectors and the dimensions of eigenspaces.

Contextual Notes

There is mention of specific matrices and their eigenvalues, as well as a reference to the tedious nature of finding eigenvectors. Participants also note the constraints of their definitions and assumptions regarding eigenvalues and eigenvectors.

Shackleford
I've had the flu all week.

Of course, the book defines unitarily equivalent, but it doesn't give an efficient method for determining whether two matrices are unitarily equivalent.

If A and B are similar and tr(B*B) = tr(A*A), then A and B are unitarily equivalent.

If A and B are normal matrices and have the same eigenvalues, then they are unitarily equivalent.

Is there an efficient way to determine if these matrices are unitarily equivalent?

\begin{bmatrix}
0 & 1 & 0 \\
-1 & 0 & 0 \\
0 & 0 & 1
\end{bmatrix}

\begin{bmatrix}
1 & 0 & 0 \\
0 & i & 0 \\
0 & 0 & -i
\end{bmatrix}
 
You can easily find the eigenvalues, no?
 
micromass said:
You can easily find the eigenvalues, no?

How does that relate to unitary equivalence?
 
I found that A and B are unitarily equivalent if they have the same sets of eigenvalues, counting multiplicity.

A = P*BP (unitarily equivalent)

det(A) = det(P*BP) = det(P*)det(B)det(P) = det(P*)det(P)det(B) = det(B)
det(A) = det(B)

Their characteristic polynomials must be equal.
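The determinant argument above can be checked numerically. A minimal sketch (the unitary P here is generated via a QR decomposition purely for illustration; it is not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary unitary P as the Q factor of a random complex matrix.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
P, _ = np.linalg.qr(M)

B = np.diag([1, 1j, -1j])
A = P.conj().T @ B @ P   # A = P*BP, so A and B are unitarily equivalent

# det(A) = det(P*)det(B)det(P) = det(B), since det(P*)det(P) = |det(P)|^2 = 1.
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True
```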
 
Shackleford said:
I found that A and B are unitarily equivalent if they have the same sets of eigenvalues, counting multiplicity.
No, that is not true. The matrices
A= \begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}
and
B= \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}
have the same eigenvalues (1 with multiplicity two) but are not unitarily equivalent because they do not have the same eigenvectors. A has every vector as an eigenvector, while B has only multiples of (1, 0) as eigenvectors.

Two matrices are "unitarily equivalent" if and only if they have the same eigenvalues and the same corresponding eigenvectors.

 
HallsofIvy said:
No, that is not true. The matrices
A= \begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}
and
B= \begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}
have the same eigenvalues (1 with multiplicity two) but are not unitarily equivalent because they do not have the same eigenvectors. A has every vector as an eigenvector, while B has only multiples of (1, 0) as eigenvectors.

Two matrices are "unitarily equivalent" if and only if they have the same eigenvalues and the same corresponding eigenvectors.

Okay, so I found the eigenvalues of each of the matrices: 1, -i, +i. Now I have the tedious job of finding the eigenvectors. -_-
 
Shackleford said:
Okay, so I found the eigenvalues of each of the matrices: 1, -i, +i. Now I have the tedious job of finding the eigenvectors. -_-

Halls is definitely wrong to say that they have to have the same eigenvectors. You just have to have the same number of linearly independent eigenvectors for every eigenvalue. You have three distinct eigenvalues. That means you don't have to compute the eigenvectors. Why?
 
Dick said:
Halls is definitely wrong to say that they have to have the same eigenvectors. You just have to have the same number of linearly independent eigenvectors for every eigenvalue. You have three distinct eigenvalues. That means you don't have to compute the eigenvectors. Why?

Ah, you're right. With three distinct eigenvalues, each eigenspace has dimension 1 for both matrices, so the dimensions of corresponding eigenspaces are equal.
 
