General form of symmetric 3x3 matrix with only 2 eigenvalues

In summary, a symmetric 3×3 matrix with only two distinct eigenvalues can be described by 4 parameters: the two eigenvalues a and b, and the direction of the eigenvector of a, given by the spherical angles θ and φ. The same matrix can also be written as a general symmetric matrix with 6 independent entries, or built from the 4 parameters via a rotation matrix. Moreover, two of the six entries can be expressed in terms of the other four, which offers a more efficient way to fit the matrix to measured data.
  • #1
odietrich
I'm looking for the general form of a symmetric 3×3 matrix (or tensor) ##\textbf{A}## with only two different eigenvalues, i.e. of a matrix with the diagonalized form ##\textbf{D}=\begin{pmatrix}a& 0 & 0\\0 & b & 0\\0 & 0 & b\end{pmatrix} = \text{diag}(a,b,b)##.

In general, such a matrix can be described by 4 parameters, e.g. the two eigenvalues ##a,b## and the direction of the eigenvector of ##a## defined by the angles ##\theta,\phi## (in spherical coordinates). The other eigenvectors are in the plane perpendicular to this direction (with arbitrary in-plane orientation).

With these four parameters, ##(a,b,\theta,\phi)##, I can construct arbitrary matrices with the eigenvalues ##(a,b,b)## by conjugating ##\textbf{D}## with an appropriate rotation matrix ##\textbf{R}## (whose rows are the eigenvectors): ##\textbf{A} = \textbf{R}^T \textbf{D} \textbf{R}##.
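For illustration, here is a minimal numerical sketch of this construction (the particular completion of the eigenbasis by the second and third rows is one arbitrary but valid choice):

```python
import numpy as np

def symmetric_from_angles(a, b, theta, phi):
    """Build A = R^T diag(a, b, b) R; the rows of R form an orthonormal
    eigenbasis whose first member is the eigenvector of a."""
    v1 = np.array([np.sin(theta) * np.cos(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(theta)])                 # eigenvector of a
    v2 = np.array([np.cos(theta) * np.cos(phi),
                   np.cos(theta) * np.sin(phi),
                   -np.sin(theta)])                # a unit vector perpendicular to v1
    v3 = np.cross(v1, v2)                          # completes the orthonormal basis
    R = np.vstack([v1, v2, v3])
    return R.T @ np.diag([a, b, b]) @ R

A = symmetric_from_angles(a=3.0, b=1.0, theta=0.4, phi=1.2)
print(np.linalg.eigvalsh(A))                       # -> approx. [1., 1., 3.]
```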

Is there any other (well-known?) form or parametrization (with 4 independent parameters) of such matrices ##\textbf{A}##? Ideally, a parametrization without (spherical) angles, but closely related to the actual matrix entries in ##\textbf{A}##?

The background of this question is that I want to find such a matrix ##\textbf{A}## by least-squares fitting to measured data that depend on ##\textbf{A}##. Up to now, it seems more efficient to vary 6 independent parameters ##(s_1,\ldots,s_6)## defining a general symmetric matrix ##\textbf{S}=\begin{pmatrix}s_1&s_2&s_3\\s_2&s_4&s_5\\s_3&s_5&s_6\end{pmatrix}## than to vary e.g. the 4 parameters ##(a,b,\theta,\phi)## from above. (More efficient means that the fit converges faster, in spite of having more degrees of freedom, and typically does not run into wrong local minima, which sometimes happens depending on the initial values of the angles ##\theta,\phi##.)

It might help if I could express 2 of the 6 parameters ##(s_1,\ldots,s_6)## in terms of the other 4 and use these remaining 4 parameters for fitting. So I could perhaps rephrase my question: Are there two "simple" dependencies ##s_1=s_1(s_3,s_4,s_5,s_6)## and ##s_2=s_2(s_3,s_4,s_5,s_6)## (with ##s_1,\ldots,s_6## as in the general symmetric matrix given above) that describe the matrix ##\textbf{A}##?

(If this is a known problem, I'd be also grateful for pointing me to any textbooks or articles dealing with it - I wasn't able to find any.)
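To make the fitting setup concrete, here is a minimal sketch of the 6-parameter fit, assuming a purely hypothetical measurement model (noise-free projections ##u^T S u## along known unit directions); the actual data model may look quite different:

```python
import numpy as np
from scipy.optimize import least_squares

def sym_from_6(p):
    """Map the 6 parameters (s1, ..., s6) to the general symmetric matrix S."""
    s1, s2, s3, s4, s5, s6 = p
    return np.array([[s1, s2, s3],
                     [s2, s4, s5],
                     [s3, s5, s6]])

# Synthetic ground truth with eigenvalues (a, b, b) = (3, 1, 1).
v1 = np.array([0.3, 0.5, np.sqrt(1 - 0.3**2 - 0.5**2)])
S_true = 1.0 * np.eye(3) + (3.0 - 1.0) * np.outer(v1, v1)

# Hypothetical measurement model: projections u^T S u along known unit
# directions u; replace with whatever the data actually depend on.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(20, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
data = np.einsum('ij,jk,ik->i', dirs, S_true, dirs)

def residuals(p):
    return np.einsum('ij,jk,ik->i', dirs, sym_from_6(p), dirs) - data

fit = least_squares(residuals, x0=np.ones(6))
print(fit.x)   # recovers the entries of S_true for this noise-free data
```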
 
  • #2
Orodruin
You might want to rewrite your entire matrix as
$$
D = b I + (a-b) \begin{pmatrix}1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
$$
You can then use ##b## and the components of a single column vector as the parameter space. You will also have to separately treat the case of ##a-b## being negative, which can also be done with the same type of parametrisation.
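A minimal sketch of this parametrisation, with the negative case handled by a discrete sign flag (the function name and test values are illustrative):

```python
import numpy as np

def from_b_and_w(b, w, sign=+1.0):
    """A = b*I + sign * w w^T: four continuous parameters (b and the three
    components of w) plus a discrete sign for the case a < b.
    The remaining eigenvalue is a = b + sign*|w|^2, with eigenvector w/|w|."""
    w = np.asarray(w, dtype=float)
    return b * np.eye(3) + sign * np.outer(w, w)

A = from_b_and_w(b=1.0, w=[0.5, 1.0, -0.2])
print(np.linalg.eigvalsh(A))   # -> approx. [1., 1., 2.29], since |w|^2 = 1.29
```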
 
  • #3
odietrich
Thanks for your suggestion! I think the resulting parameters are very similar to another set of 4 numbers that I had considered before, but didn't mention in my question above: the eigenvector of ##a## multiplied by ##a##, together with the eigenvalue ##b##.
I tried least-squares fitting with these (latter) parameters as well, but this didn't work better than using ##(a,b,\theta,\phi)##. Somehow, fitting based on the (first) eigenvector (either in the form ##(\theta,\phi)## or in the form of a scaled 3-component eigenvector) is considerably worse than fitting with the (non-diagonalized) symmetric matrix ##\mathbf{S}## from above.
 
  • #4
odietrich
Considering Orodruin's suggestion in more detail, I found that I can write the symmetric matrix ##\textbf{S}=\begin{pmatrix}s_1&s_2&s_3\\s_2&s_4&s_5\\s_3&s_5&s_6\end{pmatrix}## as ##v\textbf{1} + (u-v) (\textbf{v}_1\otimes\textbf{v}_1) = v\textbf{1} + (u-v) \begin{pmatrix}r^2&rs&rt\\rs&s^2&st\\rt&st&t^2\end{pmatrix}##, where the first eigenvector is ##\textbf{v}_1=(r,s,t)^T## with ##r^2+s^2+t^2=1##, so:
$$\textbf{S} = v\textbf{1} + (u-v) \begin{pmatrix}r^2&rs&r\sqrt{1-r^2-s^2}\\rs&s^2&s\sqrt{1-r^2-s^2}\\r\sqrt{1-r^2-s^2}&s\sqrt{1-r^2-s^2}&1-r^2-s^2\end{pmatrix}$$
or (all together):
$$\begin{pmatrix}s_1&s_2&s_3\\s_2&s_4&s_5\\s_3&s_5&s_6\end{pmatrix}
= \begin{pmatrix}
v+(u-v)r^2 & (u-v)rs & (u-v)r\sqrt{1-r^2-s^2}\\
(u-v)rs & v+(u-v)s^2 & (u-v)s\sqrt{1-r^2-s^2}\\
(u-v)r\sqrt{1-r^2-s^2} & (u-v)s\sqrt{1-r^2-s^2} & v+(u-v)(1-r^2-s^2)\end{pmatrix}.$$
Thus, I have expressed the symmetric matrix ##\textbf{S}## by a matrix parametrized by ##(u,v,r,s)##.
Now, I would like to (partially) invert this and find some dependencies between the six parameters ##(s_1, \ldots, s_6)## based on this result. In theory, it must be possible to express e.g. ##s_3## and ##s_5## in terms of the other matrix entries ##s_1, s_2, s_4, s_6##. I wonder if these dependencies can be found by staring long enough at these two matrices ...

UPDATE (just to clarify): The last paragraph basically means that I would like to express e.g. ##s_3=(u-v)r\sqrt{1-r^2-s^2}## by an appropriate combination of the terms ##s_1, s_2, s_4, s_6##, i.e. by combining ##v+(u-v)r^2##, ##(u-v)rs##, ...
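A quick numerical check of this parametrization (taking the positive root for the third component of ##\textbf{v}_1##, which loses nothing since ##\textbf{v}_1## and ##-\textbf{v}_1## give the same matrix):

```python
import numpy as np

def S_from_uvrs(u, v, r, s):
    """S = v*I + (u - v) * v1 v1^T with v1 = (r, s, sqrt(1 - r^2 - s^2));
    requires r^2 + s^2 <= 1."""
    t = np.sqrt(1.0 - r**2 - s**2)
    v1 = np.array([r, s, t])
    return v * np.eye(3) + (u - v) * np.outer(v1, v1)

S = S_from_uvrs(u=3.0, v=1.0, r=0.3, s=0.5)
print(np.linalg.eigvalsh(S))   # -> approx. [1., 1., 3.]
```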
 
  • #5
odietrich
Here's a solution (took some staring):
$$s_1 = s_6 + s_3 (\frac{s_2}{s_5} - \frac{s_5}{s_2})$$
and
$$s_4 = s_6 + s_5 (\frac{s_2}{s_3} - \frac{s_3}{s_2}).$$
With this solution, I can express ##s_1## and ##s_4## by the matrix entries ##s_2, s_3, s_5##, and ##s_6##.
So, if the symmetric matrix ##\textbf{S}## has only two different eigenvalues ##(a,b,b)## and, thus, can be constructed as ##\textbf{S}=\textbf{R}^T \textrm{diag}(a,b,b)\textbf{R}##, then the elements of ##\textbf{S}## (indexed as defined in post #1) satisfy the two equations above. (Of course, the off-diagonal elements of ##\textbf{S}## must be nonzero for these expressions to be defined.)
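A quick numerical check of these two relations, using the ##(u,v,r,s)## parametrization from post #4 to generate a valid ##\textbf{S}##:

```python
import numpy as np

# Generate a valid S from the (u, v, r, s) parametrization of post #4 ...
u, v, r, s = 3.0, 1.0, 0.3, 0.5
v1 = np.array([r, s, np.sqrt(1 - r**2 - s**2)])
S = v * np.eye(3) + (u - v) * np.outer(v1, v1)
(s1, s2, s3), (_, s4, s5), (_, _, s6) = S

# ... and check the two relations from this post.
print(np.isclose(s1, s6 + s3 * (s2 / s5 - s5 / s2)))   # True
print(np.isclose(s4, s6 + s5 * (s2 / s3 - s3 / s2)))   # True
```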
 

What is a symmetric 3x3 matrix?

A symmetric 3x3 matrix is a square matrix with 3 rows and 3 columns in which the entries are equal to their corresponding entries reflected across the main diagonal. This means that the matrix is equal to its own transpose.

What does it mean to have only 2 eigenvalues in a symmetric 3x3 matrix?

In a symmetric 3x3 matrix, having only 2 eigenvalues means that there are only 2 distinct factors by which the matrix scales its eigenvectors. The matrix stretches or compresses one distinguished direction by the factor a, and every direction in the plane perpendicular to it by the factor b; the repeated eigenvalue thus corresponds to a whole plane of eigenvectors rather than a single direction.

How can you determine the general form of a symmetric 3x3 matrix with only 2 eigenvalues?

The general form of a symmetric 3x3 matrix with only 2 eigenvalues can be determined using the spectral theorem. This theorem states that a real symmetric matrix can be diagonalized by an orthogonal matrix, and the diagonal entries of the diagonalized form are the eigenvalues of the original matrix.
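As a minimal numerical illustration (using NumPy's eigendecomposition for symmetric matrices; the example matrix is arbitrary):

```python
import numpy as np

# Spectral theorem in action: eigh returns the eigenvalues and an
# orthogonal matrix Q of eigenvectors with S = Q diag(eigvals) Q^T.
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
eigvals, Q = np.linalg.eigh(S)
print(eigvals)                                        # [1., 3., 3.]: two distinct values
print(np.allclose(S, Q @ np.diag(eigvals) @ Q.T))     # True
```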

Why is it important to understand the general form of a symmetric 3x3 matrix with only 2 eigenvalues?

Understanding the general form of a symmetric 3x3 matrix with only 2 eigenvalues is important because it allows us to solve for the eigenvalues and eigenvectors of the matrix. This information can then be used to analyze the properties and behavior of the matrix in various mathematical and scientific applications.

Can a symmetric 3x3 matrix have more than 2 eigenvalues?

Yes. A symmetric 3x3 matrix always has 3 real eigenvalues (counted with multiplicity), and in general all 3 are distinct; symmetry does not force any of them to coincide. The case discussed in this thread, where only 2 of the eigenvalues are different, is a special (degenerate) case in which two of the three eigenvalues happen to be equal.
