odietrich
I'm looking for the general form of a symmetric 3×3 matrix (or tensor) ##\textbf{A}## with only two different eigenvalues, i.e. of a matrix with the diagonalized form ##\textbf{D}=\begin{pmatrix}a& 0 & 0\\0 & b & 0\\0 & 0 & b\end{pmatrix} = \text{diag}(a,b,b)##.
In general, such a matrix can be described by 4 parameters, e.g. the two eigenvalues ##a,b## and the direction of the eigenvector belonging to ##a##, defined by the angles ##\theta,\phi## (in spherical coordinates). The other eigenvectors lie in the plane perpendicular to this direction (with arbitrary in-plane orientation).
With these four parameters, ##(a,b,\theta,\phi)##, I can construct arbitrary matrices with the eigenvalues ##(a,b,b)## by conjugating ##\textbf{D}## with an appropriate rotation matrix ##\textbf{R}## (whose rows are the eigenvectors): ##\textbf{A} = \textbf{R}^T \textbf{D} \textbf{R}##.
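The construction above can be sketched in a few lines; the helper name `make_matrix` and the sample values are my own, not from the post:

```python
import numpy as np

def make_matrix(a, b, theta, phi):
    """Build A = R^T D R with eigenvalues (a, b, b), where the eigenvector
    for a points in the direction (theta, phi) in spherical coordinates."""
    # Unit eigenvector n for the non-degenerate eigenvalue a
    n = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    # Two orthonormal vectors spanning the plane perpendicular to n;
    # any in-plane orientation works because b is doubly degenerate
    u = np.array([np.cos(theta) * np.cos(phi),
                  np.cos(theta) * np.sin(phi),
                  -np.sin(theta)])
    v = np.cross(n, u)
    R = np.vstack([n, u, v])   # rows of R are the eigenvectors
    D = np.diag([a, b, b])
    return R.T @ D @ R

A = make_matrix(2.0, -1.0, 0.7, 1.3)
print(np.sort(np.linalg.eigvalsh(A)))  # approximately [-1. -1.  2.]
```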
Is there any other (well-known?) form or parametrization (with 4 independent parameters) of such matrices ##\textbf{A}##? Ideally, a parametrization without (spherical) angles, but closely related to the actual matrix entries in ##\textbf{A}##?
The background of this question is that I want to find such a matrix ##\textbf{A}## by least-squares fitting to measured data that depend on ##\textbf{A}##, and so far it seems more efficient to vary the 6 independent parameters ##(s_1,\ldots,s_6)## defining a general symmetric matrix ##\textbf{S}=\begin{pmatrix}s_1&s_2&s_3\\s_2&s_4&s_5\\s_3&s_5&s_6\end{pmatrix}## than to vary e.g. the 4 parameters ##(a,b,\theta,\phi)## from above. (More efficient means that the fit converges faster - despite having more degrees of freedom - and typically does not run into wrong local minima, which can happen depending on the initial values of the angles ##\theta,\phi##.) It might help if I could express 2 of the 6 parameters ##(s_1,\ldots,s_6)## in terms of the other 4 and use those remaining 4 parameters for fitting, so perhaps I can rephrase my question: are there two "simple" dependencies ##s_1=s_1(s_3,s_4,s_5,s_6)## and ##s_2=s_2(s_3,s_4,s_5,s_6)## (with ##s_1,\ldots,s_6## as in the general symmetric matrix above) that characterize such a matrix ##\textbf{A}##?
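To make the fitting setup concrete, here is a minimal sketch of the 4-parameter fit with `scipy.optimize.least_squares`; the "measured data" here is just a synthetic noisy matrix, and `make_matrix`, the noise level, and the starting values are all illustrative assumptions, not details from the post:

```python
import numpy as np
from scipy.optimize import least_squares

def make_matrix(a, b, theta, phi):
    """A = R^T diag(a, b, b) R, with the a-eigenvector along (theta, phi)."""
    n = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    u = np.array([np.cos(theta) * np.cos(phi),
                  np.cos(theta) * np.sin(phi),
                  -np.sin(theta)])
    v = np.cross(n, u)
    R = np.vstack([n, u, v])
    return R.T @ np.diag([a, b, b]) @ R

# Synthetic "measurement": true matrix plus small symmetric noise
rng = np.random.default_rng(0)
A_true = make_matrix(2.0, -1.0, 0.7, 1.3)
noise = 1e-3 * rng.standard_normal((3, 3))
A_meas = A_true + 0.5 * (noise + noise.T)

def residuals(p):
    a, b, theta, phi = p
    return (make_matrix(a, b, theta, phi) - A_meas).ravel()

fit = least_squares(residuals, x0=[1.0, 0.0, 0.5, 1.0])
print(fit.x)  # close to (2, -1, 0.7, 1.3); note n and -n give the same A
```

The comment on the last line hints at the local-minimum issue from the post: the map from angles to ##\textbf{A}## is not one-to-one (antipodal directions give the same matrix), which is one reason angle-based initial values can send the fit astray.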
(If this is a known problem, I'd also be grateful for pointers to any textbooks or articles dealing with it - I wasn't able to find any.)