Zero-Trace Symmetric Matrix Is Orthogonally Similar to a Zero-Diagonal Matrix


Discussion Overview

The discussion concerns properties of real symmetric matrices, specifically whether a symmetric matrix with trace zero is orthogonally similar to a matrix with only zeros on the diagonal. The scope covers theoretical linear algebra and orthogonal matrix transformations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • The original poster asserts (citing a mechanics-of-solids text) that any real symmetric $n \times n$ matrix with trace zero is orthogonally similar to a matrix with zeros on the diagonal, and attempts a proof via the spectral theorem.
  • Another participant challenges the claim with a proposed $3 \times 3$ counterexample and suggests that an additional condition such as positive semi-definiteness may be needed; the proposed counterexample is later refuted by an explicit orthogonal transformation.
  • A participant gives a constructive proof: a rotation of two coordinates, whose existence follows from the intermediate value theorem, produces a zero in one diagonal position, and the argument is completed by induction on the size of the matrix.
  • There is a brief exchange about positive semi-definiteness: combined with zero trace it would force all eigenvalues to be zero, and it is acknowledged to be irrelevant to the original problem.

Areas of Agreement / Disagreement

The proposed counterexample is withdrawn after an explicit orthogonal transformation refutes it, and the constructive proof is accepted by the original poster. By the end of the thread the participants agree that the claim holds for real symmetric matrices of any size with trace zero, with no positive semi-definiteness assumption needed.

Contextual Notes

The proof is presented at the level of detail typical of a forum post: the key $(1,1)$-entry computation is left for the reader to check, and the inductive step and the final $2\times 2$ rotation are sketched rather than written out in full.

caffeinemachine
Hello MHB.

During my Mechanics of Solids course in my Mechanical Engineering curriculum I came across a certain fact about $3\times 3$ matrices.

It said that any symmetric $3\times 3$ matrix $A$ (with real entries) whose trace is zero is orthogonally similar to a matrix $B$ which has only zeroes on the diagonal.

In other words, given a symmetric matrix $A$ with $\text{trace}(A)=0$, there exists an orthogonal matrix $Q$ such that $QAQ^{-1}$ has only zeroes on the diagonal.

I think the above fact should be true not only for $3\times 3$ matrices but for symmetric matrices of any size.

So what I am trying to prove is the following:

Problem: Given a symmetric $n\times n$ matrix $A$ with real entries and $\text{trace}(A)=0$, there exists an orthogonal matrix $Q$ such that $QAQ^{-1}$ has only zeroes on the diagonal.

I have tried to attack the problem using the spectral theorem.
Since $A$ is symmetric, we know that there exists an orthogonal matrix $S$ such that $D=SAS^{-1}$ is a diagonal matrix.
We need to show that $D$ is orthogonally similar to a matrix with only zeroes on the diagonal.
Thus we have to find an orthogonal matrix $Q$ such that $QDQ^{-1}$ has only zeroes on the diagonal.
This is equivalent to showing that $\sum_{k=1}^n q^2_{ik}d_k=0$ for all $i\in \{1,\ldots,n \}$, where $q_{ij}$ is the $(i,j)$-th entry of $Q$ and $d_k$ is the $k$-th diagonal entry of $D$ (this sum is precisely the $(i,i)$-th entry of $QDQ^{-1}=QDQ^{T}$).
We also know that $d_1+\ldots+d_n=0$.
Here I am stuck.
From the above it can be seen that the proposition is true for $n=2$. For $n=3$ I have taken the fact from the book, but I cannot easily prove it. Can anybody help?
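To illustrate the $n=2$ case mentioned above (an illustrative aside): with $D=\begin{bmatrix} d & 0\\ 0 & -d\end{bmatrix}$ and $Q$ the rotation by $\pi/4$,
$$QDQ^{-1}=\frac12\begin{bmatrix} 1 & -1\\ 1 & 1\end{bmatrix}\begin{bmatrix} d & 0\\ 0 & -d\end{bmatrix}\begin{bmatrix} 1 & 1\\ -1 & 1\end{bmatrix}=\begin{bmatrix} 0 & d\\ d & 0\end{bmatrix},$$
which indeed has only zeroes on the diagonal.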
 
caffeinemachine said:
Problem: Given a symmetric $n\times n$ matrix $A$ with real entries and $\text{trace}(A)=0$, there exists an orthogonal matrix $Q$ such that $QAQ^{-1}$ has only zeroes on the diagonal. [...]

It's not true.
Counterexample:
$$A = \begin{bmatrix}
-1 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & +1
\end{bmatrix}$$

Additionally, you need for instance that the matrix is positive semi-definite.
 
I like Serena said:
It's not true.
Counterexample:
$$A = \begin{bmatrix}
-1 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & +1
\end{bmatrix}$$

Additionally, you need for instance that the matrix is positive semi-definite.

Thank you ILS for participating.

Put $$Q=
\begin{bmatrix}
1/\sqrt{2} & 0 & 1/\sqrt{2}\\
0& 1& 0\\
-1/\sqrt{2}& 0 &1/\sqrt{2}
\end{bmatrix}$$

Then $$QAQ^{-1}=\begin{bmatrix}
0 & 0 & 1\\
0 & 0 & 0\\
1 & 0 & 0
\end{bmatrix}$$

So this doesn't serve as a counterexample. Maybe there is another one.

Anyway, can you provide a proof (or hints) for why the additional hypothesis of positive semi-definiteness would be needed? Does this result have a name?
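As a quick numerical sanity check of the $QAQ^{-1}$ computation above (an editorial sketch in Python, assuming numpy is available; not part of the original post):

```python
import numpy as np

A = np.diag([-1.0, 0.0, 1.0])            # the proposed counterexample
s = 1.0 / np.sqrt(2.0)
Q = np.array([[  s, 0.0,   s],
              [0.0, 1.0, 0.0],
              [ -s, 0.0,   s]])

print(np.allclose(Q @ Q.T, np.eye(3)))   # True: Q is orthogonal, so Q^{-1} = Q^T
print(np.round(Q @ A @ Q.T, 12))         # [[0,0,1],[0,0,0],[1,0,0]]: zero diagonal
```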
 
Let $A = (a_{ij})$ be an $n\times n$ symmetric matrix with trace zero. The first step is to find an orthogonal transformation that converts $A$ to a matrix with a zero in the $(1,1)$-position on the diagonal. If $a_{11}=0$ there is nothing to prove. Otherwise, choose $j>1$ such that $a_{jj}$ has the opposite sign to $a_{11}$. Such a $j$ must exist because the diagonal entries sum to zero.

Now let $P_\theta$ be the orthogonal matrix given by rotating the $1$ and $j$ coordinates through an angle $\theta$ and leaving all the other coordinates alone. Specifically, the $2\times2$ submatrix of $P_\theta$ consisting of rows and columns $1$ and $j$ looks like $\begin{bmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$, $P_\theta$ has $1$s in all the other diagonal places, and zeros everywhere else. You can check that the $(1,1)$-element of $P_\theta AP_\theta^{-1}$ is $a_{11}\cos^2\theta - 2a_{1j}\cos\theta\sin\theta + a_{jj}\sin^2\theta$. When $\theta=0$ this is $a_{11}$. When $\theta=\pi/2$ it is $a_{jj}$, which has the opposite sign to $a_{11}$. By the intermediate value theorem there must be some value of $\theta$ for which this element is $0$.

For that value of $\theta$, $$ P_\theta AP_\theta^{-1} = \begin{bmatrix} 0& v \\ w & B \end{bmatrix},$$ where $v$ is a row vector, $w$ is a column vector (each with $n-1$ elements) and $B$ is a symmetric $(n-1)\times(n-1)$ matrix with trace $0$ (because $P_\theta AP_\theta^{-1}$ has the same trace as $A$).

Now proceed inductively. By the same process as above, you can successively find orthogonal transformations that convert $A$ to a matrix with increasingly many zeros down the diagonal. At the end, you will find that the final two diagonal elements $a_{(n-1)(n-1)}$ and $a_{nn}$ are negatives of each other and you can find an orthogonal transformation converting both of them to $0$.
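For readers who want to experiment, here is a minimal numerical sketch of the construction above (an editorial addition, not from the thread), in Python with numpy; the rotation angle is found by bisection, mirroring the intermediate value theorem step:

```python
import numpy as np

def zero_out_diagonal(A, tol=1e-12):
    """Given a symmetric matrix A with trace zero, return (Q, B) with Q orthogonal
    and B = Q A Q^T having a (numerically) zero diagonal."""
    B = np.array(A, dtype=float)
    n = B.shape[0]
    Q = np.eye(n)
    for i in range(n - 1):
        if abs(B[i, i]) <= tol:
            continue
        # The trailing block B[i:, i:] still has trace ~ 0, so some later diagonal
        # entry has the opposite sign to B[i, i].
        j = next(k for k in range(i + 1, n) if B[k, k] * B[i, i] < 0)

        def f(theta):
            # (i, i) entry after rotating coordinates i and j through angle theta
            c, s = np.cos(theta), np.sin(theta)
            return B[i, i] * c**2 - 2 * B[i, j] * c * s + B[j, j] * s**2

        # f(0) = B[i, i] and f(pi/2) = B[j, j] have opposite signs: bisect for a root.
        lo, hi = 0.0, np.pi / 2
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        c, s = np.cos(lo), np.sin(lo)

        P = np.eye(n)                       # the rotation P_theta from the proof
        P[i, i], P[i, j], P[j, i], P[j, j] = c, -s, s, c
        B = P @ B @ P.T
        Q = P @ Q
    return Q, B
```

For instance, `zero_out_diagonal(np.diag([-1.0, 0.0, 1.0]))` returns a zero-diagonal matrix orthogonally similar to the diagonal matrix discussed earlier in the thread (not necessarily the same zero-diagonal matrix shown there, since the construction is not unique). The scheme is close in spirit to Jacobi rotations, except that each rotation is chosen to zero a diagonal entry of the conjugated matrix rather than an off-diagonal one.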
 
caffeinemachine said:
So this doesn't serve as a counterexample. Maybe there is another one.

Good point.
I was mixing it up with diagonalizability, which is not the point here.

Anyway, Opalg has already given a proof.

caffeinemachine said:
Anyway, can you provide a proof (or hints) for why the additional hypothesis of positive semi-definiteness would be needed? Does this result have a name?

The additional condition of positive semi-definiteness means that all eigenvalues are non-negative.
Since the trace is the sum of the eigenvalues, it follows that a trace of zero implies that all eigenvalues are 0.
Still, this turns out to be irrelevant for your problem.
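Spelling this out in the notation of the first post (an aside): a positive semi-definite $A$ with zero trace would have to be the zero matrix, since
$$\lambda_i \ge 0,\qquad \sum_{i=1}^n \lambda_i = \text{trace}(A) = 0 \;\Longrightarrow\; \lambda_1=\cdots=\lambda_n=0 \;\Longrightarrow\; A = S^{-1}DS = 0,$$
so with that extra hypothesis the statement would hold trivially.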
 
Opalg said:
Let $A = (a_{ij})$ be an $n\times n$ symmetric matrix with trace zero. [...]
Thank You!
 
