MHB Prove that this matrix equation has no roots

Fernando Revilla
I quote an unsolved problem from another forum.

Could you explain to me how to solve "more sophisticated" matrix equations such as this one?

Prove that $2X^2 + X = \begin{bmatrix} -1&5&3\\-2&1&2\\0&-4&-3\end{bmatrix}$ has no solutions in $M(3,3;\mathbb{R})$, the space of all $3\times 3$ matrices with real entries.

The characteristic polynomial of the given matrix $M$ is $\chi (\lambda)=-\lambda^3-3\lambda^2-17\lambda-11$. Its derivative $\chi'(\lambda)=-3\lambda^2-6\lambda-17$ has negative discriminant ($36-204<0$), so it has no real roots; since $\chi'(0)=-17<0$, we get $\chi'(\lambda)<0$ for all $\lambda\in\mathbb{R}$, which means that $\chi$ is strictly decreasing on $\mathbb{R}$.
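As a quick sanity check (not part of the original post), the characteristic polynomial and the sign of the discriminant of $\chi'$ can be verified in a few lines of pure Python; the helper names here are illustrative:

```python
# Sanity check: chi(t) = det(M - t*I) for the given 3x3 matrix M,
# computed by cofactor expansion along the first row.

def det3(a):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
          - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
          + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

M = [[-1,  5,  3],
     [-2,  1,  2],
     [ 0, -4, -3]]

def chi(t):
    """chi(t) = det(M - t*I)."""
    A = [[M[i][j] - (t if i == j else 0) for j in range(3)]
         for i in range(3)]
    return det3(A)

# Compare against the claimed cubic at enough integer points to pin
# down a degree-3 polynomial.
claimed = lambda t: -t**3 - 3*t**2 - 17*t - 11
assert all(chi(t) == claimed(t) for t in range(-5, 6))

# chi'(t) = -3t^2 - 6t - 17: negative discriminant means no real roots.
disc = (-6)**2 - 4 * (-3) * (-17)
assert disc < 0  # 36 - 204 = -168
```

Agreement at eleven points is more than enough to identify a cubic, so this confirms both the polynomial and the "no real critical points" claim.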

On the other hand, $\chi(-1)=4>0$ and $\chi(-1/2)=-25/8<0$. According to Bolzano's theorem, $\chi$ has a root $\beta\in (-1,-1/2)$. We conclude that $\beta$ is the only real eigenvalue of $M$.
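The root $\beta$ given by Bolzano's theorem can be located numerically as well; this bisection sketch uses only the sign change on $(-1,-1/2)$ and the fact that $\chi$ is strictly decreasing:

```python
# Locate the unique real root beta of chi(t) = -t^3 - 3t^2 - 17t - 11
# by bisection on (-1, -1/2), where chi changes sign.

def chi(t):
    return -t**3 - 3*t**2 - 17*t - 11

lo, hi = -1.0, -0.5
assert chi(lo) > 0 and chi(hi) < 0   # chi(-1) = 4, chi(-1/2) = -25/8

for _ in range(60):                  # chi is strictly decreasing
    mid = (lo + hi) / 2
    if chi(mid) > 0:
        lo = mid                     # root lies to the right of mid
    else:
        hi = mid                     # root lies to the left of mid

beta = (lo + hi) / 2
assert -1 < beta < -0.5
```

The iteration converges to roughly $\beta \approx -0.717$, comfortably inside $(-1,-1/2)$ as the theorem predicts.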

Suppose that there exists $X\in\mathbb{R}^{3\times 3}$ such that $2X^2+X=M$. Let $\alpha$ be a real eigenvalue of $X$ (at least one exists because the characteristic polynomial of $X$ has odd degree $3$). Then $2\alpha^2+\alpha$ is a real eigenvalue of $2X^2+X=M$, and since $\beta$ is the only real eigenvalue of $M$, necessarily $2\alpha^2+\alpha=\beta$.

But $f(\alpha)=2\alpha^2+\alpha-\beta$ has an absolute minimum at $\alpha=-1/4$, and $f(-1/4)=-1/8-\beta>0$ because $\beta<-1/2$, which is a contradiction. So the given equation has no solution.
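The final step can also be spot-checked: $2\alpha^2+\alpha$ is bounded below by $-1/8$ (completing the square gives $2(\alpha+1/4)^2-1/8$), while $\beta<-1/2$, so the two can never meet. A minimal sketch:

```python
# g(a) = 2a^2 + a attains its minimum at a = -1/4, where g(-1/4) = -1/8.
# Since beta < -1/2 < -1/8, the equation 2a^2 + a = beta has no real
# solution, which is the contradiction in the proof.

def g(a):
    return 2 * a**2 + a

assert g(-0.25) == -0.125

# Completing the square: 2a^2 + a = 2(a + 1/4)^2 - 1/8 >= -1/8,
# checked here on a fixed random sample.
import random
random.seed(0)
assert all(g(random.uniform(-10, 10)) >= -0.125 for _ in range(1000))
```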
 
Nice!

Fernando Revilla said:
then $2\alpha^2+\alpha$ is a real eigenvalue of $2X^2+X$ and one of those $\alpha$ must verify $2\alpha^2+\alpha=\beta$.

Suppose z is an imaginary eigenvalue of X.
Then $2z^2+z$ is an eigenvalue of $2X^2+X$.
This eigenvalue could be real, couldn't it?
 
ILikeSerena said:
Nice! Suppose z is an imaginary eigenvalue of X.
Then $2z^2+z$ is an eigenvalue of $2X^2+X$.
This eigenvalue could be real, couldn't it?

Yes, a priori. But in that case there would be two linearly independent vectors $v_1,v_2\in\mathbb{C}^3$ (eigenvectors of $X$ for the distinct eigenvalues $\alpha$ and $z$) such that:

$(2X^2+X)v_1=(2\alpha^2+\alpha)v_1=\beta v_1$
$(2X^2+X)v_2=(2z^2+z)v_2=\beta v_2$

Then $\beta$ would be an eigenvalue of $M$ of multiplicity at least two. But $\chi$ is strictly decreasing, so $\beta$ is a simple root of $\chi$. Contradiction.
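The simplicity of $\beta$ used here comes down to $\chi'$ never vanishing on $\mathbb{R}$, so $\chi$ and $\chi'$ share no root. A quick check of that fact:

```python
# beta is a simple root of chi because chi'(t) = -3t^2 - 6t - 17
# never vanishes on R: its discriminant is negative and it takes a
# negative value, so chi'(t) < 0 everywhere, including at t = beta.

def chi_prime(t):
    return -3 * t**2 - 6 * t - 17

disc = (-6)**2 - 4 * (-3) * (-17)   # 36 - 204 = -168
assert disc < 0                      # no real roots of chi'
assert chi_prime(0) == -17           # negative somewhere => negative everywhere
```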
 