# From system of first-order to a single ODE

1. Feb 19, 2012

### rsq_a

Is there an easy way to show that the system:

\begin{align} x_1' &= p_{11} x_1 + p_{12} x_2 + \ldots + p_{1n} x_n \\ x_2' &= p_{21} x_1 + p_{22} x_2 + \ldots + p_{2n} x_n \\ \ldots &= \ldots \\ x_n' &= p_{n1} x_1 + p_{n2} x_2 + \ldots + p_{nn} x_n \end{align}

must be equivalent to a single nth order differential equation, like
$$a_n y^{(n)} + a_{n-1} y^{(n-1)} + \ldots + a_0 y = 0$$

All the $$p_{ij} = p_{ij}(t)$$ and $$a_i = a_i(t)$$. In the case that n = 2, it's easy to show just by manipulation. I assume that it's true in general, but I can't find a slick way to do it.
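For reference, here is a sketch of the n = 2 manipulation, under the assumption that $p_{12} \not\equiv 0$ (the elimination breaks down where $p_{12}$ vanishes):

```latex
\begin{align}
x_2 &= \frac{x_1' - p_{11} x_1}{p_{12}}
  && \text{(solve the first equation for } x_2\text{)} \\
x_1'' &= p_{11}' x_1 + p_{11} x_1' + p_{12}' x_2 + p_{12} x_2'
  && \text{(differentiate the first equation)} \\
x_1'' &= p_{11}' x_1 + p_{11} x_1' + p_{12}' x_2
       + p_{12}\left( p_{21} x_1 + p_{22} x_2 \right)
  && \text{(substitute the second equation)}
\end{align}
```

Replacing $x_2$ by $(x_1' - p_{11} x_1)/p_{12}$ in the last line leaves a single second-order linear ODE in $x_1$ alone.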

2. Feb 20, 2012

### bigfooted

Interesting... I never thought about it before; usually you want to reduce the nth order ode to the system, not the other way around. An nth order ode is always equivalent to a system of n first order odes. But is a system of n first order odes always equivalent to an nth order ode?

For constant coefficients, the eigenvalues of the n×n matrix of the first order system give you the n roots of the characteristic equation of the equivalent nth order ode. And the n roots of the characteristic equation of an nth order ode give you the n eigenvalues of the equivalent system.
So if the matrix has n distinct eigenvalues, it gives you an equivalent nth order ode. Otherwise, the system need not be equivalent to an nth order ode.
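As a minimal check of this correspondence (my own sketch, assuming constant coefficients and a concrete example equation), take $y''' - 6y'' + 11y' - 6y = 0$, whose characteristic roots are 1, 2, 3, and verify that each root is an eigenvalue of the companion matrix of the equivalent first-order system:

```python
# Sketch, assuming CONSTANT coefficients: the companion matrix of
#   y''' - 6*y'' + 11*y' - 6*y = 0   (characteristic roots 1, 2, 3)
# should have exactly those roots as eigenvalues.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# x = (y, y', y'') turns the scalar ODE into the first-order system x' = C x
C = [[0, 1, 0],
     [0, 0, 1],
     [6, -11, 6]]

for r in (1, 2, 3):
    # r is an eigenvalue of C exactly when det(C - r*I) = 0
    shifted = [[C[i][j] - (r if i == j else 0) for j in range(3)]
               for i in range(3)]
    print(r, det3(shifted))  # prints 0 for each root
```

Going the other way, the last row of the companion matrix just reads off the coefficients of the characteristic equation, which is the claimed correspondence in the constant-coefficient case.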

3. Feb 21, 2012

### rsq_a

I think in the constant-coefficient case it's perhaps a bit easier to reason about (??)

Here is one argument just by counting unknowns. Maybe the first non-trivial example is with 3 dependent variables:

\begin{align} x_1' &= p_{11} x_1 + p_{12} x_2 + p_{13} x_3 \\ x_2' &= p_{21} x_1 + p_{22} x_2 + p_{23} x_3 \\ x_3' &= p_{31} x_1 + p_{32} x_2 + p_{33} x_3 \end{align}

where p = p(t). Differentiating the third equation twice, we get
$$x_3'' = f(x_1, x_1', x_1'', x_2, x_2', x_2'', x_3, x_3', x_3'')$$

There are thus 6 unknowns (x1, x2, and their first two derivatives). If you look at the above system, you immediately have two equations for x1 and x2, and you get another 4 by differentiating both equations twice. Thus 6 equations and 6 unknowns. So in some cases, you can solve for {x1, x1', x1'', x2, x2', x2''} in terms of x3, and that gives you an uncoupled equation for x3.
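In the constant-coefficient case there is a cleaner route to the uncoupled equation (a sketch, with an illustrative matrix P of my own choosing): by Cayley–Hamilton, P satisfies its own characteristic polynomial, and since $x^{(k)} = P^k x$ for $x' = Px$, every component $x_i$ then satisfies the scalar third-order ODE with that polynomial. Note this shows each component solves *some* third-order equation; it does not by itself show the system is equivalent to it.

```python
# Sketch (constant coefficients, illustrative matrix P): Cayley-Hamilton
# gives  P^3 - tr(P)*P^2 + m2*P - det(P)*I = 0,  and since x^(k) = P^k x
# for x' = P x, every component x_i satisfies
#   x_i''' - tr(P)*x_i'' + m2*x_i' - det(P)*x_i = 0.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

P = [[1, 2, 0],
     [0, 3, 1],
     [2, 0, 1]]

tr = P[0][0] + P[1][1] + P[2][2]
# m2 = sum of the three principal 2x2 minors of P
m2 = (P[0][0]*P[1][1] - P[0][1]*P[1][0]
    + P[0][0]*P[2][2] - P[0][2]*P[2][0]
    + P[1][1]*P[2][2] - P[1][2]*P[2][1])
d = det3(P)

P2 = matmul(P, P)
P3 = matmul(P2, P)
I = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

residual = [[P3[i][j] - tr*P2[i][j] + m2*P[i][j] - d*I[i][j]
             for j in range(3)] for i in range(3)]
print(residual)  # all entries are 0: Cayley-Hamilton holds for this P
```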

Can someone give some thoughts on where this might fail, and what that means for the eventual high order equation in x3?

4. Feb 21, 2012

### kai_sikorski

Seems to me that if you differentiate twice it's
$$x_3''' = f(x_1, x_1', x_1'', x_2, x_2', x_2'', x_3, x_3', x_3'')$$
Well, if you differentiate twice here you'll get, for example, an equation for $x_1'''$; so you'll be able to solve for {x1, x1', x1'', x2, x2', x2''}, but the expressions will involve $x_1'''$ and $x_2'''$. I think I vaguely remember finding out at some point that this inversion can't be done in general, but I can't remember how to construct a counterexample.
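One standard failure case (not from this thread, and restricted to constant coefficients) is a fully decoupled system:

```latex
\begin{align}
x_1' &= x_1, & x_2' &= x_2, & x_3' &= x_3.
\end{align}
```

Differentiating the $x_3$ equation only ever reproduces $x_3' = x_3$; $x_1$ and $x_2$ never appear, so the six equations of the counting argument are degenerate and cannot be solved for $x_1, x_2$ in terms of $x_3$. Since knowing $x_3$ gives no information about $x_1$ and $x_2$, no single third-order equation in $x_3$ can be equivalent to the system. For constant matrices the elimination succeeds exactly when the matrix has a cyclic vector, i.e. its minimal polynomial equals its characteristic polynomial, which fails for the identity.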

5. Feb 21, 2012

### kai_sikorski

Also, why is the 2x2 case trivial?

6. Feb 21, 2012

### rsq_a

...I've no idea why I thought it was a few days ago. You're right, I was speaking nonsense.