# Nth order linear whose auxiliary has repeated roots

1. Apr 7, 2013

### Bipolarity

Suppose I am to solve an nth order linear homogeneous differential equation with constant coefficients. I set up the auxiliary equation, find its roots, and then each root $r$ gives me a solution of the form $e^{rx}$ to the ODE which is linearly independent of the others. But if there are repeated roots, I need to multiply that solution by $x$ for each repetition of the root to obtain a fundamental set of $n$ linearly independent solutions.

What assurance is there that multiplying by $x$ will yield a solution? And that this solution is linearly independent of all the others? Could anyone provide an outline of the proof, suggest how I might prove it, or point me to a textbook that contains a proof?

I have not yet studied annihilator operators or the algebra of polynomial differential operators, but might studying these topics make my task a bit easier?

Thanks!

BiP

2. Apr 7, 2013

### HallsofIvy

Staff Emeritus
If you have, say, a double root, $a$, of the characteristic equation, then the differential operator must contain $(D- a)^2$ as a factor ("$D$" is the derivative). You can then check that both $e^{ax}$ and $xe^{ax}$ are solutions:
$$(D- a)e^{ax}= D(e^{ax})- ae^{ax}= ae^{ax}- ae^{ax}= 0,$$
so $(D- a)^2 e^{ax}= (D- a)0= 0$, while
$$(D- a)xe^{ax}= D(xe^{ax})- axe^{ax}= e^{ax}+ axe^{ax}- axe^{ax}= e^{ax},$$
and then $(D- a)^2 xe^{ax}= (D- a)e^{ax}= 0$ as before.
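For anyone who wants to check this chain of computations mechanically, here is a small sketch (my own addition, not part of the original post) using Python's sympy. It applies $(D - a)$ twice to each candidate solution and reproduces the results above.

```python
# A small check, outside the thread, using sympy: apply (D - a) twice to each
# candidate solution and confirm the chain of computations above.
import sympy as sp

x, a = sp.symbols('x a')

def D_minus_a(f):
    """Apply the operator (D - a) to f, where D is d/dx."""
    return sp.diff(f, x) - a * f

for f in (sp.exp(a * x), x * sp.exp(a * x)):
    once = sp.simplify(D_minus_a(f))
    twice = sp.simplify(D_minus_a(once))
    print(f, '->', once, '->', twice)
# exp(a*x)   -> 0        -> 0
# x*exp(a*x) -> exp(a*x) -> 0
```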

The fact that $xe^{ax}$ and $e^{ax}$ are "independent" follows from the definition:
if $Axe^{ax}+ Be^{ax}= 0$ for all $x$, then taking $x= 0$ gives $A(0)+ B(1)= B= 0$, and then taking $x= 1$ gives $A(e^a)= 0$, so $A= 0$. Showing that $xe^{ax}$ is independent of other possible solutions, such as $e^{bx}$, is essentially the same.
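Another standard independence check, not the one used in the post, is the Wronskian: for $e^{ax}$ and $xe^{ax}$ it works out to $e^{2ax}$, which never vanishes. A short sympy sketch (my own addition):

```python
# The Wronskian of e^(a x) and x e^(a x) is e^(2 a x), which is never zero,
# so the two functions are linearly independent on any interval.
import sympy as sp

x, a = sp.symbols('x a')
f, g = sp.exp(a * x), x * sp.exp(a * x)
W = sp.simplify(f * sp.diff(g, x) - g * sp.diff(f, x))
print(W)  # exp(2*a*x)
```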

3. Apr 7, 2013

### Bipolarity

I see. I am quite new to the notation for differential operators, but my guess is that multiplication of polynomial differential operators is to be interpreted as composition of those operators applied to some function $y(x)$; otherwise what you said would not be correct.

Also, if composition of two polynomial differential operators on some function is equivalent to multiplication of polynomials, is the composition of differential operators commutative?

Also, I'm guessing the general case could be proved by induction when there is a root of higher multiplicity in the auxiliary (characteristic) equation.

Also, does my question/your reply have anything to do with the Shift Theorem? It's not in my book, but I found it on Wikipedia, and it seemed very similar.

Thanks!

BiP

4. Apr 11, 2013

### LCKurtz

Here's another way to look at it. Call your linear equation $L(y) = 0$, where $L$ represents the linear differential operator. When you substitute $y = e^{rx}$ into it you get
$$L(e^{rx}) = p(r)e^{rx},$$
where $p(r)$ is the characteristic polynomial. Now suppose $r=a$ is a double root of $p(r)$, which means that $p(a) = 0$ and $p'(a)=0$. Differentiate the equation above with respect to $r$:
$$\frac{\partial}{\partial r}L(e^{rx})= L\left(\frac{\partial}{\partial r}e^{rx}\right)=L(xe^{rx})=p'(r)e^{rx} + p(r)xe^{rx}.$$
Substituting $r = a$ into that gives
$$L(xe^{ax})=p'(a)e^{ax} + p(a)xe^{ax}=0,$$
which shows $xe^{ax}$ is a solution. Of course, interchanging the partial derivative with respect to $r$ and the operator $L$ uses the equality of mixed partial derivatives. The argument generalizes to roots of higher multiplicity.
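As a concrete sanity check of this identity (my own example, not from the post): take $L(y) = y'' - 2ay' + a^2 y$, whose characteristic polynomial is $p(r) = (r-a)^2$ with double root $r = a$. A sympy sketch:

```python
# A concrete check of the identity above: L(y) = y'' - 2a y' + a^2 y has
# characteristic polynomial p(r) = (r - a)^2, so r = a is a double root.
import sympy as sp

x, r, a = sp.symbols('x r a')
L = lambda f: sp.diff(f, x, 2) - 2 * a * sp.diff(f, x) + a**2 * f
p = (r - a)**2

# L(e^(r x)) = p(r) e^(r x), as in the post
assert sp.simplify(L(sp.exp(r * x)) - p * sp.exp(r * x)) == 0

# Double-root conclusion: L(x e^(a x)) = p'(a) e^(a x) + p(a) x e^(a x) = 0
assert sp.simplify(L(x * sp.exp(a * x))) == 0
print("both identities check out")
```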

5. Apr 12, 2013

### MostlyHarmless

So long as the coefficients of those operators are constant, i.e. $2D = D2$ but $xD \neq Dx$.
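A quick sympy sketch of this point (my own illustration): two constant-coefficient factors commute, but $xD$ and $Dx$ differ by the identity operator, since $D(xy) = y + xy'$ by the product rule.

```python
# Constant-coefficient factors commute, but x*D and D*x differ by the
# identity operator, because D(x y) = y + x y' by the product rule.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')(x)
D = lambda f: sp.diff(f, x)

# (D - 1)(D - 2)y equals (D - 2)(D - 1)y: constant coefficients commute.
lhs = D(D(y) - 2 * y) - (D(y) - 2 * y)
rhs = D(D(y) - y) - 2 * (D(y) - y)
assert sp.simplify(lhs - rhs) == 0

# D applied to (x*y) versus x times D(y): they differ by y itself.
assert sp.simplify(D(x * y) - x * D(y) - y) == 0
print("constant coefficients commute; xD != Dx")
```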