How do we write a sinusoidal solution to a 2nd order DE as a sum of exponentials raised to complex roots?

The discussion focuses on deriving a sinusoidal solution for a second-order differential equation with complex roots when the discriminant is negative (Δ < 0). The general solution is expressed as y(x) = e^(-ax/2)(c1 sin(kx) + c2 cos(kx)), where k = √(-Δ)/2. It is noted that this solution can be represented in a complex exponential form, linking real and imaginary parts to the original equation. The conversation also touches on the relationship between oscillatory solutions and the nature of the coefficients in the differential equation, emphasizing that complex solutions yield real-valued results when combined appropriately. The conclusion highlights the importance of understanding the interplay between real and imaginary components in these solutions.
zenterix
Homework Statement
I have a question that has bugged me for a long time about sinusoidal solutions to linear 2nd order differential equations with constant coefficients.
Relevant Equations
I can solve these differential equations, but I would like to understand how we get from a real solution of a 2nd order linear differential equation with constant coefficients to a form (which I imagine is a complex solution) whose exponentials are raised to the complex roots of the characteristic polynomial.

Can this be done?
Consider the differential equation

$$y''+ay'+by=0$$

We have analytical solutions for this equation.

There are three cases to consider based on the discriminant of the characteristic polynomial associated with the equation.

$$\Delta=a^2-4b$$

I just want to discuss the case where $$\Delta <0$$.

It can be shown that if we define ##k=\frac{\sqrt{-\Delta}}{2}## the general solution is

$$y(x)=e^{-ax/2}(c_1\sin{kx}+c_2\cos{kx})$$
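As a quick sanity check, a short script can verify numerically that this form satisfies the ODE. The coefficients here are made up for illustration: ##a=2##, ##b=10##, so ##\Delta=-36<0## and ##k=3##.

```python
import math

# Hypothetical coefficients (not from the thread): a = 2, b = 10 gives
# Delta = a^2 - 4b = -36 < 0 and k = sqrt(-Delta)/2 = 3.
a, b = 2.0, 10.0
k = math.sqrt(4 * b - a * a) / 2
c1, c2 = 1.5, -0.7  # arbitrary constants of integration

def y(x):
    """General solution for the Delta < 0 case."""
    return math.exp(-a * x / 2) * (c1 * math.sin(k * x) + c2 * math.cos(k * x))

# Verify y'' + a y' + b y = 0 using central finite differences.
h = 1e-4
for x in (0.0, 0.3, 1.1):
    yp = (y(x + h) - y(x - h)) / (2 * h)
    ypp = (y(x + h) - 2 * y(x) + y(x - h)) / h ** 2
    assert abs(ypp + a * yp + b * y(x)) < 1e-3
```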

So here is my question.
In the other two possible cases (##\Delta=0## or ##\Delta >0##) we can write the general solutions, respectively, as

$$y(x)=c_1e^{rx}+c_2xe^{rx}=(c_1+c_2x)e^{-ax/2}$$

$$y(x)=c_1e^{r_1x}+c_2e^{r_2x}$$

Is there a way to express the general solution in the case ##\Delta <0## in a form similar to these?

For example, suppose ##\Delta <0##. Then the two roots are ##\frac{-a\pm\sqrt{\Delta}}{2}=\frac{-a\pm i\sqrt{-\Delta}}{2}##.

Let's start with the general solution

$$y(x)=e^{-ax/2}(c_1\sin{kx}+c_2\cos{kx})$$

$$=e^{-ax/2}A\cos{(kx-\phi)}$$

where ##k=\frac{\sqrt{-\Delta}}{2}##, ##A=\sqrt{c_1^2+c_2^2}##, and ##\phi=\tan^{-1}{\frac{c_1}{c_2}}##.
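A minimal numerical check of this amplitude-phase identity, with made-up constants. (Note that the two-argument arctangent handles the quadrant correctly when ##c_2<0##, where a plain ##\tan^{-1}## would not.)

```python
import math

# Made-up constants; the identity holds for any c1, c2, k.
c1, c2, k = 1.5, -0.7, 3.0
A = math.hypot(c1, c2)    # A = sqrt(c1^2 + c2^2)
phi = math.atan2(c1, c2)  # quadrant-aware arctangent of c1/c2

for x in (0.0, 0.4, 2.0):
    lhs = c1 * math.sin(k * x) + c2 * math.cos(k * x)
    rhs = A * math.cos(k * x - phi)
    assert abs(lhs - rhs) < 1e-12
```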

This solution is the real part of a complex solution

$$y(x)=\Re(z(x))=\Re[e^{-ax/2}A(\cos{(kx-\phi)}+i\sin{(kx-\phi)})]$$

$$=\Re[e^{-ax/2}Ae^{i(kx-\phi)}]$$

$$=\Re[Ae^{-i\phi}e^{\frac{-a+i\sqrt{-\Delta}}{2}x}]$$

By a similar derivation we can also show that

$$y(x)=\Re[Ae^{i\phi}e^{\frac{-a-i\sqrt{-\Delta}}{2}x}]$$

It seems that any linear combination of these two complex solutions is also a complex solution and, furthermore, the real part is a solution to the original real equation.
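As a sanity check on this claim, with hypothetical coefficients ##a=2##, ##b=10## (so ##\Delta=-36## and ##k=3##), the root ##r_+=\frac{-a+i\sqrt{-\Delta}}{2}## really does satisfy the characteristic polynomial, and the real part of the complex exponential form matches the real sinusoidal solution:

```python
import cmath
import math

# Hypothetical coefficients: a = 2, b = 10, so Delta = -36 and k = 3.
a, b = 2.0, 10.0
Delta = a * a - 4 * b
k = math.sqrt(-Delta) / 2
A, phi = 1.3, 0.8  # arbitrary amplitude and phase

# Root with the + sign: r_plus = (-a + i*sqrt(-Delta)) / 2.
r_plus = (-a + 1j * math.sqrt(-Delta)) / 2
assert abs(r_plus ** 2 + a * r_plus + b) < 1e-12  # root of r^2 + a r + b

for x in (0.0, 0.5, 1.7):
    real_form = math.exp(-a * x / 2) * A * math.cos(k * x - phi)
    complex_form = (A * cmath.exp(-1j * phi) * cmath.exp(r_plus * x)).real
    assert abs(real_form - complex_form) < 1e-12
```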

But I don't know what to conclude from this.

An alternative way is to do the following

$$y(x)=\Re(z(x))=\Re[e^{-ax/2}A(\cos{(kx-\phi)}+i\sin{(kx-\phi)})]$$

$$y(x)=\Re[e^{-ax/2}(A\cos{\phi}\cos{kx}+A\sin{\phi}\sin{kx}+i(A\sin{kx}\cos{\phi}-A\cos{kx}\sin{\phi}))]$$

where we used

$$A\sin{(kx-\phi)}=A(\sin{kx}\cos{\phi}-\cos{kx}\sin{\phi})$$

$$=(A\cos{\phi})\sin{kx}-(A\sin{\phi})\cos{kx}$$

Then

$$y(x)=\Re[e^{-ax/2}(A\cos{\phi}(\cos{kx}+i\sin{kx})+A\sin{\phi}(\sin{kx}-i\cos{kx}))]$$

$$=\Re[e^{-ax/2}(A\cos{\phi}e^{ikx}-iA\sin{\phi}e^{ikx})]$$

$$=\Re[A(\cos{\phi}-i\sin{\phi})e^{-ax/2}e^{ikx}]=\Re[Ae^{-i\phi}e^{\frac{-a+i\sqrt{-\Delta}}{2}x}]$$
 
Good point. The solutions for ##\Delta \ge 0## are not oscillating. Your solutions for the case of ##\Delta \lt 0## work for the equations with oscillating solutions. There is usually some cyclical trade-off between two physical (or mathematical) quantities which are represented by the real and imaginary parts of your solution. The oscillations can be stable (damped) or unstable.
 
If ##y'' - 2py' + qy = 0## has real coefficients and ##z## is a complex-valued solution, then it is easy to check that the complex conjugate ##\bar{z}## is also a solution. Since the ODE is linear, it follows that both ##(z + \bar{z})/2 = \Re(z)## and ##(z - \bar{z})/(2i) = \Im(z)## are linear combinations of solutions and therefore also solutions, which happen to be real-valued.
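This argument is easy to check numerically. A small sketch with made-up coefficients ##p=1##, ##q=5## (characteristic roots ##1\pm 2i##), confirming by finite differences that both ##\Re(z)## and ##\Im(z)## satisfy the ODE:

```python
import cmath

# Made-up real coefficients for y'' - 2 p y' + q y = 0: p = 1, q = 5,
# whose characteristic roots are 1 +/- 2i.
p, q = 1.0, 5.0
r = complex(p, (q - p * p) ** 0.5)  # one root of r^2 - 2 p r + q = 0

def z(x):
    return cmath.exp(r * x)  # complex-valued solution

# Both Re(z) and Im(z) should satisfy the ODE; check by finite differences.
h = 1e-4
for part in (lambda t: z(t).real, lambda t: z(t).imag):
    for x in (0.0, 0.6):
        fp = (part(x + h) - part(x - h)) / (2 * h)
        fpp = (part(x + h) - 2 * part(x) + part(x - h)) / h ** 2
        assert abs(fpp - 2 * p * fp + q * part(x)) < 1e-3
```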

We can always write $$Ae^{r_1x} + Be^{r_2x} = \exp\left(\frac{r_1 + r_2}{2}x\right)\left(C\cosh\left(\frac{r_1 - r_2}{2}x\right) + D\sinh\left(\frac{r_1 - r_2}{2}x\right)\right)$$ where ##C = A + B## and ##D = A - B##. The second formulation is sometimes preferable, since it makes it trivial to apply initial conditions on ##y## and ##y'## at ##x = 0##, and to apply an initial condition at ##x = x_0## we can simply replace ##x## with ##x - x_0## throughout.

Since the coefficients of the ODE are real, ##r_1## and ##r_2## are either both real or a complex conjugate pair, and in the latter case we have $$e^{\Re(r)x}(C\cosh(i\Im(r)x) + D\sinh(i\Im(r)x)) = e^{\Re(r)x}(C\cos(\Im(r)x) + iD\sin(\Im(r)x))$$ and the result will be real-valued provided ##C## and ##iD## are real. This requires that ##B = \bar{A}##.
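A brief numerical check of the cosh/sinh rewrite and of the reality condition ##B = \bar{A}##, with hypothetical conjugate roots ##-1\pm 3i## and an arbitrary coefficient ##A##:

```python
import cmath

# Hypothetical complex-conjugate roots and coefficient A; B = conj(A)
# is what makes the combined solution real-valued.
r1, r2 = complex(-1, 3), complex(-1, -3)
A = complex(0.4, -0.9)
B = A.conjugate()
C, D = A + B, A - B  # C is real; D is purely imaginary, so i*D is real

for x in (0.0, 0.7, 1.9):
    lhs = A * cmath.exp(r1 * x) + B * cmath.exp(r2 * x)
    s = (r1 - r2) / 2 * x
    rhs = cmath.exp((r1 + r2) / 2 * x) * (C * cmath.cosh(s) + D * cmath.sinh(s))
    assert abs(lhs - rhs) < 1e-9
    assert abs(lhs.imag) < 1e-9  # real-valued, as claimed
```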
 