# Why this condition in ODE?

1. Jan 3, 2006

### twoflower

Hi,

I just started playing with higher-order ODEs and I'm stuck on one particular step. Here it is:

$$y^{''} + y = \frac{1}{\cos x}$$

Step 1: I find the fundamental system of solutions, which in this case is

$$[\cos x, \sin x]$$

So the general solution of the homogeneous equation looks like this:

$$y(x) = \alpha\cos x + \beta \sin x$$

Using the method of variation of parameters, $\alpha$ and $\beta$ become functions of x:

$$y(x) = \alpha(x)\cos x + \beta(x) \sin x$$

$$y'(x) = \alpha^{'}(x)\cos x - \alpha(x)\sin x + \beta^{'}(x) \sin x + \beta(x) \cos x$$

Now I don't understand the condition

$$\alpha^{'}(x)\cos x + \beta^{'}(x) \sin x = 0$$

Why does it have to be so?

Thanks for explanation!

2. Jan 3, 2006

### saltydog

Well you know it has to be something right? I mean:

$$\alpha'(x)\cos x+\beta'(x)\sin x=g(x)$$

Tell you what though, let's just make g(x) equal to the zero function and see what happens. No harm in that, right? We're not talking asteroids or anything. If we do, the second derivative is much easier to compute, all the subsequent arithmetic is valid, and we end up with a valid answer.

Works for me.

3. Jan 3, 2006

### twoflower

I still do not quite understand... What I thought is that we're looking for ONE particular solution, no matter which one of the infinitely many, so we, FOR EXAMPLE, make the sum of these derivatives equal to the zero function.

Is that it?

Anyway, I can't see whether it is really a correct step. I mean, we don't know if this sum really can be zero... you know what I mean.

4. Jan 3, 2006

### HallsofIvy

Staff Emeritus
Think how general this method is: any pair of functions can combine to give any other function this way. If the two given functions are $\sin x$ and $\cos x$ and you want to get $e^x$, just write

$$\frac{e^x}{2\sin x}\sin x+ \frac{e^x}{2\cos x}\cos x.$$

There are an infinite number of pairs of functions that will give you a solution. You are just "limiting the search" by requiring that

$$\alpha'(x)\cos x+ \beta'(x)\sin x= 0.$$

You use that particular requirement because that way, when you differentiate again, you wind up with first order equations for $\alpha(x)$ and $\beta(x)$.
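Carried through for the equation at the top of the thread (this is standard variation-of-parameters algebra, worked out here rather than quoted from the posts), the condition together with the differentiated equation gives the system

$$\alpha'(x)\cos x + \beta'(x)\sin x = 0, \qquad -\alpha'(x)\sin x + \beta'(x)\cos x = \frac{1}{\cos x},$$

so $\alpha'(x) = -\tan x$ and $\beta'(x) = 1$, hence $\alpha(x) = \ln|\cos x|$ and $\beta(x) = x$, and a particular solution is

$$y_p(x) = \cos x\,\ln|\cos x| + x\sin x.$$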

Here's an exercise: suppose you had a third order equation whose homogeneous equation has solutions $y_1(x), y_2(x), y_3(x)$, and we seek a solution of the form

$$y(x)= \alpha(x)y_1(x)+\beta(x)y_2(x)+\gamma(x)y_3(x).$$

Then

$$y'(x)= \alpha'(x)y_1(x)+\beta'(x)y_2(x)+\gamma'(x)y_3(x)+ \alpha(x)y_1'(x)+\beta(x)y_2'(x)+\gamma(x)y_3'(x).$$

If we set

$$\alpha'(x)y_1(x)+\beta'(x)y_2(x)+\gamma'(x)y_3(x)= 0,$$

we are left with

$$y'(x)= \alpha(x)y_1'(x)+\beta(x)y_2'(x)+\gamma(x)y_3'(x).$$

Differentiating again,

$$y''(x)= \alpha'(x)y_1'(x)+\beta'(x)y_2'(x)+\gamma'(x)y_3'(x)+ \alpha(x)y_1''(x)+\beta(x)y_2''(x)+\gamma(x)y_3''(x).$$

What condition do we impose so that when we differentiate again (to get $y_1'''$, etc.) we still have only first derivatives of $\alpha(x)$, etc.?

5. Jan 3, 2006

### twoflower

I see it, the condition is

$$\alpha'(x)y_1'(x)+\beta'(x)y_2'(x)+\gamma'(x)y_3'(x) = 0$$
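For completeness (my own addition, assuming the third order equation is written with leading coefficient 1 and forcing term $g(x)$, which the thread never spells out): combining the two conditions with the original equation yields the linear system

$$\begin{pmatrix} y_1 & y_2 & y_3 \\ y_1' & y_2' & y_3' \\ y_1'' & y_2'' & y_3'' \end{pmatrix} \begin{pmatrix} \alpha' \\ \beta' \\ \gamma' \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ g(x) \end{pmatrix},$$

whose coefficient matrix is the Wronskian matrix of $y_1, y_2, y_3$, so it is invertible whenever the $y_i$ form a fundamental system.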

Thank you HallsofIvy!

Last edited: Jan 3, 2006