Condition for ODE Solutions with Variation of Parameters

twoflower
Hi,

I just started playing with higher-order ODEs and I'm stuck on one particular step. Here it is:

y'' + y = \frac{1}{\cos x}

Step 1: I find the fundamental solution system, which in this case is

[\cos x, \sin x]

So the general solution looks like this:

y(x) = \alpha\cos x + \beta\sin x

Using the method of variation of parameters, \alpha and \beta become functions of x:

<br /> y(x) = \alpha(x)\cos x + \beta(x) \sin x<br />

<br /> y&#039;(x) = \alpha^{&#039;}(x)\cos x - \alpha(x)\sin x + \beta^{&#039;}(x) \sin x + \beta(x) \cos x<br />

Now I don't understand the condition

<br /> \alpha^{&#039;}(x)\cos x + \beta^{&#039;}(x) \sin x = 0<br />

Why does it have to be so?

Thanks for the explanation!
 
saltydog
Well, you know it has to be something, right? I mean:

\alpha^{&#039;}(x)Cos(x)+\beta^{&#039;}(x)Sin(x)=g(x)

Tell you what though, let's just make g(x) equal to the zero function and see what happens. No harm in that, right? I mean, we're not talking asteroids or anything. If we do, the math is much easier when the second derivative is calculated, all the subsequent arithmetic is valid, and we end up with a valid answer.

Works for me.
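
To spell out why setting g(x) = 0 makes the subsequent arithmetic easy, here is a minimal sketch for the equation in question. With

\alpha'(x)\cos x + \beta'(x)\sin x = 0

the first derivative collapses to

y'(x) = -\alpha(x)\sin x + \beta(x)\cos x

and differentiating once more gives

y''(x) = -\alpha'(x)\sin x + \beta'(x)\cos x - \alpha(x)\cos x - \beta(x)\sin x

Substituting into y'' + y = \frac{1}{\cos x}, the \alpha(x) and \beta(x) terms cancel, leaving

-\alpha'(x)\sin x + \beta'(x)\cos x = \frac{1}{\cos x}

Together with the imposed condition, this is a linear system for \alpha'(x) and \beta'(x) whose determinant is the Wronskian \cos^2 x + \sin^2 x = 1, so it can always be solved, and no second derivatives of \alpha or \beta ever appear.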
 
saltydog said:
Well, you know it has to be something, right? I mean:
\alpha'(x)\cos x + \beta'(x)\sin x = g(x)
Tell you what though, let's just make g(x) equal to the zero function and see what happens. No harm in that, right? I mean, we're not talking asteroids or anything. If we do, the math is much easier when the second derivative is calculated, all the subsequent arithmetic is valid, and we end up with a valid answer.
Works for me.

I still do not quite understand... What I thought is that we're looking for ONE particular solution, no matter which one of the infinite number of them, so we, FOR EXAMPLE, make the sum of these derivatives equal to the zero function.

Is that it?

Anyway, I can't see whether it is really a correct step. I mean, we don't know if this sum really can be zero... you know what I mean.
 
HallsofIvy
Think how general this method is: any pair of functions can give any other function this way. If the two given functions are \sin x and \cos x and you want to get e^x, just write

\frac{e^x}{2\sin x}\sin x + \frac{e^x}{2\cos x}\cos x

There are an infinite number of functions that will give you a solution. You are just "limiting the search" by requiring that \alpha'(x)\cos x + \beta'(x)\sin x = 0. You use that particular requirement because that way, when you differentiate again, you wind up with a first-order equation for \alpha(x) and \beta(x).

Here's an exercise: suppose you were given a third-order equation whose homogeneous equation had solutions y_1(x), y_2(x), y_3(x), and we seek a solution of the form

y(x) = \alpha(x)y_1(x) + \beta(x)y_2(x) + \gamma(x)y_3(x)

Then

y'(x) = \alpha'(x)y_1(x) + \beta'(x)y_2(x) + \gamma'(x)y_3(x) + \alpha(x)y_1'(x) + \beta(x)y_2'(x) + \gamma(x)y_3'(x)

If we set

\alpha'(x)y_1(x) + \beta'(x)y_2(x) + \gamma'(x)y_3(x) = 0

we are left with

y'(x) = \alpha(x)y_1'(x) + \beta(x)y_2'(x) + \gamma(x)y_3'(x)

Differentiating again,

y''(x) = \alpha'(x)y_1'(x) + \beta'(x)y_2'(x) + \gamma'(x)y_3'(x) + \alpha(x)y_1''(x) + \beta(x)y_2''(x) + \gamma(x)y_3''(x)

What condition do we impose so that when we differentiate again (to get y_1''', etc.) we still have only first derivatives of \alpha(x), etc.?
 
HallsofIvy said:
Think how general this method is: any pair of functions can give any other function this way. If the two given functions are \sin x and \cos x and you want to get e^x, just write
\frac{e^x}{2\sin x}\sin x + \frac{e^x}{2\cos x}\cos x. There are an infinite number of functions that will give you a solution. You are just "limiting the search" by requiring that \alpha'(x)\cos x + \beta'(x)\sin x = 0. You use that particular requirement because that way, when you differentiate again, you wind up with a first-order equation for \alpha(x) and \beta(x).
Here's an exercise: suppose you were given a third-order equation whose homogeneous equation had solutions

y_1(x), y_2(x), y_3(x)

and we seek a solution of the form

y(x) = \alpha(x)y_1(x) + \beta(x)y_2(x) + \gamma(x)y_3(x)

Then

y'(x) = \alpha'(x)y_1(x) + \beta'(x)y_2(x) + \gamma'(x)y_3(x) + \alpha(x)y_1'(x) + \beta(x)y_2'(x) + \gamma(x)y_3'(x)

If we set

\alpha'(x)y_1(x) + \beta'(x)y_2(x) + \gamma'(x)y_3(x) = 0

we are left with

y'(x) = \alpha(x)y_1'(x) + \beta(x)y_2'(x) + \gamma(x)y_3'(x)

Differentiating again,

y''(x) = \alpha'(x)y_1'(x) + \beta'(x)y_2'(x) + \gamma'(x)y_3'(x) + \alpha(x)y_1''(x) + \beta(x)y_2''(x) + \gamma(x)y_3''(x)

What condition do we impose so that when we differentiate again (to get y_1''', etc.) we still have only first derivatives of \alpha(x), etc.?

I see it, the condition is

<br /> \alpha(x)&#039;y_1&#039;(x)+\beta&#039;(x)y_2&#039;(x)+\gamma&#039;(x)y_3&#039;(x) = 0<br />

Thank you, HallsofIvy!
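
For completeness, a sketch of how these conditions finish the original problem. The two equations

\alpha'(x)\cos x + \beta'(x)\sin x = 0, \qquad -\alpha'(x)\sin x + \beta'(x)\cos x = \frac{1}{\cos x}

give \alpha'(x) = -\tan x and \beta'(x) = 1, so \alpha(x) = \ln|\cos x| and \beta(x) = x, and a particular solution is

y_p(x) = \cos x \ln|\cos x| + x\sin x

(In the third-order exercise the same pattern holds: after imposing the two conditions, substituting into the equation leaves \alpha'(x)y_1''(x) + \beta'(x)y_2''(x) + \gamma'(x)y_3''(x) = g(x), again a first-order linear system, assuming the equation is normalized so the leading coefficient is 1.)

A quick symbolic check of that particular solution, as a sketch using SymPy (assuming it is installed):

import sympy as sp

x = sp.symbols('x')
# Particular solution found above: alpha(x) = ln(cos x), beta(x) = x
y_p = sp.cos(x)*sp.log(sp.cos(x)) + x*sp.sin(x)
# The residual y_p'' + y_p - 1/cos(x) should simplify to 0
print(sp.simplify(sp.diff(y_p, x, 2) + y_p - 1/sp.cos(x)))  # prints 0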
 