Condition for ODE Solutions with Variation of Parameters

  • Thread starter: twoflower
  • Tags: Condition, ODE

Homework Help Overview

The discussion revolves around the method of variation of parameters applied to higher order ordinary differential equations (ODEs). The original poster presents a specific ODE and expresses confusion about the auxiliary condition imposed on the derivatives of the varied coefficients during the solution process.

Discussion Character

  • Exploratory, assumption-checking, conceptual clarification

Approaches and Questions Raised

  • Participants explore the necessity of the condition that a certain combination of the derivatives of the unknown coefficients must equal zero. There are attempts to understand the implications of imposing this condition and its role in simplifying the solution process.

Discussion Status

Participants are actively engaging with the problem, questioning the assumptions behind the condition and discussing its significance in the context of finding particular solutions. Some guidance has been offered regarding the generality of the method and the implications of the condition on the derivatives.

Contextual Notes

There is an ongoing exploration of the implications of making certain assumptions, such as setting the function g(x) to zero, and whether this is valid within the context of the problem. The discussion reflects uncertainty about the correctness of these assumptions and their impact on the solution.

twoflower
Hi,

I just started playing with higher order ODEs and I'm stuck on one particular step. Here it is:

[tex] y^{''} + y = \frac{1}{\cos x}[/tex]

Step 1: I find the fundamental solution system, which in this case is

[tex] [\cos x, \sin x][/tex]

So the general solution looks like this:

[tex] y(x) = \alpha\cos x + \beta \sin x[/tex]

Using the method of variation of parameters, [itex]\alpha[/itex] and [itex]\beta[/itex] become functions of x:

[tex] y(x) = \alpha(x)\cos x + \beta(x) \sin x[/tex]

[tex] y'(x) = \alpha^{'}(x)\cos x - \alpha(x)\sin x + \beta^{'}(x) \sin x + \beta(x) \cos x[/tex]

Now I don't understand the condition

[tex] \alpha^{'}(x)\cos x + \beta^{'}(x) \sin x = 0[/tex]

Why does it have to be so?

Thanks for the explanation!
 
saltydog
Well, you know it has to be something, right? I mean:

[tex]\alpha'(x)\cos x+\beta'(x)\sin x=g(x)[/tex]

Tell you what, though: let's just make g(x) the zero function and see what happens. No harm in that, right? We're not talking asteroids or anything. If we do, the math is much easier when the second derivative is calculated, all the subsequent arithmetic is valid, and we end up with a valid answer.

Works for me.
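
Concretely, here is a sketch of where the choice g(x) = 0 leads for this particular equation. With the condition imposed, the first derivative reduces to

[tex]y'(x) = -\alpha(x)\sin x + \beta(x)\cos x[/tex]

so

[tex]y''(x) = -\alpha'(x)\sin x + \beta'(x)\cos x - \alpha(x)\cos x - \beta(x)\sin x[/tex]

Substituting into [itex]y'' + y = 1/\cos x[/itex], the terms in [itex]\alpha(x)[/itex] and [itex]\beta(x)[/itex] cancel, leaving the pair

[tex]\alpha'(x)\cos x + \beta'(x)\sin x = 0, \qquad -\alpha'(x)\sin x + \beta'(x)\cos x = \frac{1}{\cos x}[/tex]

The determinant of this system is [itex]\cos^2 x + \sin^2 x = 1[/itex], so

[tex]\alpha'(x) = -\tan x, \qquad \beta'(x) = 1[/tex]

and integrating gives [itex]\alpha(x) = \ln|\cos x|[/itex], [itex]\beta(x) = x[/itex], hence the particular solution

[tex]y_p(x) = \cos x \, \ln|\cos x| + x\sin x[/tex]

A quick symbolic check of this result (a minimal sketch using sympy; the variable names are illustrative only):

[code]
import sympy as sp

x = sp.symbols('x')

# Particular solution obtained above by variation of parameters
y_p = sp.cos(x) * sp.log(sp.cos(x)) + x * sp.sin(x)

# The residual y_p'' + y_p - 1/cos(x) should simplify to zero
residual = sp.simplify(sp.diff(y_p, x, 2) + y_p - 1 / sp.cos(x))
print(residual)  # prints 0
[/code]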
 
saltydog said:
Well, you know it has to be something, right? I mean:
[tex]\alpha'(x)\cos x+\beta'(x)\sin x=g(x)[/tex]
Tell you what, though: let's just make g(x) the zero function and see what happens. No harm in that, right? We're not talking asteroids or anything. If we do, the math is much easier when the second derivative is calculated, all the subsequent arithmetic is valid, and we end up with a valid answer.
Works for me.

I still do not quite understand... What I thought is that we're looking for ONE particular solution, no matter which one of the infinitely many, so we, FOR EXAMPLE, make the sum of these derivatives equal to the zero function.

Is that it?

Anyway, I can't see whether it is really a correct step. I mean, we don't know if this sum really can be zero... you know what I mean.
 
HallsofIvy
Think how general this method is: any pair of functions can give any other function this way. If the two given functions are sin(x) and cos(x) and you want to get e^x, just write

[tex]\frac{e^x}{2\sin x}\sin x+ \frac{e^x}{2\cos x}\cos x[/tex]

There are an infinite number of pairs of functions that will give you a solution. You are just "limiting the search" by requiring that [tex]\alpha'(x)\cos x+ \beta'(x)\sin x= 0[/tex]. You use that particular requirement because that way, when you differentiate again, you wind up with first order equations for [itex]\alpha(x)[/itex] and [itex]\beta(x)[/itex].

Here's an exercise: suppose you had a third order equation whose homogeneous equation has solutions [itex]y_1(x), y_2(x), y_3(x)[/itex], and we seek a solution of the form

[tex]y(x)= \alpha(x)y_1(x)+\beta(x)y_2(x)+\gamma(x)y_3(x)[/tex]

Then

[tex]y'(x)= \alpha'(x)y_1(x)+\beta'(x)y_2(x)+\gamma'(x)y_3(x)+ \alpha(x)y_1'(x)+\beta(x)y_2'(x)+\gamma(x)y_3'(x)[/tex]

If we set

[tex]\alpha'(x)y_1(x)+\beta'(x)y_2(x)+\gamma'(x)y_3(x)= 0[/tex]

we are left with

[tex]y'(x)= \alpha(x)y_1'(x)+\beta(x)y_2'(x)+\gamma(x)y_3'(x)[/tex]

Differentiating again,

[tex]y''(x)= \alpha'(x)y_1'(x)+\beta'(x)y_2'(x)+\gamma'(x)y_3'(x)+ \alpha(x)y_1''(x)+\beta(x)y_2''(x)+\gamma(x)y_3''(x)[/tex]

What condition do we impose so that when we differentiate again (to get [itex]y'''[/itex]) we still have only first derivatives of [itex]\alpha(x)[/itex], etc.?
 
HallsofIvy said:
What condition do we impose so that when we differentiate again (to get [itex]y'''[/itex]) we still have only first derivatives of [itex]\alpha(x)[/itex], etc.?

I see it; the condition is

[tex]\alpha'(x)y_1'(x)+\beta'(x)y_2'(x)+\gamma'(x)y_3'(x) = 0[/tex]

Thank you, HallsofIvy!
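
For completeness, the two imposed conditions in the third order case can be collected with the ODE itself into one linear system. A sketch, assuming the equation is written with leading coefficient 1, say [itex]y''' + a(x)y'' + b(x)y' + c(x)y = f(x)[/itex]: on substituting the expressions above into the equation, every term involving an undifferentiated [itex]\alpha, \beta, \gamma[/itex] cancels, because [itex]y_1, y_2, y_3[/itex] solve the homogeneous equation, and what remains is

[tex]\begin{pmatrix} y_1 & y_2 & y_3 \\ y_1' & y_2' & y_3' \\ y_1'' & y_2'' & y_3'' \end{pmatrix} \begin{pmatrix} \alpha' \\ \beta' \\ \gamma' \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ f(x) \end{pmatrix}[/tex]

The coefficient matrix is the Wronskian matrix, and its determinant is nonzero precisely because [itex]y_1, y_2, y_3[/itex] form a fundamental system. So the system can always be solved for [itex]\alpha', \beta', \gamma'[/itex]; functions satisfying the imposed conditions really do exist, which also settles the earlier worry about whether the sum "really can be zero".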
 