How to solve this nonlinear differential equation numerically?

In summary, the boundary conditions are f(x=0) = 0 and f(x=\infty) = 1, so the problem is naturally handled numerically by a shooting method on the initial slope f'(0). The coefficients of the equation diverge at the origin, but for f behaving like Kx near x = 0 the apparently singular terms cancel, so a Taylor expansion around x = 0 can be used to start the integration from some x_0 << 1 and integrate outward.
  • #1
wdlang
-f''-(2/x)f'+(2/x^2)f+f^3-f=0

The boundary conditions are f(x=0) = 0 and f(x=\infty) = 1.

How can f be solved numerically?
 
  • #2
This is similar to the Riccati equation. I hope one of the methods for tackling the Riccati equation is of some use.
 
  • #3
Since you have two boundary conditions at two different points, this is usually solved by the shooting method: you assume a value of f'(0), integrate outward to see whether the solution satisfies the boundary condition at infinity, and then adjust the assumed value until it does. However, I tried this briefly using Mathematica's NDSolve function:

NDSolve[{-f''[x] - (2/x) f'[x] + (2/x^2) f[x] + f[x]^3 - f[x] == 0,
  f[.00001] == 0, f'[.00001] == 1.0}, f[x], {x, .00001, 20}]

and I got oscillatory solutions that tended to zero as x -> infinity (see attachment). Perhaps there is a particular value of f'(0) that will give the solution you want?
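For what it's worth, here is a minimal sketch of that shooting procedure, wrapping essentially the same NDSolve call in a root search over the unknown initial slope. The start point x0 = 10^-5, the cutoff xmax = 20 standing in for infinity, and the FindRoot starting guesses are illustrative choices, not values established in this thread:

x0 = 10^-5; xmax = 20;  (* assumed start point and stand-in for infinity *)

(* Mismatch between the computed f at xmax and the target boundary value f = 1,
   as a function of the trial initial slope k = f'(x0). *)
shoot[k_?NumericQ] := Module[{sol},
  sol = First@NDSolve[
     {-f''[x] - (2/x) f'[x] + (2/x^2) f[x] + f[x]^3 - f[x] == 0,
      f[x0] == 0, f'[x0] == k},
     f, {x, x0, xmax}];
  (f[xmax] /. sol) - 1]

(* Derivative-free root search for the slope that removes the mismatch; the trial
   solution can blow up for a bad k, so the starting guesses matter. *)
kstar = k /. FindRoot[shoot[k], {k, 1.0, 1.5}]

Once a value kstar is found, rerunning NDSolve with f'[x0] == kstar gives the candidate profile that approaches 1 at large x.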
 

Attachments

  • Graph.pdf (37.1 KB)
  • #4
phyzguy said:
Since you have two boundary conditions at two different points, this is usually solved by the shooting method: you assume a value of f'(0), integrate outward to see whether the solution satisfies the boundary condition at infinity, and then adjust the assumed value until it does. However, I tried this briefly using Mathematica's NDSolve function:

NDSolve[{-f''[x] - (2/x) f'[x] + (2/x^2) f[x] + f[x]^3 - f[x] == 0,
  f[.00001] == 0, f'[.00001] == 1.0}, f[x], {x, .00001, 20}]

and I got oscillatory solutions that tended to zero as x -> infinity (see attachment). Perhaps there is a particular value of f'(0) that will give the solution you want?

Thanks a lot.

But I am still concerned about the singularity at the origin: the coefficients diverge there, and I do not know how to deal with it.
 
  • #5
The derivative f' at the origin has to be scanned, but I am concerned with how to integrate the equation accurately for a given value of f'.
 
  • #6
wdlang said:
Thanks a lot.

But I am still concerned about the singularity at the origin: the coefficients diverge there, and I do not know how to deal with it.

There is no singularity at the origin. Assume as x->0 that f looks like f = K x. Then f->0, f^3 ->0, f''->0, and f'->K, and the equation reduces to: -2K/x+2K/x=0. So everything is well behaved. In essence, the two "singular" terms cancel. You can then start your integration at some small distance away from the origin (I chose x=1E-5, but it won't really matter as long as you start where the other terms are negligible).
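Spelling that cancellation out: with f = Kx exactly,
[tex]-f''-\frac{2}{x}f'+\frac{2}{x^{2}}f+f^{3}-f=0-\frac{2K}{x}+\frac{2K}{x}+K^{3}x^{3}-Kx=K^{3}x^{3}-Kx,[/tex]
so the two 1/x terms cancel identically and what remains vanishes as x -> 0.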
 
  • #7
You can simplify things by setting [tex]f(x)=u(x)/x[/tex]. Then you have:
[tex]u''+(1-2/x^{2})u-u^{3}/x^{2}=0[/tex]
This equation has two independent solutions as [tex]x\rightarrow \infty[/tex], namely:
[tex]u\rightarrow C_{1}\sin(x)+C_{2}\cos(x)[/tex]
You can integrate the equation backwards. I am not so sure that there is no singularity. If you keep all the terms except [tex]u^{3}/x^{2}[/tex], you still obtain a solution:
[tex]u(x)\equiv C_{1}(\sin(x)-\cos(x)/x)+C_{2}(\cos(x)-\sin(x)/x)[/tex]
which should be valid until the neglected term becomes noticeable. I see no way for any solution to be bounded at x=0, unless the term with [tex]u^3[/tex] is somehow dominant.
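For reference, the substitution can be checked directly: with f(x) = u(x)/x,
[tex]f'=\frac{u'}{x}-\frac{u}{x^{2}},\qquad f''=\frac{u''}{x}-\frac{2u'}{x^{2}}+\frac{2u}{x^{3}},[/tex]
and substituting into the original equation gives
[tex]-f''-\frac{2}{x}f'+\frac{2}{x^{2}}f+f^{3}-f=-\frac{1}{x}\left[u''+\left(1-\frac{2}{x^{2}}\right)u-\frac{u^{3}}{x^{2}}\right]=0,[/tex]
which is the u-equation quoted above.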
 
  • #8
gato_ said:
You can simplify things by setting [tex]f(x)=u(x)/x[/tex]. Then you have:
[tex]u''+(1-2/x^{2})u-u^{3}/x^{2}=0[/tex]
This equation has two independent solutions as [tex]x\rightarrow \infty[/tex], namely:
[tex]u\rightarrow C_{1}\sin(x)+C_{2}\cos(x)[/tex]
You can integrate the equation backwards. I am not so sure that there is no singularity. If you keep all the terms except [tex]u^{3}/x^{2}[/tex], you still obtain a solution:
[tex]u(x)\equiv C_{1}(\sin(x)-\cos(x)/x)+C_{2}(\cos(x)-\sin(x)/x)[/tex]
which should be valid until the neglected term becomes noticeable. I see no way for any solution to be bounded at x=0, unless the term with [tex]u^3[/tex] is somehow dominant.

I guess the substitution f = xu may be better?
 
  • #9
The change I suggest is standard. It removes the first-derivative term, allowing the decay and the oscillation to be studied separately.
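The underlying identity is the standard reduction of a second-order equation to normal form: for [tex]y''+p(x)y'+\dots=0[/tex], the substitution [tex]y=u\,e^{-\frac{1}{2}\int p\,dx}[/tex] removes the first-derivative term. Here [tex]p(x)=2/x[/tex], so [tex]e^{-\frac{1}{2}\int(2/x)\,dx}=e^{-\ln x}=1/x[/tex], which is exactly the change of variables f = u/x used above; the nonlinear term is simply carried along as [tex]u^{3}/x^{2}[/tex]. This reduction is described in most ODE textbooks.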
 
  • #10
gato_ said:
The change I suggest is standard. It removes the first-derivative term, allowing the decay and the oscillation to be studied separately.

Thanks a lot.

Do you have any reference?
 
  • #11
gato_ said:
The change I suggest is standard. It removes the first-derivative term, allowing the decay and the oscillation to be studied separately.

How about guessing a Taylor expansion of f around x = 0 and integrating outward from some x_0 << 1?

The boundary condition at x = 0 indicates that

f(x) = a x + b x^2 + c x^3 + d x^4 + ...

and we can determine the relations between a, b, c, d from the differential equation.
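Carrying that expansion through the equation order by order, the 1/x terms cancel for any a, and the low-order coefficients are forced to be
[tex]b=0,\qquad c=-\frac{a}{10},\qquad d=0,[/tex]
so near the origin
[tex]f(x)=a\,x-\frac{a}{10}\,x^{3}+O(x^{5}),[/tex]
with the slope a left free. That free slope is then the single parameter to scan in the shooting method.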
 

1. What is a nonlinear differential equation?

A nonlinear differential equation is an equation that involves the derivatives of a function and the function itself in a nonlinear way. This means that the unknown function or its derivatives appear in products, powers, or other nonlinear combinations (such as the f^3 term in the equation above), making it more difficult to solve than a linear differential equation.

2. Why is it important to solve nonlinear differential equations numerically?

Nonlinear differential equations often have complex and unpredictable solutions, making it difficult to find analytical solutions. Therefore, numerical methods are necessary to approximate the solutions and provide insights into the behavior of the system being modeled.

3. What are some commonly used numerical methods for solving nonlinear differential equations?

Some commonly used numerical methods for solving nonlinear differential equations include Euler's method, Runge-Kutta methods, and finite difference methods. These methods involve approximating the solution at discrete points and using iterative calculations to improve the accuracy of the solution.
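For illustration (a generic sketch, not tied to the equation in the thread above), here is a single classical fourth-order Runge-Kutta step written out in Mathematica and applied to the toy problem y' = -y, whose exact solution is Exp[-t]:

(* One classical RK4 step of size h for y' = F[t, y]; returns the advanced {t, y}. *)
rk4Step[F_, {t0_, y0_}, h_] := Module[{k1, k2, k3, k4},
  k1 = F[t0, y0];
  k2 = F[t0 + h/2, y0 + h k1/2];
  k3 = F[t0 + h/2, y0 + h k2/2];
  k4 = F[t0 + h, y0 + h k3];
  {t0 + h, y0 + h (k1 + 2 k2 + 2 k3 + k4)/6}]

(* Ten steps of size 0.1 for y' = -y with y(0) = 1. *)
NestList[rk4Step[Function[{t, y}, -y], #, 0.1] &, {0., 1.}, 10]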

4. How do I know if my numerical solution is accurate?

The accuracy of a numerical solution for a nonlinear differential equation can be evaluated by comparing it to known analytical solutions (if available) or by using convergence tests. These tests involve comparing the solution obtained using smaller and smaller step sizes to see if it converges to a specific value.
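Reusing the rk4Step helper sketched above, one simple convergence check on the same toy problem is to halve the step size and compare the endpoints against each other (and, here, against the known value Exp[-1]):

(* Integrate y' = -y from t = 0 to t = 1 with fixed step h and return y(1). *)
endpoint[h_] := Last@Nest[rk4Step[Function[{t, y}, -y], #, h] &, {0., 1.}, Round[1/h]]

{endpoint[0.1], endpoint[0.05], Exp[-1.]}
(* For a fourth-order method, halving h should shrink the error by roughly 2^4;
   agreement to more and more digits signals convergence. *)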

5. Are there any limitations to solving nonlinear differential equations numerically?

Yes, there are limitations to solving nonlinear differential equations numerically. These include the sensitivity of the solution to initial conditions and step size, as well as the possibility of encountering numerical instability. It is important to carefully choose the appropriate numerical method and parameters to ensure a reliable and accurate solution.
