# How to prove asymptotic properties

jostpuur
I'm interested in solutions of an equation

$$f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}$$

with some positive initial value $f(0)>0$, and with positive constants $A,B>0$.

First question: Does an explicit formula exist? I couldn't figure it out.

Second question:

I see that $f(x)>0\implies f'(x)<0$, and on the other hand the constant $f=0$ is a solution for all $x>0$. So clearly $0< f(x)<f(0)$ will hold for all $0<x<\infty$. Therefore for large $x$ we have $Af(x)+Bx^2\approx Bx^2$ and

$$f'(x)\approx -\frac{f(x)}{Bx}$$

which implies that in some sense

$$f(x) \sim x^{-\frac{1}{B}}$$

will probably hold. The second question is: how do you prove something rigorous from this approximation? The approximations

$$f(x) = O\big(x^{-\frac{1}{B}}\big)$$

and

$$f(x)= Cx^{-\frac{1}{B}}+ O\big(x^{-\frac{1}{B}-1}\big)$$

probably hold, but how do you prove them?
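Before attempting a proof, a quick numerical sanity check is possible. The following is a minimal pure-Python RK4 sketch (the values $A=B=f(0)=1$ are arbitrary choices for illustration): if $f(x)\sim Cx^{-1/B}$, then the product $f(x)\,x^{1/B}$ should level off at large $x$.

```python
# Minimal RK4 sanity check of the conjectured decay f(x) ~ C x^(-1/B).
# A, B and the initial value f(0) are arbitrary illustrative choices.
A, B, F0 = 1.0, 1.0, 1.0

def rhs(x, f):
    # f'(x) = -x f / (A f + B x^2)
    return -x * f / (A * f + B * x * x)

def rk4_step(x, f, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = rhs(x, f)
    k2 = rhs(x + h / 2, f + h * k1 / 2)
    k3 = rhs(x + h / 2, f + h * k2 / 2)
    k4 = rhs(x + h, f + h * k3)
    return f + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

h, x, f = 0.01, 0.0, F0
samples = {}
while x < 50.0 - h / 2:
    f = rk4_step(x, f, h)
    x += h
    for target in (40.0, 50.0):
        if abs(x - target) < h / 2:
            samples[target] = f

# If f(x) ~ C x^(-1/B), then f(x) * x**(1/B) should be nearly constant.
g40 = samples[40.0] * 40.0 ** (1.0 / B)
g50 = samples[50.0] * 50.0 ** (1.0 / B)
```

With these parameters the sampled products at $x=40$ and $x=50$ should agree closely, consistent with the conjecture; the constant $C$ will of course depend on $A$, $B$ and $f(0)$.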

Mandelbroth
I'm interested in solutions of an equation

$$f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}$$

with some positive initial value $f(0)>0$, and with positive constants $A,B>0$.
Gee, this looks familiar. I thought we'd save this for a rainy day! :tongue:

First question: Does an explicit formula exist? I couldn't figure it out.

Second question:

I see that $f(x)>0\implies f'(x)<0$, and on the other hand the constant $f=0$ is a solution for all $x>0$. So clearly $0< f(x)<f(0)$ will hold for all $0<x<\infty$. Therefore for large $x$ we have $Af(x)+Bx^2\approx Bx^2$ and

$$f'(x)\approx -\frac{f(x)}{Bx}$$

which implies that in some sense

$$f(x) \sim x^{-\frac{1}{B}}$$

will probably hold. The second question is: how do you prove something rigorous from this approximation? The approximations

$$f(x) = O\big(x^{-\frac{1}{B}}\big)$$

and

$$f(x)= Cx^{-\frac{1}{B}}+ O\big(x^{-\frac{1}{B}-1}\big)$$

probably hold, but how do you prove them?
How do you think we should start? You came to these conclusions, so you might know the path to get to them and just might not have realized it. Your intuition might be the best place to start with the rigor here.

jostpuur
Gee, this looks familiar. I thought we'd save this for a rainy day! :tongue:

Yes, it's no secret that this comes from my attempts at the raindrop-in-mist problem from the thread Solved Physics Challenge I: The Raindrop, solved by mfb and voko.

This ODE problem has become quite distant from the physics problem, so IMO a separate thread for it is fine. Also, techniques that solve this problem could turn out useful for other ODEs.

How do you think we should start? You came to these conclusions, so you might know the path to get to them and just might not have realized it. Your intuition might be the best place to start with the rigor here.

No! No! The most promising way to start is of course to recall the rigorous definitions of the asymptotic notations

$$f(x) = O\big(x^{-\frac{1}{B}}\big)$$

This means that there exist $C,R>0$ such that

$$x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}$$

Let's denote $g(x)=Cx^{-\frac{1}{B}}$. The most obvious way to prove the desired inequality is to prove $f(R)\leq g(R)$ and $f'(R)\leq g'(R)$ which would imply it. In other words I should try to prove

$$-\frac{xf(x)}{Af(x) + Bx^2} \leq -\frac{g(x)}{Bx}$$

for $x\geq R$. The inequality is equivalent to

$$Cx^{-\frac{1}{B}} \leq \frac{Bx^2 f(x)}{Af(x) + Bx^2}$$

How do you prove this? For large $x$ the right side is roughly $f(x)$, which contradicts the inequality we are trying to prove.

So I'm feeling a little confused now... This doesn't seem to be working.

Mandelbroth
$$Cx^{-\frac{1}{B}} \leq \frac{Bx^2 f(x)}{Af(x) + Bx^2}$$

How do you prove this? For large $x$ the right side is roughly $f(x)$, which contradicts the inequality we are trying to prove.

So I'm feeling a little confused now... This doesn't seem to be working.
I thought we were trying to prove ##f(x)\sim x^{-1/B}##...?

I'm honestly not seeing the logic in some of this. You say that ##f(x)=0## is a solution, but then say that ##0<f(x)##. I was hoping you'd shed some more light on your thinking behind your expressions, which may then show a way to prove them. I'm off to bed now, but hopefully I'll be able to work on this in the morning.

jostpuur
I thought we were trying to prove ##f(x)\sim x^{-1/B}##...?

I made it clear that I didn't mean anything specific by the "$\sim$" notation there.

... which implies that in some sense

$$f(x) \sim x^{-\frac{1}{B}}$$

will probably hold.

Starting with the easiest results, I'm now trying to prove this:

...there exist $C,R>0$ such that

$$x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}$$

How did you interpret "$\sim$"? Do you see it as something simpler?

Mandelbroth
How did you interpret "$\sim$"? Do you see it as something simpler?
Typically, in topology and related areas, ##\sim## denotes an arbitrary equivalence relation. I assumed it meant something like "asymptotically equal to."

Suppose we have $$\frac{df}{dx}=-\frac{xf(x)}{Af(x)+Bx^2}=\operatorname{RHS}(x,f(x)).$$ Observe that ##\left|\frac{\partial \operatorname{RHS}}{\partial f}\right|=\frac{Bx^3}{(Af+Bx^2)^2}##. What happens as x grows larger? Is there an interval of ##x## where ##\operatorname{RHS}## is continuous in ##x## and Lipschitz continuous in ##f##?
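For the record, these questions can be answered directly (my own computation, for solutions with $f\geq 0$): the partial derivative is largest at $f=0$, so

$$\left|\frac{\partial \operatorname{RHS}}{\partial f}\right| = \frac{Bx^3}{(Af+Bx^2)^2} \leq \frac{Bx^3}{(Bx^2)^2} = \frac{1}{Bx}.$$

Hence on any region $[x_0,\infty)\times[0,\infty)$ with $x_0>0$, $\operatorname{RHS}$ is continuous in $x$ and Lipschitz continuous in $f$ with constant $1/(Bx_0)$, which shrinks as $x_0$ grows; Picard–Lindelöf then gives existence and uniqueness there.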

jostpuur
The most obvious way to prove the desired inequality is to prove $f(R)\leq g(R)$ and $f'(R)\leq g'(R)$ which would imply it.

A little fix: I meant $f'(x)\leq g'(x)$ for $x\geq R$.

jackmell
I'm interested in solutions of an equation

$$f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}$$

with some positive initial value $f(0)>0$, and with positive constants $A,B>0$.

First question: Does an explicit formula exist? I couldn't figure it out.

Write it as:
$$y'=-\frac{xy}{ay+bx^2}$$
or:
$$(xy)dx+(ay+bx^2)dy=0$$
Now, can you find an integrating factor for that?

jostpuur
$$x\geq R\implies f(x)\leq Cx^{-\frac{1}{B}}$$

The way in which $f$ approaches zero seems to be very mysterious. For example, one might think it would be easier to prove

$$f(x)\leq Cx^{-\frac{1}{B}+\epsilon}$$

for large $x$, but the condition

$$f'(x)\leq C\Big(-\frac{1}{B}+\epsilon\Big)x^{-\frac{1}{B}-1+\epsilon}$$

eventually turns out to be equivalent to the claim that $f$ remains above some positive lower bound, which contradicts the convergence to zero.

jostpuur
I just realized I've been careless with these derivative ideas. The condition $f'\leq g'$ only makes sense on finite intervals, or if $f\to-\infty$ faster than $g\to-\infty$, or something of that kind. If $f\to 0$ from the positive side faster than $g\to 0$, then $g'\leq f'$ is eventually inevitable. Oh dear... I'll need to check some ideas with logarithms next...
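One way to make the logarithm idea precise (a standard comparison step): if $f,g>0$, $f(R)\leq g(R)$, and

$$\frac{f'(x)}{f(x)} \leq \frac{g'(x)}{g(x)}\quad\textrm{for}\;x\geq R,$$

then integrating from $R$ to $x$ gives $\log f(x)-\log f(R)\leq \log g(x)-\log g(R)$, hence

$$f(x) \leq g(x)\,\frac{f(R)}{g(R)} \leq g(x)\quad\textrm{for}\;x\geq R.$$

This avoids the sign problems of comparing $f'$ and $g'$ directly.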

$$(xy)dx+(ay+bx^2)dy=0$$
Now, can you find an integrating factor for that?

No, I don't know how to find an integrating factor for that.
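For completeness, here is how the standard search for an integrating factor depending on $y$ alone plays out (a sketch, possibly not the route jackmell had in mind): with $M = xy$ and $N = ay + bx^2$,

$$\frac{\partial_x N - \partial_y M}{M} = \frac{2bx - x}{xy} = \frac{2b-1}{y}$$

depends on $y$ alone, so $\mu(y) = y^{2b-1}$ is an integrating factor. Multiplying through, $xy^{2b}\,dx + (ay+bx^2)y^{2b-1}\,dy = 0$ is exact, and integrating gives the implicit solution

$$\tfrac{1}{2}x^2 y^{2b} + \frac{a}{2b+1}\,y^{2b+1} = \textrm{const}.$$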

Homework Helper
2022 Award
I'm interested in solutions of an equation

$$f'(x) = -\frac{xf(x)}{Af(x)+ Bx^2}$$

with some positive initial value $f(0)>0$, and with positive constants $A,B>0$.

First question: Does an explicit formula exist? I couldn't figure it out.

I can get an implicit formula valid for $x > 0$.

First introduce the dependent variable $u = f(x)/f(0)$ so that $u(0) = 1$. Then introduce the independent variable $t = x/\sqrt{f(0)^2A}$. Then we get the ODE
$$\dot u = -\frac{ut}{u + ct^2}$$
where $c = Bf(0) > 0$. Now
$$-\frac{ut}{u + ct^2} = -ut(ct^2)^{-1}\left(1 + \frac{u}{ct^2}\right)^{-1} = - \frac{u}{ct} \left(1 + \frac{u}{ct^2}\right)^{-1}$$
which suggests the substitution $v = u/(ct^2)$. Then
$$\dot u = -vt\frac{1}{1 + v}$$
and
$$\dot v = \frac{\dot u}{ct^2} - \frac{2u}{ct^3} = \frac{\dot u}{ct^2} - \frac{2v}{t} = -\frac{v}{ct}\left(\frac{1}{1 + v} + 2c\right)$$
which is separable:
$$\frac{1 + v}{v(1 + 2c + 2cv)} \dot v = -\frac{1}{ct}$$
and has general solution (for $t > 0$)
$$v^{1/(1 + 2c)}(1 + 2c + 2cv)^{1/(2c(1 + 2c))} = Dt^{-1/c}$$
Solving that for $f(x) = (B/A)x^2 v(x/\sqrt{Af(0)^2})$ is not attractive.

However, when $t$ is large, I think we have $1 + 2c + 2cv = 1 + 2c + 2u/t^2 \sim 1 + 2c$ so $v \sim kt^{-(1 + 2c)/c}$ for some constant $k$, and
$$f(x) \sim Kx^{2-(1 + 2c)/c} = Kx^{-1/(Bf(0))}$$
for some other constant $K$.

jostpuur
Now comes a successful proof!

The goal:

$$f(x) = O(x^{-\frac{1}{B}+\epsilon})\quad\quad\textrm{when}\;x\to\infty$$

for every $\epsilon>0$. The claim means that we must find $R,C>0$ such that

$$R\geq x\quad\implies\quad f(x)\leq C x^{-\frac{1}{B}+\epsilon}$$

The inequality is equivalent to

$$\log(f(x)) \leq \log(C) + \Big(-\frac{1}{B} +\epsilon\Big)\log(x)$$

Once $R$ has been fixed, $C$ can be chosen so that the inequality holds at $x=R$. Then we need to prove

$$\frac{f'(x)}{f(x)}\leq \Big(-\frac{1}{B}+\epsilon\Big)\frac{1}{x}$$

to get the other values $x>R$. Using the ODE for $f$ we see that this is equivalent to

$$1 - \epsilon B \leq \frac{Bx^2}{Af(x) + Bx^2}$$

Since $0<f(x)\leq f(0)$, the right side tends to one as $x\to\infty$, so with sufficiently large $x$ the inequality comes true. Thus such an $R$ can be found.

The asymptotic result has been proven!
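The proof is constructive enough to check numerically. The sketch below (again with the arbitrary illustrative choices $A=B=f(0)=1$ and $\epsilon = 1/4$, so $\epsilon B<1$) picks $R$ from the bound $f(x)\leq f(0)$, fixes $C$ so the bound is tight near $x=R$, and then verifies $f(x)\leq Cx^{-1/B+\epsilon}$ along a numerical solution:

```python
import math

# Numerical check of the bound just proved: f(x) <= C x^(-1/B + eps) for x >= R.
# A, B, f(0) and eps are arbitrary illustrative choices with eps*B < 1.
A, B, F0, EPS = 1.0, 1.0, 1.0, 0.25

def rhs(x, f):
    # f'(x) = -x f / (A f + B x^2)
    return -x * f / (A * f + B * x * x)

def rk4_step(x, f, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = rhs(x, f)
    k2 = rhs(x + h / 2, f + h * k1 / 2)
    k3 = rhs(x + h / 2, f + h * k2 / 2)
    k4 = rhs(x + h, f + h * k3)
    return f + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Since f(x) <= f(0), the key inequality 1 - eps*B <= B x^2 / (A f(x) + B x^2)
# certainly holds once B x^2 / (A f(0) + B x^2) >= 1 - eps*B, i.e. for x >= R:
R = math.sqrt((1 - EPS * B) * A * F0 / (EPS * B * B))

h, x, f = 0.001, 0.0, F0
C = None
violations = 0
while x < 20.0:
    f = rk4_step(x, f, h)
    x += h
    if C is None and x >= R:
        C = f * x ** (1 / B - EPS)  # choose C so the bound is tight at x ~ R
    elif C is not None and f > C * x ** (-1 / B + EPS) * (1 + 1e-9):
        violations += 1
```

No violations should occur, since beyond $R$ the logarithmic derivative of $f$ stays below that of the bound.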

Homework Helper
2022 Award
Now comes a successful proof!

The goal:

$$f(x) = O(x^{-\frac{1}{B}+\epsilon})\quad\quad\textrm{when}\;x\to\infty$$

for every $\epsilon>0$. The claim means that we must find $R,C>0$ such that

$$R\geq x\quad\implies\quad f(x)\leq C x^{-\frac{1}{B}+\epsilon}$$

Should this not be $x > R$, since we're interested in the limit $x \to \infty$?

The inequality is equivalent to

$$\log(f(x)) \leq \log(C) + \Big(-\frac{1}{B} +\epsilon\Big)\log(x)$$

Once $R$ has been fixed, $C$ can be chosen so that the inequality holds at $x=R$.

Then we need to prove

$$\frac{f'(x)}{f(x)}\leq \Big(-\frac{1}{B}+\epsilon\Big)\frac{1}{x}$$
to get the other values $x>R$.

I can't follow the remainder of your proof. However, there is an alternative:

We know $f$ is strictly decreasing, because its derivative is negative for all $x > 0$. So we know that for all $R > 0$, if $x > R$ then $f(x) < f(R)$. Also $\log$ is strictly increasing, so if there exist $C$ and $R$ such that
$$\log f(R) \leq \log C + (-B^{-1} + \epsilon)\log R$$
then we have immediately that if $x > R$ then
$$\log f(x) < \log f(R) \leq \log C + (-B^{-1} + \epsilon)\log R \leq \log C + (-B^{-1} + \epsilon)\log x$$
as required, provided $-B^{-1} + \epsilon \geq 0$ so that the last inequality goes the right way; for $\epsilon < 1/B$ monotonicity alone is not enough.

Fix $R$. Then the middle inequality holds provided $C \geq f(R)/R^{-B^{-1} + \epsilon}$. Such a $C$ can always be found since the right hand side is finite.

jostpuur
$$f(x) \sim Kx^{2-(1 + 2c)/c} = Kx^{-1/(Bf(0))}$$
for some other constant $K$.

This result contradicts the result I proved, for initial values $f(0)>1$.