Limit of the solution of a differential equation

SUMMARY

The discussion centers on the differential equation \(\frac{dy}{dt} + a(t)y = f(t)\), where \(a(t) > c > 0\) and \(\lim f(t) = 0\). Participants conclude that any solution \(y(t)\) must satisfy \(\lim y(t) = 0\) as \(t \to \infty\). The reasoning examines the limits of \(y\) and its derivative \(y'\), arguing that if both limits exist they must be zero, given the constraints on \(a(t)\) and \(f(t)\). The conversation also touches on oscillatory functions whose derivatives have no limit, and on the need for a rigorous proof of the claimed limits.

PREREQUISITES
  • Understanding of differential equations, specifically first-order linear equations.
  • Familiarity with limit concepts in calculus.
  • Knowledge of continuous functions and their properties.
  • Experience with mathematical rigor and proof techniques.
NEXT STEPS
  • Study the method of integrating factors for solving linear differential equations.
  • Explore the properties of continuous functions and their limits in greater detail.
  • Investigate the behavior of oscillatory functions and their limits, such as \(\frac{\sin(t^2)}{t}\).
  • Learn about the conditions under which limits of derivatives exist and their implications.
USEFUL FOR

Students and professionals in mathematics, particularly those focusing on differential equations, calculus, and mathematical analysis. This discussion is beneficial for anyone seeking to deepen their understanding of limit behavior in the context of differential equations.

Ressurection

Homework Statement



Given the differential equation \frac{dy}{dt} + a(t)y = f(t),
in which a and f are continuous functions on ℝ and satisfy
a(t) > c > 0 \ \forall t, \qquad \lim f(t) = 0.

Show that any solution of the differential equation satisfies
\lim y(t) = 0.

Homework Equations





The Attempt at a Solution



My first thought was to apply the limit to the equation right away, which would give me

lim(y') + lim(a)·lim(y) = 0

Now this means both the limits of y and y' must exist, and since a(t) > c > 0, either they are both 0 or one is positive and the other negative.

When lim y is a finite constant, then lim y' = 0, which only allows for the case of both limits being 0. (Side note: does lim y being finite imply that lim y' = 0?)

When lim y is +∞, this would require lim y' = -∞. Is this case possible?
My intuition tells me it isn't, but I'm not absolutely sure.

Also, if lim(a) = ∞, then I don't see any restriction that imposes lim y = 0, which makes me believe there may be a simpler way to solve this problem.

Note: All limits refer to t → ∞.
 
I think the intuition goes something like this:

\frac{dy}{dt} = -a(t)y(t) + f(t)

OK, so a(t)>0 all the time, and \lim f(t) = 0. So, if y(t)<0, then the derivative is positive (for the intuition, let's treat f as zero, but in the actual argument you will have to use the fact that its limit is zero even though it is not identically zero), and if y(t)>0 then the derivative is less than zero. So, basically, if y is positive, it is going down, and if y is negative, it is going up. Now, that's just the intuition, so you will have to clean it up to make it rigorous (depending on how rigorous you need to be). But do you understand the general idea?
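One way to make this sign argument quantitative (a sketch only, not a full proof): from y' = -a(t)y + f(t) and a(t) > c > 0,

if y(t) > \frac{|f(t)|}{c}, then y'(t) = -a(t)y(t) + f(t) < -c\,y(t) + |f(t)| < 0, and if y(t) < -\frac{|f(t)|}{c}, then y'(t) > c\,|y(t)| - |f(t)| > 0.

So whenever |y| leaves the band of half-width |f(t)|/c around zero, it gets pushed back toward it, and that band shrinks to zero because \lim f(t) = 0. Turning this into a rigorous argument still takes some work.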
 
Ressurection said:

Given the differential equation \frac{dy}{dt} + a(t)y = f(t), in which a and f are continuous functions on ℝ and satisfy a(t) > c > 0 \ \forall t and \lim f(t) = 0, show that any solution satisfies \lim y(t) = 0.

...

When lim y is a finite constant, then lim y' = 0, which only allows for the case of both limits being 0. (Side note: does lim y being finite imply that lim y' = 0?)

...

Note: All limits refer to t → ∞.
Hello Ressurection. Welcome to PF !

Consider the function \displaystyle g(t)=\frac{\sin(t^2)}{t}.

\displaystyle \lim_{t\to\infty}\,g(t)=0\ .

\displaystyle g'(t) oscillates between -2 & 2 with increasing frequency as t → ∞.
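For reference, that derivative can be written out explicitly (just the quotient rule, nothing beyond what is already in the example):

\displaystyle g'(t)=2\cos(t^2)-\frac{\sin(t^2)}{t^2}\ ,

so the second term vanishes as t → ∞ while the first keeps swinging between -2 and 2; hence \lim g(t)=0 even though \lim g'(t) does not exist.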
 
Robert1986 said:
I think the intuition goes something like this:

\frac{dy}{dt} = -a(t)y(t) + f(t)

OK, so a(t)>0 all the time, and \lim f(t) = 0. So, if y(t)<0, then the derivative is positive (for the intuition, let's treat f as zero, but in the actual argument you will have to use the fact that its limit is zero even though it is not identically zero), and if y(t)>0 then the derivative is less than zero. So, basically, if y is positive, it is going down, and if y is negative, it is going up. Now, that's just the intuition, so you will have to clean it up to make it rigorous (depending on how rigorous you need to be). But do you understand the general idea?

I think I do. When taking the limit, the only two possibilities that satisfy that relation are:
- the limits do not exist (which would be the case for something like a sine or cosine), or
- the limit is 0, which is what prevents the function from oscillating.

SammyS said:
Hello Ressurection. Welcome to PF !

Consider the function \displaystyle g(t)=\frac{\sin(t^2)}{t}.

\displaystyle \lim_{t\to\infty}\,g(t)=0\ .

\displaystyle g'(t) oscillates between -2 & 2 with increasing frequency as t → ∞.

That's a nice example, but doesn't taking the limit of the differential equation prove that the two limits must exist? In that case, the limit of g' does not exist.


Sorry for the mess my first post may be; I have zero experience with TeX.
 
Try an integrating factor.
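A minimal sketch of how that route can go (taking the initial time to be t = 0, with the estimates left to be checked): with \displaystyle A(t)=\int_0^t a(s)\,ds, the integrating factor e^{A(t)} gives

\displaystyle y(t)=e^{-A(t)}y(0)+\int_0^t e^{-(A(t)-A(s))}f(s)\,ds\ .

Since a > c, we have A(t)-A(s) > c(t-s) for t > s, so the first term is at most |y(0)|e^{-ct} → 0. For the integral, pick T large enough that |f(s)| is small for s ≥ T: the piece over [0,T] is a fixed finite number times e^{-(A(t)-A(T))} ≤ e^{-c(t-T)} → 0, while the piece over [T,t] is bounded by \sup_{s\ge T}|f(s)|\int_T^t e^{-c(t-s)}\,ds \le \sup_{s\ge T}|f(s)|/c, which can be made as small as we like. Together these force \lim y(t)=0.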
 
Ressurection said:
...

That's a nice example, but doesn't taking the limit of the differential equation prove that the two limits must exist? In that case, the limit of g' does not exist.
I believe that you are correct: if the two limits exist and y(t) → 0, then y'(t) → 0 as well.


Sorry for the mess my first post may be; I have zero experience with TeX.
You'll get the hang of using TeX.

What you wrote was pretty clear.
 
