Prove that y'' + y = f(x)

  • Thread starter 0kelvin
  • #1
Homework Statement:
Let ##f## be a continuous function on an interval ##I## containing the origin and let

##y = y(x) = \int_0^x \sin(x - t)\,f(t)\, dt##

Prove that ##y'' + y = f(x)## for all ##x \in I##, and that ##y(0) = y'(0) = 0##.
Relevant Equations:
...
I know how to solve ##\frac{d}{dx} \int_0^{x^2} \sin(t^2)\, dt##, and from the statement I got that ##f(0) = 0## because ##f## contains the origin and is continuous.

I tried ##y'(x) = \sin(x - x)f(x) - \sin(x - 0)f(0)##, but that doesn't seem to look right.
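For comparison, the plain fundamental theorem of calculus, with a generic integrand ##h(t)## that does not involve ##x##, only gives
$$\frac{d}{dx} \int_0^x h(t)\, dt = h(x),$$
with no ##-h(0)## term; the extra ##x## inside ##\sin(x - t)## is what I don't know how to handle.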
 

Answers and Replies

  • #3
fresh_42
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
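If I recall the finite-limits version of that differentiation rule correctly (writing ##k(u)=\sin u## for the kernel, so that ##g(x,t)=k(x-t)f(t)##), it should read
$$\frac{d}{dx}\int_0^x k(x-t)f(t)\,dt = k(0)f(x) + \int_0^x k'(x-t)f(t)\,dt,$$
which here would give ##y'(x)=\int_0^x \cos(x-t)f(t)\,dt##, since ##k(0)=\sin 0 = 0##.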
 
  • #4
Orodruin
I'm not 100% sure whether this is ok if the variable is still in the boundary.
It is not. The easy counterexample is letting ##g(x,t)## be a non-zero constant.
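Spelled out: with ##g(x,t) = c \neq 0##,
$$\frac{d}{dx}\int_0^x c\,dt = \frac{d}{dx}(cx) = c \;\neq\; 0 = \int_0^x \frac{\partial}{\partial x}\,c\,dt,$$
so the boundary contribution cannot simply be dropped.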
 
  • #5
Delta2
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
I am not so sure that the differentiation result for convolutions can be applied when the limits of the convolution integral are functions of ##x##.
 
  • #6
Orodruin
I am not so sure that the differentiation result for convolutions can be applied when the limits of the convolution integral are functions of ##x##.
It cannot.
 
  • #7
fresh_42
Here is an idea, derived from the principle: eliminate what disturbs!
I'm not sure whether my first step is necessary, but it is what I did. We have
\begin{align*}
y(x)&=\int_0^x \sin(x-t)f(t)\,dt\\
&=-\int_0^x \sin(t-x)f(t)\,dt\\
&\stackrel{(sx=t-x)}{=} -\int_{-1}^0 \sin(sx)\left( xf(sx+x) \right)\,ds \\
&\stackrel{(g(sx)=xf(sx+x))}{=} -\int_{-1}^0 \sin(sx) g(sx) \, ds
\end{align*}
where now ##x## is just a constant and the integral is basically the same as before, with constant limits and a modified function ##g(sx):=xf(sx+x)##, which is still defined for ##s \in [-1,0]## and continuous.
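As a quick check of the substitution: with ##t=(s+1)x## one has ##dt = x\,ds## and ##\sin(t-x)=\sin(sx)##, and the limits ##t=0## and ##t=x## correspond to ##s=-1## and ##s=0##, so indeed
$$-\int_0^x \sin(t-x)f(t)\,dt = -\int_{-1}^{0} \sin(sx)\,f((s+1)x)\,x\,ds.$$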
 
  • #8
Orodruin
You can of course do something like that, but why would you? It is a simple matter of correctly applying what was linked to already in #2.
 
  • #9
Delta2
Yes, well, the Leibniz rule correctly applied gives the answer in three lines, but what @fresh_42 does is interesting because it removes the dependence of the boundary on ##x##.
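For the record, a sketch of those three lines, using ##\dfrac{d}{dx}\int_0^x g(x,t)\,dt = g(x,x) + \int_0^x \dfrac{\partial}{\partial x}g(x,t)\,dt## (its hypotheses are discussed below):
\begin{align*}
y'(x) &= \sin(x-x)f(x) + \int_0^x \cos(x-t)f(t)\,dt = \int_0^x \cos(x-t)f(t)\,dt,\\
y''(x) &= \cos(x-x)f(x) - \int_0^x \sin(x-t)f(t)\,dt = f(x) - y(x),
\end{align*}
so ##y'' + y = f(x)##, and ##y(0) = y'(0) = 0## because both integrals vanish when the limits coincide.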
 
  • #10
fresh_42
I'm starting to like the question, because one can learn how to deal with parameter integrals as well as this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
 
  • #11
Orodruin
While that would be workable, I think it is more common to just apply the definition of the derivative.
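Sketching that route for the first derivative here, with ##h## denoting the increment and only continuity of ##f## used:
$$\frac{y(x+h)-y(x)}{h} = \int_0^x \frac{\sin(x+h-t)-\sin(x-t)}{h}\,f(t)\,dt + \frac{1}{h}\int_x^{x+h} \sin(x+h-t)\,f(t)\,dt.$$
The first difference quotient tends to ##\cos(x-t)## uniformly in ##t## (mean value theorem applied to ##\sin##), and the second term is ##O(h)## because ##|\sin(x+h-t)| \le |h|## on the interval of integration, so ##y'(x) = \int_0^x \cos(x-t)f(t)\,dt## without ever differentiating ##f##.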
 
  • #12
Delta2
I'm starting to like the question, because one can learn how to deal with parameter integrals as well as this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
I don't think we need differentiability with respect to ##t##; the statement of the theorem speaks about continuity with respect to ##x## and ##t##, and differentiation with respect to ##x##.
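Concretely, for this integrand
$$\frac{\partial}{\partial x}\big[\sin(x-t)f(t)\big] = \cos(x-t)f(t),$$
which is continuous in ##(x,t)## as soon as ##f## is continuous, so no derivative of ##f## ever enters.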
 
  • #13
fresh_42
I haven't checked; that's why I said cautious. Anyway, there is something to learn from the exercise.
 
  • #14
0kelvin
I just realized that I was confusing differentiation with integration.

$$\int_0^x \sin(t)\, dt = -\cos(x) + 1$$

but

$$\frac{d}{dx} \int_0^x \sin(t)\, dt = \sin(x)$$

not ##F'(x) - F'(0) = \sin(x) - \sin(0)##
 
  • #15
0kelvin
Following Wikipedia's article I was able to prove it. I also just realized something else: the statement says that the interval contains the origin, not that the function itself passes through the point (0,0).
 
