Prove that y'' + y = f(x)

  • Thread starter 0kelvin
In summary, applying the Leibniz integral rule to ##y(x) = \int_0^x \sin(x-t)f(t)\,dt## gives ##y'(x) = \int_0^x \cos(x-t)f(t)\,dt## and ##y''(x) = f(x) - y(x)##, so ##y'' + y = f(x)## with ##y(0) = y'(0) = 0##; the boundary term coming from the variable upper limit must not be dropped, and only continuity of f is needed.
  • #1
Homework Statement
Let f be a continuous function in an interval I containing the origin and let

##y = y(x) = \int_0^x \sin(x - t)f(t)\, dt##

Prove that ##y'' + y = f(x)## for all x ##\in I## and that ##y(0) = y'(0) = 0##.
Relevant Equations
...
I know how to solve ##\frac{d}{dx} \int_0^{x^2} \sin(t^2)\, dt## and from the statement I got that f(0) = 0 because f contains the origin and is continuous.
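(For reference, that derivative comes from the fundamental theorem of calculus plus the chain rule: ##\frac{d}{dx} \int_0^{x^2} \sin(t^2)\, dt = \sin\big((x^2)^2\big)\cdot 2x = 2x\,\sin(x^4)##.)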

I tried ##y'(x) = \sin(x - x)f(x) - \sin(x - 0)f(0)##, but that doesn't seem right.
 
  • #2
You are not applying Leibniz's integral rule correctly.
https://en.wikipedia.org/wiki/Leibniz_integral_rule

Also, I don't understand how you got that f(0) = 0 because f is continuous and contains the origin.

Just apply Leibniz's integral rule correctly to calculate y'(x) and y''(x).
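In the form needed here (lower limit fixed at 0, upper limit equal to x, integrand depending on x), the rule reads

$$\frac{d}{dx}\int_0^x g(x,t)\,dt = g(x,x) + \int_0^x \frac{\partial g}{\partial x}(x,t)\,dt.$$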
 
  • #3
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
 
  • #4
fresh_42 said:
I'm not 100% sure whether this is ok if the variable is still in the boundary.
It is not. The easy counterexample is letting g(x,t) be a non-zero constant.
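Spelled out with ##g(x,t) = c \neq 0##:

$$\frac{d}{dx}\int_0^x c\,dt = \frac{d}{dx}(cx) = c \qquad \text{but} \qquad \int_0^x \frac{\partial c}{\partial x}\,dt = 0,$$

so the boundary term from the Leibniz rule cannot be dropped.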
 
  • #5
fresh_42 said:
We have a function ##g(x,t)=\sin(x-t)f(t)## as integrand, so that ##y(x)=\int_0^x g(x,t)\,dt##.
I would like to differentiate by ##\dfrac{d}{dx}y(x)= \displaystyle{ \int_0^x } \dfrac{\partial }{\partial x}g(x,t)\,dt## but I'm not 100% sure whether this is ok if the variable is still in the boundary.

What does your book say about differentiating convolutions?
Have a look: https://en.wikipedia.org/wiki/Convolution#Differentiation
I am not so sure that the differentiation result for convolution can be applied when the boundaries of the convolution are functions of x.
 
  • #6
Delta2 said:
I am not so sure that the differentiation result for convolution can be applied when the boundaries of the convolution are functions of x.
It cannot.
 
Likes Delta2
  • #7
Here is an idea, derived from the principle: eliminate what disturbs!
I'm not sure whether my first step is necessary, but it is what I did. We have
\begin{align*}
y(x)&=\int_0^x \sin(x-t)f(t)\,dt\\
&=-\int_0^x \sin(t-x)f(t)\,dt\\
&\stackrel{(sx=t-x)}{=} -\int_{-1}^0 \sin(sx)\left( xf(sx+x) \right)\,ds \\
&\stackrel{(g(sx)=xf(sx+x))}{=} -\int_{-1}^0 \sin(sx) g(sx) \, ds
\end{align*}
where ##x## is now just a constant and the integral is essentially the same as before, with constant boundaries and a modified function ##g(sx):=xf(sx+x)##, which is defined on ##[-1,0]## and still continuous.
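As a check, reversing the substitution with ##u = sx + x## (so ##x\,ds = du##, and ##s=-1, s=0## correspond to ##u=0, u=x##) recovers the original integral:

$$-\int_{-1}^0 \sin(sx)\, x f(sx+x)\,ds = -\int_0^x \sin(u-x) f(u)\,du = \int_0^x \sin(x-u)f(u)\,du = y(x).$$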
 
  • #8
You can of course do something like that, but why would you? It is a simple matter of correctly applying what was linked to already in #2.
 
  • #9
Yes, well, the Leibniz rule correctly applied gives the answer in three lines, but what @fresh_42 does is interesting because it removes the dependence of the boundary on x.
 
  • #10
I'm starting to like the question, because one can learn both how to deal with parameter integrals and this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
 
  • #11
While that would be workable, I think it is more common to just apply the definition of the derivative.
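A minimal sketch of that route, using ##\sin(x+h-t) = \sin(x-t)\cos h + \cos(x-t)\sin h##:
\begin{align*}
\frac{y(x+h)-y(x)}{h} &= \frac{\cos h - 1}{h}\int_0^x \sin(x-t)f(t)\,dt + \frac{\cos h}{h}\int_x^{x+h} \sin(x-t)f(t)\,dt \\
&\quad + \frac{\sin h}{h}\int_0^{x+h} \cos(x-t)f(t)\,dt .
\end{align*}
As ##h \to 0## the first term vanishes, the second is ##O(h)## since ##|\sin(x-t)| \le |h|## on the short interval, and the third tends to ##\int_0^x \cos(x-t)f(t)\,dt##, so only continuity of ##f## is needed.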
 
  • #12
fresh_42 said:
I'm starting to like the question, because one can learn both how to deal with parameter integrals and this useful principle of attacking what disturbs. I assume that the proof of the Leibniz rule does exactly this. But one has to be cautious with Leibniz, as ##f(t)## is not necessarily differentiable.
I don't think we need differentiation with respect to t; the statement of the theorem speaks about continuity with respect to x and t, and differentiation with respect to x.
 
  • #13
I haven't checked; that's why I said to be cautious. Anyway, there is something to learn from the exercise.
 
  • #14
I just realized that I was confusing differentiation with integration.

$$\int_0^x \sin(t)\, dt = -\cos(x) + 1$$

but

$$\frac{d}{dx} \int_0^x \sin(t)\, dt = \sin(x)$$

not ##F'(x) - F'(0) = \sin(x) - \sin(0)##
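In other words, writing ##\int_0^x \sin(t)\, dt = F(x) - F(0)## with ##F' = \sin##, the constant ##F(0)## drops out on differentiation, so ##\frac{d}{dx}\big(F(x) - F(0)\big) = F'(x) = \sin(x)##.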
 
  • #15
Following Wikipedia's article I was able to prove it. I also just realized something else: the statement is that the interval contains the origin, not that the function itself passes through the point (0,0).
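For the record, a sketch of the computation via the Leibniz rule:
\begin{align*}
y'(x) &= \sin(x-x)f(x) + \int_0^x \cos(x-t)f(t)\,dt = \int_0^x \cos(x-t)f(t)\,dt,\\
y''(x) &= \cos(x-x)f(x) - \int_0^x \sin(x-t)f(t)\,dt = f(x) - y(x),
\end{align*}
so ##y'' + y = f(x)##, and both integrals are over an empty interval at ##x = 0##, giving ##y(0) = y'(0) = 0##.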
 