Proving Integral Convergence with L1 Functions

scottneh
Hello, I'm preparing for a screening exam and trying to work through some old problems I've been given.

Given:

Suppose $f \in L^1([a,b])$.

Prove that for almost every $x \in [a,b]$,

$$\lim_{h \to 0^+} \int \bigl|f(x+t)+f(x-t)-2f(x)\bigr|\,dt = 0.$$

Initially I thought I could treat t as an offset to the function: as h goes to zero, t would go to zero as well, and clearly f(x+0)+f(x-0) = 2f(x), so 2f(x) - 2f(x) = 0.

I think I have to use the dominated convergence theorem, but I'm not sure how to apply it.

Can someone please help me get started?

Thanks
 
You could start by defining h.
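If the intended statement is the averaged (Lebesgue-point) form — that is, assuming the integral runs over $[0,h]$ and is divided by $h$, which your statement omits — then the standard route goes through the Lebesgue differentiation theorem rather than dominated convergence. A sketch:

```latex
% Sketch (assumption: the statement is the averaged, Lebesgue-point form).
% By the Lebesgue differentiation theorem, almost every x in [a,b] is a
% Lebesgue point of f, so for almost every x:
%   lim_{h -> 0+} (1/h) \int_0^h |f(x+t) - f(x)| dt = 0,
% and the same with x - t in place of x + t. The triangle inequality gives
\[
\frac{1}{h}\int_0^h \bigl|f(x+t)+f(x-t)-2f(x)\bigr|\,dt
\;\le\;
\frac{1}{h}\int_0^h \bigl|f(x+t)-f(x)\bigr|\,dt
\;+\;
\frac{1}{h}\int_0^h \bigl|f(x-t)-f(x)\bigr|\,dt
\;\xrightarrow[h\to 0^+]{}\; 0.
\]
```

Each term on the right tends to 0 for almost every $x$ by the Lebesgue point property, so the left side does too.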
 
The problem does not state a definition for h.

After searching around on the net I found the exact same equation, namely:

$f(x+t)+f(x-t)-2f(x)$, in connection with Fourier analysis, but it didn't quite cover what the problem is stating.

How should I define h?

Thanks
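As a numerical sanity check — a sketch only, using a hypothetical smooth test function (for continuous $f$ every point is a Lebesgue point), and assuming $h$ plays the role of the averaging radius, i.e. the integral runs over $[0,h]$ and is divided by $h$ — one can watch the averaged symmetric difference shrink as $h \to 0^+$:

```python
import math

def f(x):
    # Hypothetical test function (an assumption, not from the problem):
    # any continuous f works, since continuity makes x a Lebesgue point.
    return math.sin(x)

def averaged_symmetric_difference(f, x, h, n=10_000):
    # Approximate (1/h) * integral_0^h |f(x+t) + f(x-t) - 2 f(x)| dt
    # with a midpoint Riemann sum on n subintervals of [0, h].
    total = 0.0
    for k in range(n):
        t = (k + 0.5) * h / n
        total += abs(f(x + t) + f(x - t) - 2.0 * f(x))
    # (1/h) * (h/n) * total simplifies to total / n.
    return total / n

x0 = 1.0
for h in (0.1, 0.01, 0.001):
    print(h, averaged_symmetric_difference(f, x0, h))
```

For a smooth f the average decays on the order of $h^2$, which is much stronger than the a.e. statement for general $L^1$ functions — the point of the exercise is that the limit still holds almost everywhere with no smoothness at all.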
 