scottneh
Hello, I am preparing for a screening exam and I'm trying to work through some old problems I've been given.
Given:
Suppose f ∈ L¹([a,b]).
Prove that for almost every x ∈ [a,b],
lim_{h→0+} (1/h) ∫₀^h |f(x+t) + f(x−t) − 2f(x)| dt = 0.
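In case my plain-text notation is unclear, here is the statement in LaTeX. I'm assuming the usual normalization 1/h and integration over [0,h], since that is how this Lebesgue-point-type statement is normally posed; my problem sheet didn't typeset it clearly.

```latex
\text{For } f \in L^1([a,b]), \text{ for a.e. } x \in [a,b]:
\qquad
\lim_{h \to 0^+} \frac{1}{h} \int_0^h \bigl| f(x+t) + f(x-t) - 2f(x) \bigr| \, dt = 0.
```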
Initially I thought I could argue that t is just an offset to the function: as h goes to zero, t goes to zero, and clearly f(x+0) + f(x−0) = 2f(x), so 2f(x) − 2f(x) = 0.
I think I have to use the dominated convergence theorem, but I'm not sure how to apply it.
Can someone please help me get started?
Thanks