Convergence in L^2 Norm: Understanding Subsequence Implications

AxiomOfChoice
Suppose there exists a sequence ##f_n## of square-integrable functions on ##\mathbb R## such that ##f_n(x) \to f(x)## in the ##L^2##-norm and ##x f_n(x) \to g(x)##, also in the ##L^2##-norm. We know from basic measure theory that there's a subsequence ##f_{n_k}## with ##f_{n_k}(x) \to f(x)## for a.e. ##x##. But my professor seems to be claiming that this somehow implies ##x f_{n_k}(x) \to g(x)## for a.e. ##x##. I don't see why this is. Obviously, we know that ##x f_{m_k}(x) \to g(x)## a.e. for SOME subsequence of ##f_n##...but how do we know it works for the SAME subsequence? Can someone offer some guidance? Thanks!
 
If I understand you correctly, there's a set E such that ##\mu(E)=0## and ##xf_n(x)\to g(x)## for all x in ##E^c##. This means that for all ##x\in E^c##, ##\langle xf_n(x)\rangle_{n=1}^\infty## is a convergent sequence in ##\mathbb R##, and as you know (or can easily prove), every subsequence of a convergent sequence in ##\mathbb R## converges to the limit of the sequence.
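
For completeness, the easy proof of that last fact, stated for a generic real sequence ##a_n \to a## with subsequence ##(a_{n_k})##: since ##n_k## is strictly increasing, ##n_k \ge k## for all ##k##, so
$$\forall \varepsilon > 0\ \exists N:\ n \ge N \Rightarrow |a_n - a| < \varepsilon, \quad\text{and then}\quad k \ge N \Rightarrow n_k \ge N \Rightarrow |a_{n_k} - a| < \varepsilon.$$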
 
Fredrik said:
If I understand you correctly, there's a set E such that ##\mu(E)=0## and ##xf_n(x)\to g(x)## for all x in ##E^c##. This means that for all ##x\in E^c##, ##\langle xf_n(x)\rangle_{n=1}^\infty## is a convergent sequence in ##\mathbb R##, and as you know (or can easily prove), every subsequence of a convergent sequence in ##\mathbb R## converges to the limit of the sequence.

Well, if we have ##f_{n_k}(x) \to f(x)## off of a set ##E \subset \mathbb R## with ##\mu(E) = 0##, then ##|f_{n_k}(x) - f(x)| \to 0## as ##k \to \infty## whenever ##x \notin E##. But I want to somehow show that this implies the existence of an ##E' \subset \mathbb R## (which may or may not be the same as ##E##) with ##\mu(E') = 0## such that ##|x f_{n_k}(x) - g(x)| \to 0## as ##k \to \infty## whenever ##x \notin E'##. How in the world is one supposed to get from the first statement to the one we want to prove, if the only other thing you know is that ##\| x f_n(x) - g(x) \|_{L^2(\mathbb R)} \to 0## as ##n \to \infty##?

I hope that clarifies my question a bit!
 
OK, I wasn't paying enough attention to when you were using the ##L^2##-norm and when you were just talking about convergence almost everywhere. I will think about it.
 
Isn't it obvious that if ##f_n## converges to both ##f## and ##g## in ##L^2##, then ##f = g## a.e.?? That would imply it.
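
For reference, the standard uniqueness-of-limits computation behind that claim is one triangle inequality:
$$\|f - g\|_{L^2} \le \|f - f_n\|_{L^2} + \|f_n - g\|_{L^2} \to 0,$$
so ##\|f - g\|_{L^2} = 0##, which forces ##f = g## almost everywhere.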
 
Hehe, after my last post yesterday, I realized that I was much too tired to do any math. I decided to give it another shot "tomorrow", i.e. today, in the unlikely event that micromass wouldn't already have posted the solution. I should have realized that there was no chance that he wouldn't already have done that. :smile:
 
Haha. Next time I'll let you finish it up!
 
But it's ##x f_n(x)## that's converging to ##g##, and not ##f_n##.

Something does seem fishy about the argument in the OP. Maybe we need more context?
 
Oh, I see. I misread there.

Anyway. If ##x f_n(x) \rightarrow g## in ##L^2##, then ##x f_{n_k}(x) \rightarrow g## in ##L^2##. Thus there is a subsequence ##x f_{n_{k_l}}(x)## that converges to ##g## a.e.

Evidently, the sequence ##x f_{n_k}(x)## converges a.e. to ##x f(x)##, since ##f_{n_k}(x) \to f(x)## a.e. and multiplying by the fixed number ##x## preserves the limit. And since a subsequence of it converges to ##g## a.e., the two limits must agree, so the whole sequence ##x f_{n_k}(x)## converges to ##g## a.e.
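
Spelled out, with every statement holding off the union of the exceptional null sets:
$$f_{n_k}(x) \to f(x) \;\Rightarrow\; x f_{n_k}(x) \to x f(x), \qquad\text{while}\qquad x f_{n_{k_l}}(x) \to g(x).$$
Since ##x f_{n_{k_l}}(x)## is a subsequence of the convergent sequence ##x f_{n_k}(x)##, the limits agree: ##x f(x) = g(x)## a.e., and therefore ##x f_{n_k}(x) \to g(x)## a.e.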

Did I do something stupid?
 
That works!
 
micromass said:
Haha. Next time I'll let you finish it up!
Oh, don't worry about that. I don't mind at all. I would probably need 7 hours to do what you can do in 7 minutes anyway. :smile:
 
