Convergence in Measure: Understanding and Proving Almost Everywhere Convergence

  • Context: Graduate 
  • Thread starter: Fredrik
  • Tags: Convergence, Measure
SUMMARY

This discussion centers on the concept of almost everywhere convergence in measure theory, specifically within the context of a measure space (X, Σ, μ). It establishes that if a sequence of measurable functions converges in measure to two functions f and g, then f equals g almost everywhere. The proof involves demonstrating that the measure of the set where f and g differ is zero, utilizing properties of limits and subsequences. The conversation highlights the importance of this concept in both analysis and probability theory.

PREREQUISITES
  • Understanding of measure spaces, specifically (X, Σ, μ).
  • Familiarity with measurable functions and their properties.
  • Knowledge of limits and convergence concepts in mathematical analysis.
  • Basic principles of probability theory, particularly convergence in probability.
NEXT STEPS
  • Study the proof of the uniqueness of limits in measure theory.
  • Explore the concept of convergence in probability and its relationship to almost everywhere convergence.
  • Learn about subsequences and their role in establishing convergence properties.
  • Investigate the implications of measure theory in advanced probability topics, such as the law of large numbers.
USEFUL FOR

Mathematicians, statisticians, and students of advanced calculus or analysis who are looking to deepen their understanding of measure theory and its applications in probability.

Fredrik
Not sure where to post about measure theory. None of the forums seems quite right.

Suppose that ##(X,\Sigma,\mu)## is a measure space. A sequence ##\langle f_n\rangle_{n=1}^\infty## of almost everywhere real-valued measurable functions on X is said to converge in measure to a measurable function f, if for all ε>0, ##\mu(\{x\in X:|f_n(x)-f(x)|\geq\varepsilon\})\to 0##.
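
For concreteness, here is a standard example of the definition (just an illustration, not from the book): take ##X=[0,1]## with Lebesgue measure ##\lambda## and ##f_n=\chi_{[0,1/n]}##. For every ##\varepsilon\in(0,1]##,
$$\lambda(\{x\in[0,1]:|f_n(x)-0|\geq\varepsilon\})=\lambda([0,1/n])=\frac{1}{n}\to 0,$$
and for ##\varepsilon>1## the set is empty, so ##f_n\to 0## in measure.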

The book says that it's easily seen that if ##\langle f_n\rangle## converges in measure to two measurable functions f and g, then f=g almost everywhere. I don't see it.

I understand that I need to prove that ##\mu(\{x\in X: |f(x)-g(x)|>0\})=0##, but I don't even see how to begin.
 
Note that

$$|f(x)-g(x)|\leq |f(x)-f_n(x)|+|f_n(x)-g(x)|.$$

So if ##|f(x)-g(x)|\geq \varepsilon##, then also

$$|f(x)-f_n(x)|+|f_n(x)-g(x)|\geq \varepsilon.$$

But this must mean that either ##|f(x)-f_n(x)|\geq \varepsilon/2## or ##|f_n(x)-g(x)|\geq \varepsilon/2## (proceed by contradiction: assume that neither holds). So

$$\{x\in X:|f(x)-g(x)|\geq \varepsilon\}\subseteq \{x\in X:|f(x)-f_n(x)|\geq \varepsilon/2\}\cup \{x\in X:|g(x)-f_n(x)|\geq \varepsilon/2\}.$$

Taking ##\mu## of both sides (using monotonicity and subadditivity) yields

$$\mu(\{x\in X:|f(x)-g(x)|\geq \varepsilon\})\leq \mu(\{x\in X:|f(x)-f_n(x)|\geq \varepsilon/2\})+ \mu(\{x\in X:|g(x)-f_n(x)|\geq \varepsilon/2\}).$$

I think you can take it from here. It's just a limiting argument now.

Another nice argument (but a little more difficult to prove) is to show that the sequence has a subsequence which converges a.e. Now f=g a.e. by uniqueness of the limit a.e.
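
For completeness, a sketch of that subsequence argument (the standard construction via Borel–Cantelli; I'm paraphrasing, so check the details against your book): since ##f_n\to f## in measure, pick indices ##n_1<n_2<\cdots## with
$$\mu\big(\{x\in X:|f_{n_k}(x)-f(x)|\geq 2^{-k}\}\big)\leq 2^{-k},$$
and call this set ##E_k##. Then ##\sum_k\mu(E_k)<\infty##, so ##\mu(\limsup_k E_k)=0## by Borel–Cantelli. Every ##x\notin\limsup_k E_k## lies in only finitely many ##E_k##, hence ##|f_{n_k}(x)-f(x)|<2^{-k}## for all large ##k##, i.e. ##f_{n_k}\to f## a.e. Since ##\langle f_{n_k}\rangle## still converges to ##g## in measure, the same construction gives a further subsequence converging a.e. to ##g##, and uniqueness of a.e. limits then forces ##f=g## a.e.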
 
Fredrik said:
Not sure where to post about measure theory. None of the forums seems quite right.

From wikipedia:

"Calculus (Latin, calculus, a small stone used for counting) is a branch of mathematics focused on limits, functions, derivatives, integrals, and infinite series. This subject constitutes a major part of modern mathematics education."

and:

"Mathematical analysis, which mathematicians refer to simply as analysis, has its beginnings in the rigorous formulation of infinitesimal calculus. It is a branch of pure mathematics that includes the theories of differentiation, integration and measure, limits, infinite series,[1] and analytic functions."Seems to me this fits Calculus & Analysis.
 
I like Serena said:
From wikipedia:

"Calculus (Latin, calculus, a small stone used for counting) is a branch of mathematics focused on limits, functions, derivatives, integrals, and infinite series. This subject constitutes a major part of modern mathematics education."

and:

"Mathematical analysis, which mathematicians refer to simply as analysis, has its beginnings in the rigorous formulation of infinitesimal calculus. It is a branch of pure mathematics that includes the theories of differentiation, integration and measure, limits, infinite series,[1] and analytic functions."


Seems to me this fits Calculus & Analysis.

I prefer this in the probability forum. Measure theory and convergence in measure (as here) are really important probability concepts, so it might make sense that a probabilist knows more about them than an analyst (it certainly is my experience that this is the case).
 
Ah, I think I understand it now. Your argument tells us that for all ##\varepsilon,\varepsilon'>0##, we have ##\mu(\{x\in X:|f(x)-g(x)|\geq\varepsilon\})<\varepsilon'##. This implies that for all ##\varepsilon>0##, we have ##\mu(\{x\in X:|f(x)-g(x)|\geq\varepsilon\})=0##. We still have to do something tricky for the final step, something like this:

$$\big\{x\in X:|f(x)-g(x)|>0\big\}=\bigcup_{n=1}^\infty\bigg\{x\in X:|f(x)-g(x)|\geq\frac{1}{n}\bigg\}$$
$$\mu\big(\big\{x\in X:|f(x)-g(x)|>0\big\}\big) \leq\sum_{n=1}^\infty\mu\bigg( \bigg\{x\in X:|f(x)-g(x)|\geq\frac{1}{n}\bigg\} \bigg)=0$$ Thank you.
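
To consolidate, the whole argument in one chain (my own summary; the shorthand ##\{|f-g|\geq\tfrac1n\}## means ##\{x\in X:|f(x)-g(x)|\geq\tfrac1n\}##):
$$\mu\big(\{|f-g|>0\}\big)\;\leq\;\sum_{n=1}^\infty\mu\Big(\Big\{|f-g|\geq\tfrac1n\Big\}\Big)\;\leq\;\sum_{n=1}^\infty\lim_{k\to\infty}\Big[\mu\Big(\Big\{|f-f_k|\geq\tfrac{1}{2n}\Big\}\Big)+\mu\Big(\Big\{|g-f_k|\geq\tfrac{1}{2n}\Big\}\Big)\Big]\;=\;0.$$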
 
