Can the Limit of Bayes' Risk Be Bounded by the Error in Conditional Expectation?


GabrielN00

Homework Statement


Prove ##\lim_{n\rightarrow +\infty}\frac{\mathbb{E}(L_n)-L^*}{\sqrt{\mathbb{E}( ( \eta_n(X)-\eta(X) )^2 )}}=0##

if ##\eta_n## verifies ##\lim_{n\rightarrow\infty} \mathbb{E}( ( \eta_n(X)-\eta(X) )^2 )=0##

Homework Equations

The Attempt at a Solution



The idea might be to use ##g_n(x)=1_{\{\eta_n(x)>1/2\}}##, because then ##\mathbb{E}(L_n)-L^*=2\mathbb{E}(|\eta(X)-1/2|1_{\{g_n(X)\neq g^*(X)\}})##. Now I want to show that ##2\mathbb{E}(|\eta(X)-1/2|1_{\{g_n(X)\neq g^*(X)\}})## can be bounded; that is, I want to prove there is an ##\epsilon## such that ##\mathbb{E}(|\eta(X)-1/2|1_{\{g_n(X)\neq g^*(X)\}})\leq \mathbb{E}(|\eta(X)-\eta_n(X)|1_{\{g_n(X)\neq g^*(X)\}}1_{\{|\eta(X)-1/2|\leq \epsilon\}}1_{\{\eta(X)\neq 1/2\}}) + \mathbb{E}(|\eta(X)-\eta_n(X)|1_{\{g_n(X)\neq g^*(X)\}}1_{\{|\eta(X)-1/2|> \epsilon\}})##.

If the latter can be proved, and each term is suitably bounded, taking limits would show that the limit in the problem is zero. I want to prove that such an ##\epsilon## exists.
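For what it's worth, here is a possible route to the missing step, sketched along the lines of the standard plug-in argument (not necessarily what your course intends). The key observation is that on the event ##\{g_n(X)\neq g^*(X)\}## the estimate ##\eta_n(X)## lies on the opposite side of ##1/2## from ##\eta(X)##, so ##|\eta(X)-1/2|\leq|\eta_n(X)-\eta(X)|## there. Writing ##B_n=\{g_n(X)\neq g^*(X)\}##:

```latex
% Points with eta(X) = 1/2 contribute nothing to the excess risk, so restrict
% to 0 < |eta(X) - 1/2| and split at epsilon > 0:
\mathbb{E}(L_n) - L^*
  \le 2\,\mathbb{E}\big(|\eta_n(X)-\eta(X)|\,\mathbf{1}_{B_n}
        \mathbf{1}_{\{0<|\eta(X)-1/2|\le\epsilon\}}\big)
    + 2\,\mathbb{E}\big(|\eta_n(X)-\eta(X)|\,\mathbf{1}_{B_n}
        \mathbf{1}_{\{|\eta(X)-1/2|>\epsilon\}}\big)
% Cauchy-Schwarz on the first term; on the second, misclassification with
% |eta(X) - 1/2| > epsilon forces |eta_n(X) - eta(X)| > epsilon, so
% |eta_n - eta| <= (eta_n - eta)^2 / epsilon there:
  \le 2\sqrt{\mathbb{E}\big((\eta_n(X)-\eta(X))^2\big)}\,
        \sqrt{\mathbb{P}\big(0<|\eta(X)-\tfrac12|\le\epsilon\big)}
    + \frac{2\,\mathbb{E}\big((\eta_n(X)-\eta(X))^2\big)}{\epsilon}.
```

Dividing by ##\sqrt{\mathbb{E}((\eta_n(X)-\eta(X))^2)}##, the second ratio tends to ##0## as ##n\to\infty## for each fixed ##\epsilon##, and ##\mathbb{P}(0<|\eta(X)-1/2|\le\epsilon)\to 0## as ##\epsilon\to 0## by continuity of measure, which gives the limit. So no single magic ##\epsilon## is needed; one sends ##n\to\infty## first and then ##\epsilon\to 0##.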
 
I realize some details may be needed that I didn't write in the first message. Here is some extra context; I would add it to the original post but I can't edit it (or I cannot figure out how to do so). The object of the proof is to show that ##\mathbb{E}(L_n)-L^*## converges to ##0## faster than the ##L_2## error of the conditional-expectation estimate, ##\sqrt{\mathbb{E}((\eta_n(X)-\eta(X))^2)}##. Here ##L^*## denotes Bayes' risk, the function ##\eta## is the conditional expectation ##\eta(x)=\mathbb{E}(Y|X=x)##, and ##L_n## is the empirical prediction error.
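If it helps to see the claimed rate separation concretely, here is a toy Monte Carlo check (my own illustration, not from the problem). The model is a hypothetical choice: ##X\sim\mathrm{Unif}(0,1)##, ##\eta(x)=x## (so ##g^*(x)=1_{\{x>1/2\}}##), and the estimator ##\eta_n(x)=x+1/n##, whose ##L_2## error is exactly ##1/n##. The excess risk of the plug-in rule works out to about ##1/n^2##, so the ratio in the problem behaves like ##1/n## and visibly goes to ##0##:

```python
import numpy as np

# Toy model (an assumption for illustration): X ~ Unif(0,1), eta(x) = x,
# Bayes rule g*(x) = 1{x > 1/2}, hypothetical estimator eta_n(x) = x + 1/n.
rng = np.random.default_rng(0)
x = rng.uniform(size=2_000_000)  # Monte Carlo sample of X

ratios = {}
for n in [10, 100, 1000]:
    c = 1.0 / n
    eta = x
    eta_n = x + c
    g_star = eta > 0.5            # Bayes classifier
    g_n = eta_n > 0.5             # plug-in classifier from eta_n
    # Excess-risk identity: E(L_n) - L* = 2 E[|eta(X) - 1/2| 1{g_n != g*}]
    excess = 2.0 * np.mean(np.abs(eta - 0.5) * (g_n != g_star))
    # L2 estimation error: sqrt(E (eta_n - eta)^2), which equals c here
    l2_err = np.sqrt(np.mean((eta_n - eta) ** 2))
    ratios[n] = excess / l2_err   # behaves like c = 1/n, hence -> 0
    print(n, ratios[n])
```

The misclassification band is ##(1/2-1/n,\,1/2]##, on which ##|\eta(X)-1/2|\le 1/n##, which is why the excess risk picks up an extra factor of ##1/n## over the ##L_2## error.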
 
