Conditional expectation and Least Squares Regression

SUMMARY

The discussion centers on the relationship between conditional expectation and Ordinary Least Squares (OLS) regression in the context of polynomial approximations. Specifically, it examines two statements regarding the convergence of OLS estimates to conditional expectations as the polynomial order approaches infinity. The first statement asserts that OLS of a function of a Markov process converges to the conditional expectation given the filtration, while the second compares the L2 norms of the differences between these estimates and their respective conditional expectations. The participants explore the implications of the Markov property and the Stone-Weierstrass theorem in this context.

PREREQUISITES
  • Understanding of Ordinary Least Squares (OLS) regression
  • Familiarity with conditional expectation and filtration in probability theory
  • Knowledge of polynomial approximation and the Stone-Weierstrass theorem
  • Concept of Markov processes and their properties
NEXT STEPS
  • Study the implications of the Stone-Weierstrass theorem in polynomial approximation
  • Explore the properties of Markov processes and their applications in regression analysis
  • Learn about the convergence of OLS estimates to conditional expectations in statistical theory
  • Investigate the relationship between projections in vector spaces and conditional expectations
USEFUL FOR

Statisticians, data scientists, and researchers in econometrics or quantitative finance who are working with regression analysis and conditional expectations in stochastic processes.

piolo
Hello everybody,

I have two questions on conditional expectation w.r.t. (polynomial) OLS:
Let (X_t) be a stochastic process with associated filtration (F_t), let Vect_n{X_t} denote the vector space spanned by the polynomials {X_t^i, i <= n}, and let f be a function with enough regularity. I am wondering how we can prove the following statements true/false:

(feel free to add assumptions)

1. OLS( f(X_T), Vect_n{X_t} ) -> E( f(X_T) | F_t ) as n -> \infty
2. Norm_L2{ E( f(X_T) | F_T ) - OLS( f(X_T), Vect_n{X_T} ) } >= Norm_L2{ E( f(X_T) | F_t ) - OLS( f(X_T), Vect_n{X_t} ) }
(note that E( f(X_T) | F_T ) = f(X_T), since f(X_T) is F_T-measurable)

For the first one, if we suppose X_t is Markov, then combining Stone-Weierstrass with the projection characterization of conditional expectation may give something interesting. But for the second one, I don't have any idea...
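A quick numerical sketch of statement 1 (my own example, not a proof): take X to be standard Brownian motion, t = 0.5, T = 1, and f(x) = x^2, so that E[f(X_T) | F_t] = X_t^2 + (T - t) in closed form by the Markov property. Regressing f(X_T) on polynomials in X_t of increasing degree, the OLS fit should approach this conditional expectation (here f is itself a polynomial, so degree 2 already suffices):

```python
# Numerical sketch (not a proof): for Brownian motion X, the OLS projection
# of f(X_T) onto polynomials in X_t should approach E[f(X_T) | X_t].
# With f(x) = x^2, the true conditional expectation is X_t^2 + (T - t).
import numpy as np

rng = np.random.default_rng(0)
n_paths, t, T = 200_000, 0.5, 1.0

X_t = rng.normal(0.0, np.sqrt(t), n_paths)            # B_t ~ N(0, t)
X_T = X_t + rng.normal(0.0, np.sqrt(T - t), n_paths)  # independent increment

y = X_T**2                # f(X_T)
truth = X_t**2 + (T - t)  # E[f(X_T) | F_t], by the Markov property

for n in (1, 2, 4):
    A = np.vander(X_t, n + 1)                     # basis {X_t^0, ..., X_t^n}
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # OLS onto Vect_n{X_t}
    rmse = np.sqrt(np.mean((A @ coef - truth)**2))
    print(f"degree {n}: RMSE vs conditional expectation = {rmse:.4f}")
```

The degree-1 fit is far off (the best linear predictor of X_t^2 + 0.5 in X_t is a constant, by symmetry), while degrees 2 and up match the conditional expectation to Monte Carlo accuracy.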

Any help? Thx.
 
Can anyone tell me whether there is some (or no) relationship between the projection of f(X_T) onto the vector space span{X_t^0, X_t^1, ..., X_t^m} and the projection of f(X_T) onto span{X_T^0, X_T^1, ..., X_T^m}, where X_t is a Markov process (or any other appropriate process)?
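The two projections can at least be compared numerically. A sketch of statement 2 under my own assumptions (Brownian motion, t = 0.5, T = 1, and a non-polynomial f(x) = cos(x), for which E[f(X_T) | F_T] = cos(X_T) and, by the Gaussian increment, E[f(X_T) | F_t] = exp(-(T - t)/2) cos(X_t)):

```python
# Numerical sketch (not a proof) of statement 2 for Brownian motion with
# f(x) = cos(x). We compare the L2 distance from each conditional
# expectation to the OLS projection onto degree-n polynomials at that time.
import numpy as np

rng = np.random.default_rng(1)
n_paths, t, T, n = 200_000, 0.5, 1.0, 2

X_t = rng.normal(0.0, np.sqrt(t), n_paths)
X_T = X_t + rng.normal(0.0, np.sqrt(T - t), n_paths)
y = np.cos(X_T)  # f(X_T)

def ols_fit(x, y, n):
    """OLS projection of y onto span{x^0, ..., x^n} (empirical L2)."""
    A = np.vander(x, n + 1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

# E[f(X_T) | F_T] = cos(X_T);  E[f(X_T) | F_t] = exp(-(T-t)/2) cos(X_t)
err_T = np.sqrt(np.mean((np.cos(X_T) - ols_fit(X_T, y, n))**2))
err_t = np.sqrt(np.mean((np.exp(-(T - t) / 2) * np.cos(X_t)
                         - ols_fit(X_t, y, n))**2))
print(f"time-T error = {err_T:.4f}, time-t error = {err_t:.4f}")
```

In this example the time-t error comes out smaller, consistent with the inequality in statement 2: the time-t projection error picks up the contraction factor exp(-(T - t)/2), and X_t has smaller variance than X_T, so low-degree polynomials approximate cos better on its support. Of course one example is not a proof.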