Conditional Expectation and Least Squares Regression

In summary, this thread raises two questions about conditional expectation and polynomial OLS. The first asks for a proof that the OLS projection of f(X_T) onto polynomials of X_t converges to the conditional expectation E( f(X_T) | F_t ) as the polynomial degree n tends to infinity. The second asks whether there is any relationship between the projections of f(X_T) onto two different polynomial vector spaces. The poster notes that X_t is a Markov process, which may help with the first question, and asks for any ideas on the second.
  • #1
piolo
Hello everybody,

I have two questions on conditional expectation w.r.t. (polynomial) OLS:
Let X_t be a stochastic process, F_t the associated filtration, Vect_n{X_t} the vector space spanned by the monomials { X_t^i, i <= n }, and f(.) a function with enough regularity. I am wondering how we can prove that the following statements are true/false:

(feel free to add assumptions)

1. OLS( f(X_T), Vect_n{X_t} ) -> E( f(X_T) | F_t ), when n-> \infty
2. Norm_L2{ E( f(X_T) | F_T ) - OLS( f(X_T), Vect_n{X_T} ) } >= Norm_L2{ E( f(X_T) | F_t ) - OLS( f(X_T), Vect_n{X_t} ) }

For the first one, assuming X_t is Markov and combining Stone-Weierstrass with the projection interpretation of OLS, we may get something interesting. But for the second one, I don't have any idea...

Any help? Thx.
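
Not a proof, but here is a minimal numerical sketch of statement 1 under toy assumptions chosen purely for illustration (Brownian motion and f(x) = x^2, for which E( f(X_T) | F_t ) = X_t^2 + (T - t) is known in closed form):

```python
# Toy check of statement 1: regress f(X_T) on polynomials of X_t and compare
# the fit to the known conditional expectation E( f(X_T) | F_t ).
# Assumptions (illustrative, not the poster's): X_t is standard Brownian motion, f(x) = x^2.
import numpy as np

rng = np.random.default_rng(0)
n_paths, t, T = 100_000, 0.5, 1.0

X_t = np.sqrt(t) * rng.standard_normal(n_paths)             # B_t
X_T = X_t + np.sqrt(T - t) * rng.standard_normal(n_paths)   # B_T = B_t + independent increment

y = X_T**2                        # f(X_T)
truth = X_t**2 + (T - t)          # E( f(X_T) | F_t ), closed form in this toy case

for n in (1, 2, 4, 8):
    coeffs = np.polyfit(X_t, y, deg=n)          # OLS on Vect_n{X_t} = span{1, X_t, ..., X_t^n}
    fit = np.polyval(coeffs, X_t)
    err = np.sqrt(np.mean((fit - truth) ** 2))  # empirical L2 distance
    print(f"degree n = {n}: distance to E(f(X_T)|F_t) ~ {err:.4f}")
```

In this example the distance is already essentially sampling noise at n = 2, because the true conditional expectation happens to be a degree-2 polynomial in X_t.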
 
  • #2
Can anyone tell me whether there is some (or no) relationship between the projection of f(X_T) onto the vector space spanned by { X_t^0, X_t^1, ..., X_t^m } and the projection of f(X_T) onto the vector space spanned by { X_T^0, X_T^1, ..., X_T^m }, where X_t is a Markov process (or any other appropriate process)?
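
One crude way to probe this numerically, again under the same toy assumptions as above (Brownian motion, f(x) = x^2), is to compare the empirical L2 residuals of projecting f(X_T) onto the two spaces:

```python
# Compare the OLS projections of f(X_T) onto span{X_t^0, ..., X_t^m} and
# span{X_T^0, ..., X_T^m}.  Assumptions (illustrative): Brownian motion, f(x) = x^2.
import numpy as np

rng = np.random.default_rng(1)
n_paths, t, T, m = 200_000, 0.5, 1.0, 3

X_t = np.sqrt(t) * rng.standard_normal(n_paths)
X_T = X_t + np.sqrt(T - t) * rng.standard_normal(n_paths)
y = X_T**2  # f(X_T)

def l2_residual(x, y, m):
    """Empirical L2 norm of y minus its OLS projection onto span{x^0, ..., x^m}."""
    fit = np.polyval(np.polyfit(x, y, deg=m), x)
    return np.sqrt(np.mean((y - fit) ** 2))

print("residual on Vect_m{X_t}:", l2_residual(X_t, y, m))  # strictly positive: X_t carries less information
print("residual on Vect_m{X_T}:", l2_residual(X_T, y, m))  # ~0: here f(X_T) already lies in the span
```

In this particular toy case the residual on the X_T-space is zero because f(X_T) = X_T^2 already lies in the span; whether a general inequality between the two projections holds is exactly the open question.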
 

1. What is conditional expectation?

Conditional expectation is the expected value of a random variable given information about another random variable (or, more generally, a sigma-algebra). In the discrete case it is computed by summing each possible value of the random variable weighted by its conditional probability given the conditioning event.
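
For example, in the discrete case this reads E( Y | X = x ) = \sum_y y * P( Y = y | X = x ); the conditional expectation E( Y | X ) is then the random variable obtained by evaluating this expression at X.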

2. What is the purpose of least squares regression?

Least squares regression is a statistical method for finding the best-fitting line (or curve) for a set of data points. It minimizes the sum of squared differences between the observed values and the values predicted by the line, in order to quantify the relationship between two variables and make predictions based on it.
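
Concretely, with a single predictor the fitted line y = a + b x minimizes \sum_i ( y_i - a - b x_i )^2, which gives the closed-form solution b = Cov(x, y) / Var(x) and a = mean(y) - b * mean(x).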

3. How is conditional expectation used in least squares regression?

In least squares regression, the fitted regression function is an estimate of the conditional expectation of the dependent variable given the independent variable: the predicted value at x estimates E( Y | X = x ). The line of best fit can therefore be read as an approximation of that conditional expectation.

4. What is the difference between simple and multiple least squares regression?

In simple least squares regression, there is only one independent variable, while in multiple least squares regression, there are multiple independent variables. This allows for a more complex relationship between the dependent and independent variables to be modeled and predicted.
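
A minimal sketch of the multiple case (the data, coefficients, and noise level below are made up purely for illustration):

```python
# Hypothetical multiple least squares regression with two independent variables,
# solved via the design-matrix least squares problem.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + 0.1 * rng.standard_normal(n)  # made-up model

X = np.column_stack([np.ones(n), x1, x2])      # intercept plus two predictors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least squares coefficients
print(beta)  # approximately [1.0, 2.0, -0.5]
```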

5. How is least squares regression used in real-world applications?

Least squares regression is widely used in various fields such as economics, finance, and engineering, to analyze and predict relationships between variables. It is commonly used for forecasting future trends, making predictions, and identifying correlations between variables.
