Determine the mean square error of a simple distribution


Homework Help Overview

The discussion revolves around calculating the mean square error of a simple distribution, specifically focusing on the expected quadratic prediction error in the context of conditional expectations and uniform distributions.

Discussion Character

  • Exploratory; conceptual clarification; mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the calculation of expected values and variances, questioning the correctness of their results compared to a textbook answer. They explore the implications of the conditional probability distribution and the role of constants in their calculations.

Discussion Status

Some participants express confidence in their calculations, while others suggest that the textbook may have overlooked certain factors. There is an ongoing exploration of the definitions and implications of the "best predictor" in the context of mean squared error.

Contextual Notes

Participants note the potential for missing information or assumptions in the problem setup, particularly regarding the constant used in the calculations and the interpretation of the term "best predictor."

psie
Homework Statement
Consider $$f_{X,Y}(x,y)=\begin{cases} c,&\text{for } x,y\geq0,\ x+y\leq 1,\\ 0,&\text{otherwise},\end{cases}$$ where ##c## is some constant to be determined. Determine ##E(Y\mid X=x)## and ##E(X\mid Y=y)##. Moreover, determine the expected quadratic prediction error ##E(Y-d(X))^2## for the best predictor ##d(X)## of ##Y## based on ##X##.
Relevant Equations
The best predictor is the conditional expectation, i.e. ##h(X)=E(Y\mid X)##.
What troubles me about this exercise is that I don't get the answer that the book gets regarding the expected quadratic prediction error.

##c## is determined by $$1=\int_0^1\int_0^{1-x} c\,dydx=c\int_0^1(1-x)\,dx=c\left[-\frac{(1-x)^2}{2}\right]_0^1=\frac{c}2,$$so ##c=2##. The marginal density of ##X## is $$f_X(x)=\int_0^{1-x}2\,dy=2(1-x),\quad 0<x<1.$$And the conditional one is $$f_{Y\mid X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac2{2(1-x)}=\frac1{1-x},\quad 0<y<1-x.$$Finally, $$E(Y\mid X=x)=\int_0^{1-x}y\cdot\frac1{1-x}\,dy=\frac1{1-x}\left[\frac{y^2}{2}\right]_0^{1-x}=\frac{(1-x)^2}{2(1-x)}=\frac{1-x}{2}.$$ By symmetry, ##E(X\mid Y=y)=\frac{1-y}{2}##.
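These values can be sanity-checked numerically. Below is a minimal Monte Carlo sketch in Python (not part of the original post; the point ##x_0=0.3## and the strip width `eps` are arbitrary choices for illustration) that estimates ##c## from the area of the triangle and the conditional mean of ##Y## on a thin strip near ##x_0##:

```python
import random

random.seed(0)
n = 1_000_000
pts = [(random.random(), random.random()) for _ in range(n)]
# Keep only points inside the triangle x, y >= 0, x + y <= 1.
tri = [(x, y) for x, y in pts if x + y <= 1.0]

# The triangle has area 1/2, so c = 1/area = 2.
area = len(tri) / n
print(round(1 / area, 2))                 # ≈ 2.0

# Conditional mean of Y given X in a thin strip around x0 = 0.3;
# it should be close to (1 - x0)/2 = 0.35.
x0, eps = 0.3, 0.01
strip = [y for x, y in tri if abs(x - x0) < eps]
print(round(sum(strip) / len(strip), 2))  # ≈ 0.35
```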

I am confident everything is correct up to this point, as this is actually an example in the book and done exactly the same way. But the next part is omitted in the book, i.e. determining the expected quadratic prediction error ##E(Y-E(Y\mid X))^2##, where ##E(Y\mid X)=(1-X)/2##. We can simplify as follows \begin{align*}E(Y-E(Y\mid X))^2&=E(Y-(1-X)/2)^2 \\ &=E(Y^2+(1-X)^2/4-Y(1-X)) \\ &=E\left(Y^2+\frac14-\frac{X}{2}+\frac{X^2}{4}-Y+YX\right).\end{align*} Since ##X## and ##Y## have the same marginal distribution (the joint density is symmetric in ##x## and ##y##), we can replace ##Y## with ##X## everywhere except in the cross term ##YX##. So we have $$E\left(\frac{5X^2}{4}+\frac14-\frac{3X}{2}+YX\right)=\frac54E(X^2)+\frac14-\frac32E(X)+E(YX).$$ I used WolframAlpha to compute the three expectations on the right-hand side of this last equation:

##E(X)=\frac13##
##E(X^2)=\frac16##
##E(XY)=\frac1{12}##

Therefore $$E(Y-(1-X)/2)^2 =\frac54\cdot\frac16+\frac14-\frac32\cdot\frac13+\frac1{12}=\frac1{24}.$$The book gets ##\frac1{48}##.
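A direct Monte Carlo estimate (a sketch, not from the book or the thread) supports ##\frac1{24}\approx0.0417## rather than ##\frac1{48}\approx0.0208##: sample ##(X,Y)## uniformly on the triangle by rejection and average ##(Y-(1-X)/2)^2##.

```python
import random

random.seed(1)
n = 2_000_000
total = count = 0
while count < n:
    x, y = random.random(), random.random()
    if x + y <= 1.0:                    # accept points inside the triangle
        total += (y - (1 - x) / 2) ** 2
        count += 1
print(round(total / n, 4))              # ≈ 1/24 ≈ 0.0417
```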
 
I have tried the calculation a couple of different ways, and get the same answer as you. I suspect they may have forgotten (as I almost did) to include the constant ##c## in the calculation.
 
Your result is correct.
The conditional probability distribution of ##Y## given ##X## is a continuous uniform distribution. You can check the result using the identity $$ E((Y-E(Y\mid X))^2) = E(\mathrm{Var}(Y\mid X)) $$ together with the formula for the variance of a continuous uniform distribution.
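Carrying out that check explicitly (a short derivation using the densities found above): ##Y\mid X=x## is uniform on ##(0,1-x)##, so ##\mathrm{Var}(Y\mid X=x)=\frac{(1-x)^2}{12}##, and with ##f_X(x)=2(1-x)##, $$E(\mathrm{Var}(Y\mid X))=\int_0^1\frac{(1-x)^2}{12}\cdot2(1-x)\,dx=\frac16\int_0^1(1-x)^3\,dx=\frac16\cdot\frac14=\frac1{24},$$ in agreement with the value ##\frac1{24}## computed above.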
 
I assume the "best predictor" is some estimator. But then what would "best" mean here, as there are estimators that may, e.g., have minimal variance, be consistent, etc.
 
WWGD said:
I assume the "best predictor" is some estimator. But then what would "best" mean here, as there are estimators that may, e.g., have minimal variance, be consistent, etc.
Here we are talking about the predictor of ## Y ## based on ## X ## with the lowest mean squared error among all possible estimators of ## Y ## based on ## X ##.
 
