Sum of independent uniform distribution conditional on uniform


Homework Help Overview

The discussion revolves around the properties of the sum of two independent uniformly distributed random variables, specifically focusing on the conditional mean and variance given one of the variables. The original poster seeks to understand how to derive these properties when X and Y are uniformly distributed on the interval [-1, 1].

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Problem interpretation

Approaches and Questions Raised

  • Participants explore the application of the convolution integral to find the distribution of the sum of independent uniform variables. There are attempts to derive the conditional mean and variance, with some participants suggesting that the distribution of the sum is triangular. Questions arise regarding the use of Dirac's delta function in the context of conditional probabilities and the integration process involved.

Discussion Status

Several participants have provided insights and alternative methods for approaching the problem. There is an ongoing exploration of the implications of conditioning on one of the variables, and some participants express a desire for clarification on specific mathematical steps. The discussion remains open, with no explicit consensus reached.

Contextual Notes

Participants note the challenge of applying known results from normally distributed variables to the case of uniformly distributed variables. There is also mention of the need for rigorous justification of certain steps in the derivation process, particularly concerning the conditional probability density functions.

grossgermany

Homework Statement


Let X and Y be independent and normal. Then we know that X+Y and X must be jointly normal, so we can apply the projection theorem, which states that if A and B are jointly normal then VAR(A|B) = VAR(A) - \rho^2 VAR(A) = (1-\rho^2)VAR(A), where \rho is the correlation between A and B. Apply the theorem to A=X+Y, B=X to find
VAR(X+Y|X)

There is a similar procedure for finding E(X+Y|X).
I know how to do the above. However, what I don't know is what happens if X and Y are independent but each is UNIFORMLY distributed on [-1,1].
What is:
1.VAR(X+Y|X)
2.E(X+Y|X)
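As a numerical sanity check on what the answers should be, here is a minimal Monte Carlo sketch (illustrative only; the helper name `cond_mean_var` is made up). Conditioning on X = x_0 fixes X, so only Y needs to be simulated:

```python
import random

def cond_mean_var(x0, n=200_000, seed=0):
    """Estimate E(X+Y | X=x0) and Var(X+Y | X=x0) for Y ~ Uniform(-1, 1).

    Conditioning on X = x0 fixes X, so only Y is random."""
    rng = random.Random(seed)
    zs = [x0 + rng.uniform(-1.0, 1.0) for _ in range(n)]
    mean = sum(zs) / n
    var = sum((z - mean) ** 2 for z in zs) / n
    return mean, var

m, v = cond_mean_var(0.5)
# analytically one expects mean = x0 = 0.5 and variance = Var(Y) = 2**2 / 12 = 1/3
```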

Homework Equations


The Attempt at a Solution


this may help (or may not), with bits borrowed from wiki
http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
though the derivation is independent of the probability distribution used

let Z = X + Y, and consider the joint probability density f_{X,Y,Z}(x,y,z); the density of Z is given by
f_{Z}(z) = \int f_{X,Y,Z}(x,y,z)dxdy

you can re-write the joint distribution using conditional probabilities & the independence of X & Y, following that through leads to the convolution
f_{Z}(z) = \int f_{X,Y,Z}(x,y,z)dxdy = \int f_{X}(x)f_{Y}(y|x)f_{Z}(z|x,y)dxdy = \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int f_{X}(x)f_{Y}(z-x)dx
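That convolution can be checked numerically. The sketch below (illustrative Python; `f_unif` and `f_Z` are made-up names) evaluates f_Z(z) = ∫ f_X(x) f_Y(z-x) dx by a midpoint rule for the Uniform(-1, 1) case, which should recover the triangular density (2 - |z|)/4 on [-2, 2]:

```python
# Midpoint-rule evaluation of the convolution f_Z(z) = ∫ f_X(x) f_Y(z-x) dx
# for X, Y ~ Uniform(-1, 1).
def f_unif(t):
    return 0.5 if -1.0 <= t <= 1.0 else 0.0

def f_Z(z, steps=4000):
    dx = 2.0 / steps                          # x ranges over the support [-1, 1]
    xs = (-1.0 + (i + 0.5) * dx for i in range(steps))
    return sum(f_unif(x) * f_unif(z - x) for x in xs) * dx

# the triangular density (2 - |z|)/4 predicts f_Z(0) = 1/2, f_Z(1) = 1/4, f_Z(2.5) = 0
```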

The conditional probability given X = x_0 will be
f_{Z}(z|x_0) = \int f_{X}(x|x_0)f_{Y}(y|x_0)f_{Z}(z|x_0,y)dxdy = \int \delta(x-x_0)f_{Y}(y)\delta(z-x_0-y)dxdy=\int f_{Y}(y)\delta(z-x_0-y)dy= f_{Y}(z-x_0)

Then take the expectation
E(Z|x_0) = \int z f(z|x_0)dz = \int z f_{Y}(z-x_0)dz

and similar for the variance
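Carrying that last step out numerically for the Uniform(-1, 1) case (a sketch only; `f_Y` and `cond_moments` are illustrative names): since f_Z(z|x_0) = f_Y(z - x_0), the conditional mean and variance integrals can be evaluated directly:

```python
# Direct numerical evaluation of E(Z | X=x0) = ∫ z f_Y(z - x0) dz and of the
# matching variance integral, using the Uniform(-1, 1) density for Y.
def f_Y(t):
    return 0.5 if -1.0 <= t <= 1.0 else 0.0

def cond_moments(x0, steps=4000):
    lo, hi = x0 - 1.0, x0 + 1.0               # support of Z given X = x0
    dz = (hi - lo) / steps
    zs = [lo + (i + 0.5) * dz for i in range(steps)]
    mean = sum(z * f_Y(z - x0) for z in zs) * dz
    var = sum((z - mean) ** 2 * f_Y(z - x0) for z in zs) * dz
    return mean, var

m, v = cond_moments(0.3)
# one expects mean = x0 = 0.3 and variance = Var(Y) = 1/3
```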
 
Thank you for your reply.
Unfortunately, I am already familiar with this topic with regard to normally distributed variables.
What I need is the mean and variance of the sum of two independent uniformly distributed variables, conditional on one of them.
I know that X+Y should be triangularly distributed if X and Y are independent uniform.
However, what is
E(x+y|x)
Var(x+y|x)
 
the derivation does not assume normal variables & should work for any distribution

the convolution integral will give you the triangular distribution

the last integral should give you the expectation with conditional X=x_0 (try it, it should be pretty simple)

then you should be able to use the conditional distribution for finding the variance as well

That's rigorous... but if you just think about what is happening when X=x_0 is known, you should be able to guess the mean & variance from the distribution of Y easily enough
 
PS, hint - as X is a set value, the only random variable is Y, so in effect Z = constant + Y...
 
Thank you very much for your reply. I mostly understand what you wrote except for 3 things.
1. Why is the conditional pdf of (z given x,y) equal to Dirac's delta function of z-(x+y)? Wikipedia just says they are trivially equal. Yes, it's true that z needs to equal x+y, so you can say there is zero probability that z is not equal to x+y. But somehow I feel this is just too much intuition and not rigorous enough.
2. When the double integral over x and y becomes a single integral over x, why does it work out to be fx(x)fy(z-x)?

Yes, y=z-x, but don't we need
integral[fx(x)integral{fy(z-x)d(-x)}dx]? The d(-x) is due to the fact that dy=d(z-x)=-dx
3. How would we find E(X|X+Y)
 
grossgermany said:
Thank you very much for your reply. I mostly understand what you wrote except for 3 things.
1. Why is the conditional pdf of (z given x,y) equal to Dirac's delta function of z-(x+y)? Wikipedia just says they are trivially equal. Yes, it's true that z needs to equal x+y, so you can say there is zero probability that z is not equal to x+y. But somehow I feel this is just too much intuition and not rigorous enough.
it can only be that: given X=x & Y=y, Z=X+Y is totally determined to be Z=x+y, hence
f_Z(z|X=x,Y=y) = \delta(z-x-y)
note this integrates to one, as it must, and satisfies the constraint

grossgermany said:
2. When the double integral over x and y becomes a single integral over x, why does it work out to be fx(x)fy(z-x)? Yes, y=z-x, but don't we need integral[fx(x)integral{fy(z-x)d(-x)}dx]? The d(-x) is due to the fact that dy=d(z-x)=-dx
the following integral is contracted by integrating over all y; integrating over the delta function sets y = z-x
= \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int f_{X}(x)f_{Y}(z-x)dx

you could similarly choose to integrate over x first & get
= \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dydx = \int f_{X}(z-y)f_{Y}(y)dy

the results are equivalent, as they must be; we choose to integrate over y first as it makes life easier later on with the constraint on x
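A quick numerical check of that equivalence for the Uniform(-1, 1) case (sketch only; `f`, `conv_over_y`, and `conv_over_x` are made-up names):

```python
# Contracting over y first or over x first should give the same density,
# here evaluated by a midpoint rule for X, Y ~ Uniform(-1, 1).
def f(t):
    return 0.5 if -1.0 <= t <= 1.0 else 0.0

def conv_over_y(z, steps=2000):
    d = 2.0 / steps
    return sum(f(x) * f(z - x) for x in (-1.0 + (i + 0.5) * d for i in range(steps))) * d

def conv_over_x(z, steps=2000):
    d = 2.0 / steps
    return sum(f(z - y) * f(y) for y in (-1.0 + (i + 0.5) * d for i in range(steps))) * d

# at z = 0.5 both should return roughly (2 - 0.5)/4 = 0.375
```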

grossgermany said:
3. How would we find E(X|X+Y)
this takes a little more thought, as I assume you mean E(X|Z), with Z = X+Y? you would need to find the conditional probability distribution

for the previous questions, say you have a random variable X with
E(X) = \mu
VAR(X) = \sigma^2

note that for a constant a
E(X+a) = \mu +a
VAR(X+a) = \sigma^2
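These shift rules are easy to verify by simulation for Uniform(-1, 1) (illustrative sketch; the shift a = 0.7 is an arbitrary choice):

```python
import random

# Simulation check of the shift rules E(X+a) = mu + a and Var(X+a) = sigma^2
# for X ~ Uniform(-1, 1).
rng = random.Random(1)
a = 0.7
xs = [rng.uniform(-1.0, 1.0) for _ in range(200_000)]
mu = sum(xs) / len(xs)
var = sum((x - mu) ** 2 for x in xs) / len(xs)

shifted = [x + a for x in xs]
mu_s = sum(shifted) / len(shifted)
var_s = sum((x - mu_s) ** 2 for x in shifted) / len(shifted)
# mu_s ≈ mu + a, while var_s ≈ var (≈ 1/3 for Uniform(-1, 1))
```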
 
I am able to completely understand your post now. Thank you. But I tried to redo the derivation in your earlier post with a different objective:
Given X and Y are independent uniform,
Find
E(X|Z=X+Y)
Var(X|Z=X+Y)
and I am stuck imitating your proof, simply because fx(x|z) is no longer fx(x).
Would you please show me how to redo your earlier post in this scenario?
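For what it's worth, here is one way to see what the answer should be, plus a Monte Carlo sketch (my reasoning and made-up helper names, not from the thread). By symmetry E(X|Z) = E(Y|Z), and E(X|Z) + E(Y|Z) = Z, so E(X|Z) = Z/2. Moreover f_X(x|z) is proportional to f_X(x)f_Y(z-x), which is constant where both factors are positive, so X given Z = z should be uniform on an interval of length 2 - |z|, giving Var(X|Z=z) = (2-|z|)^2/12:

```python
import random

# Monte Carlo sketch of E(X | Z = z) and Var(X | Z = z) for X, Y iid
# Uniform(-1, 1), Z = X + Y: keep samples whose Z lands near z.
def cond_X_given_Z(z, width=0.05, n=500_000, seed=2):
    rng = random.Random(seed)
    xs = []
    for _ in range(n):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if abs(x + y - z) < width:            # accept samples with Z close to z
            xs.append(x)
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return mean, var

m, v = cond_X_given_Z(0.8)
# the reasoning above predicts E(X|Z=0.8) = 0.8/2 = 0.4
# and Var(X|Z=0.8) = (2 - 0.8)**2 / 12 = 0.12
```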
 
