Sum of independent uniform distribution conditional on uniform

In summary, if X and Y are independent and normally distributed, then X+Y and X are jointly normal. This allows us to apply the projection theorem, which states that if A and B are jointly normal, then VAR(A|B) = VAR(A) - ρ^2 VAR(A). We can apply this theorem to find the variance of X+Y given X, and a similar procedure gives the conditional mean. When X and Y are instead independent and uniformly distributed, the projection theorem no longer applies, but the rest of the derivation does not depend on the probability distribution: one finds the joint probability density function and uses conditional probabilities to obtain the expectation and variance of X+Y given X.
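For example, applying the theorem with A = X+Y and B = X, and writing [itex]\sigma_X^2, \sigma_Y^2[/itex] for the variances, the squared correlation is [itex]\rho^2 = \sigma_X^2/(\sigma_X^2+\sigma_Y^2)[/itex], so
[tex]VAR(X+Y|X) = (\sigma_X^2+\sigma_Y^2)\left(1 - \frac{\sigma_X^2}{\sigma_X^2+\sigma_Y^2}\right) = \sigma_Y^2[/tex]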
  • #1
grossgermany

Homework Statement


Let X and Y be independent and normal. Then we know that
X+Y and X must be jointly normal.
Therefore we can apply the projection theorem,
which states that if A and B are jointly normal then VAR(A|B) = VAR(A) - [tex]\rho[/tex]^2 VAR(A). Apply the theorem to A = X+Y, B = X to find
VAR(X+Y|X)

There is a similar procedure for finding E(X+Y|X).
I know how to do the above. However, what I don't know is what happens if X and Y are independent but each is UNIFORMLY distributed on [-1,1].
What is:
1. VAR(X+Y|X)
2. E(X+Y|X)

Homework Equations


The Attempt at a Solution



 
  • #2
this may help (or may not), with bits borrowed from wiki
http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
though the approach should be independent of the probability distribution used

let Z = X + Y, and consider the joint probability density [itex] f_{X,Y,Z}(x,y,z) [/itex]. The probability density of Z is given by
[tex]f_{Z}(z) = \int f_{X,Y,Z}(x,y,z)dxdy [/tex]

you can re-write the joint distribution using conditional probabilities & the independence of X & Y; following that through leads to the convolution
[tex]f_{Z}(z) = \int f_{X,Y,Z}(x,y,z)dxdy = \int f_{X}(x)f_{Y}(y|x)f_{Z}(z|x,y)dxdy = \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int f_{X}(x)f_{Y}(z-x)dx[/tex]
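For the uniform case asked about here, with [itex]f_{X}(x) = f_{Y}(x) = \tfrac{1}{2}[/itex] on [-1,1], carrying out this convolution gives the triangular density
[tex]f_{Z}(z) = \frac{2-|z|}{4}, \qquad -2 \le z \le 2[/tex]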

The conditional density given [itex] X = x_0 [/itex] will be
[tex]f_{Z}(z|x_0) = \int f_{X}(x|x_0)f_{Y}(y|x_0)f_{Z}(z|x_0,y)dxdy = \int \delta(x-x_0)f_{Y}(y)\delta(z-x_0-y)dxdy=\int f_{Y}(y)\delta(z-x_0-y)dy= f_{Y}(z-x_0)[/tex]
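In the uniform example this is just Y shifted by [itex]x_0[/itex]: given [itex]X = x_0[/itex],
[tex]f_{Z}(z|x_0) = f_{Y}(z-x_0) = \frac{1}{2}, \qquad x_0 - 1 \le z \le x_0 + 1[/tex]
i.e. Z is uniform on [itex][x_0-1,\ x_0+1][/itex].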

Then take the expectation
[tex] E(Z|x_0) = \int z f(z|x_0)dz = \int z f_{Y}(z-x_0)dz [/tex]
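Substituting [itex]u = z - x_0[/itex] reduces this to a shift rule that holds for any distribution of Y:
[tex] E(Z|x_0) = \int (u + x_0) f_{Y}(u)du = x_0 + E(Y) [/tex]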

and similarly for the variance
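If you want to sanity-check the result numerically, here is a minimal Monte Carlo sketch (assuming Python with numpy; the values of x0 and n are illustrative choices, not from the problem) for the uniform-on-[-1,1] case:
[code]
import numpy as np

# Check E(Z|X=x0) = x0 + E(Y) and VAR(Z|X=x0) = VAR(Y)
# for Y ~ U[-1,1] independent of X, conditioning on X = x0.
rng = np.random.default_rng(0)
x0 = 0.3
n = 1_000_000
y = rng.uniform(-1.0, 1.0, n)  # draws of Y
z = x0 + y                     # Z = X + Y with X fixed at x0

print(z.mean())  # close to x0 + E(Y) = 0.3
print(z.var())   # close to VAR(Y) = 2**2 / 12 = 1/3
[/code]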
 
  • #3
Thank you for your reply.
Unfortunately I am already familiar with this topic with regard to normally distributed variables.
What I need is the mean and variance of the sum of two independent uniformly distributed variables, conditional on one of the uniformly distributed variables.
I know that X+Y should be triangularly distributed if X and Y are independent and uniformly distributed.
However, what is
E(X+Y|X)
Var(X+Y|X)
 
  • #4
the derivation does not assume normal variables & should work for any distribution

the convolution integral will give you the triangular distribution

the last integral should give you the expectation conditional on X = x_0 (try it, it should be pretty simple)

then you should be able to use the conditional distribution to find the variance as well

That's rigorous... but if you just think about what is happening when X = x_0 is known, you should be able to guess the mean & variance from the distribution of Y easily enough
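Spelling that guess out for Y uniform on [-1,1]:
[tex] E(X+Y|X) = X + E(Y) = X, \qquad VAR(X+Y|X) = VAR(Y) = \frac{(1-(-1))^2}{12} = \frac{1}{3} [/tex]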
 
  • #5
PS, hint - as X is a fixed value, the only random variable is Y, so in effect Z = constant + Y...
 
  • #6
Thank you very much for your reply. I mostly understand what you wrote except for 3 things.
1. Why is the conditional pdf of (z given x,y) equal to Dirac's delta function of z-(x+y)? Wikipedia just says they are trivially equal. Yes, it's true that z needs to be equal to x+y, so you can say there is zero probability that z is not equal to x+y. But somehow I feel this is just intuition and not rigorous enough.
2. When the double integral with respect to x and y becomes a single integral over x, why does this work out to be f_X(x)f_Y(z-x)?

Yes, y = z-x, but don't we need
integral[ f_X(x) integral{ f_Y(z-x) d(-x) } dx ]? The d(-x) is due to the fact that dy = d(z-x) = -dx.
3. How would we find E(X|X+Y)?
 
  • #7
grossgermany said:
Thank you very much for your reply. I mostly understand what you wrote except for 3 things.
1. Why is the conditional pdf of (z given x,y) equal to Dirac's delta function of z-(x+y)? Wikipedia just says they are trivially equal. Yes, it's true that z needs to be equal to x+y, so you can say there is zero probability that z is not equal to x+y. But somehow I feel this is just intuition and not rigorous enough.
it can only be that, given X=x & Y=y, Z = X+Y is totally determined to be Z = x+y, hence
[tex] f_Z(z|X=x,Y=y) = \delta(z-x-y) [/tex]
note this integrates to one, as it must, and satisfies the constraint

grossgermany said:
2. When the double integral with respect to x and y becomes a single integral over x, why does this work out to be f_X(x)f_Y(z-x)? Yes, y = z-x, but don't we need integral[ f_X(x) integral{ f_Y(z-x) d(-x) } dx ]? The d(-x) is due to the fact that dy = d(z-x) = -dx.
the double integral is contracted by integrating over all y first; integrating against the delta function sets y = z-x
[tex] = \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int f_{X}(x)f_{Y}(z-x)dx[/tex]
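This also resolves the d(-x) worry: under the substitution [itex]u = z - x - y[/itex] the sign of [itex]du = -dy[/itex] is cancelled by the flipped limits of integration, which is exactly the delta function's sifting property:
[tex] \int_{-\infty}^{\infty} f_{Y}(y)\delta(z-x-y)dy = \int_{-\infty}^{\infty} f_{Y}(z-x-u)\delta(u)du = f_{Y}(z-x) [/tex]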

you could similarly choose to integrate over x first & get
[tex] = \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dxdy = \int \int f_{X}(x)f_{Y}(y)\delta(z-x-y)dydx = \int f_{X}(z-y)f_{Y}(y)dy[/tex]

the results are equivalent, as they must be; we choose to integrate over Y first as it makes life easier later on with the constraint on x

grossgermany said:
3. How would we find E(X|X+Y)?
this takes a little more thought, as I assume you mean E(X|Z), with Z = X+Y? You would need to find the conditional probability distribution.
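Concretely, the conditional density you would need is (by the definition of conditional probability and the independence of X & Y)
[tex] f_{X}(x|z) = \frac{f_{X,Z}(x,z)}{f_{Z}(z)} = \frac{f_{X}(x)f_{Y}(z-x)}{f_{Z}(z)} [/tex]
with [itex]f_{Z}[/itex] the convolution found earlier.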

for the previous questions, say you have a random variable X with
[tex] E(X) = \mu [/tex]
[tex] VAR(X) = \sigma^2 [/tex]

note that for a constant a
[tex] E(X+a) = \mu +a [/tex]
[tex] VAR(X+a) = \sigma^2 [/tex]
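Both follow straight from the definitions:
[tex] E(X+a) = \int (x+a) f_{X}(x)dx = \mu + a, \qquad VAR(X+a) = E[(X+a-(\mu+a))^2] = E[(X-\mu)^2] = \sigma^2 [/tex]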
 
  • #8
I am able to completely understand your post now. Thank you. But I tried to re-derive the entire thing from your earlier post with a different objective:
Given X and Y are independent uniform,
find
E(X|Z=X+Y)
Var(X|Z=X+Y)
and I am stuck imitating your proof, simply because f_X(x|z) is no longer f_X(x).
Would you please show me how to redo your earlier post in this scenario?
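A sketch of how that plays out, following the same steps as before: [itex]f_{X}(x|z) \propto f_{X}(x)f_{Y}(z-x)[/itex] is constant wherever it is nonzero, so given Z = z, X is uniform on [itex][\max(-1, z-1),\ \min(1, z+1)][/itex], an interval of length 2-|z|. Hence
[tex] E(X|Z) = \frac{Z}{2}, \qquad VAR(X|Z) = \frac{(2-|Z|)^2}{12} [/tex]
(The mean also follows by symmetry, since E(X|Z) + E(Y|Z) = Z and X & Y are exchangeable.)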
 

What is the sum of independent uniform distributions conditional on a uniform?

The sum of independent uniform distributions conditional on a uniform refers to the distribution of a sum of two or more independent uniform random variables, conditioned on another uniform random variable (for example, on one of the summands, as in the thread above). Its conditional mean and variance can be worked out with the standard tools of conditional probability.

How do you calculate the sum of independent uniform distributions conditional on a uniform?

The conditional mean can be calculated by linearity of conditional expectation: E(X + Y | Z) = E(X | Z) + E(Y | Z), where X and Y are independent uniform random variables and Z is the conditioning random variable. In the case discussed above, conditioning on X itself gives E(X + Y | X) = X + E(Y), since X is known and Y is independent of X.

What are some real-world applications of the sum of independent uniform distributions conditional on a uniform?

The sum of independent uniform distribution conditional on uniform has many practical applications, such as in finance and risk management. For example, it can be used to model the distribution of stock prices or the likelihood of default in credit risk analysis. It is also commonly used in simulation and modeling in various fields, including engineering, physics, and biology.

What are the assumptions for the sum of independent uniform distributions conditional on a uniform?

A few key assumptions need to be met for the calculations above to be valid: the uniform random variables must be independent, the conditioning variable must have a known joint distribution with the summands, and the distributions involved must be continuous so that density arguments apply. Violating these assumptions can lead to inaccurate results.

How does the sum of independent uniform distributions conditional on a uniform differ from other types of distributions?

It differs in that the conditional distribution of the sum need not belong to the same family as the summands: unconditionally, the sum of two independent uniforms is triangular, not uniform. This contrasts with the normal case, where the sum of independent normals is again normal, the sum and each summand are jointly normal, and the projection theorem applies directly.
