Joint Distribution of Changing Mean

AI Thread Summary
The discussion revolves around the joint distribution of a random variable X whose mean μ is itself randomly distributed about 0 with standard deviation 1. The participants confirm that the standard deviation of X is √2, derived from the conditional relationship E(X^2 | E(X) = m) = 1 + m^2, averaged over m. They explore the implications of treating X as the sum of two normal distributions, emphasizing that the variance of X is the sum of the variances of its independent components. Clarifications are made regarding the independence of X − μ from μ and the derivation of the variance formula, with a focus on the moments of the distributions involved. The conversation highlights the subtleties of calculating expected values and variances in this hierarchical setting.
ghotra
Let X be a random variable with mean \mu and standard deviation 1.

Let's add a twist.

Suppose \mu is randomly distributed about 0 with standard deviation 1.

At each iteration, we select a new \mu according to its distribution. This value is then used as the mean of the distribution for X. Then we pick an X according to that distribution.

My question: What is the resulting joint distribution? Given this joint distribution, I should be able to calculate the mean and standard deviation. Clearly, the mean X will be 0, but what will be the standard deviation of X? It seems that it should, at a minimum, be greater than 1.

Thanks!
 
let m = mean

E(X^2 | E(X) = m) = 1 + m^2

E(m^2) = 1, since std dev(m) = 1 and mean(m) = 0

Therefore E(X^2) = 2, or

std. dev.(X) = \sqrt{2}
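This two-step derivation is easy to check numerically. A minimal sketch (using numpy, and taking both distributions to be normal, though the argument above only needs the first two moments) simulates the draw-\mu-then-draw-X procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Step 1: draw a mean for each trial, mu ~ N(0, 1)
mu = rng.normal(loc=0.0, scale=1.0, size=n)

# Step 2: draw X ~ N(mu, 1) for each trial
x = rng.normal(loc=mu, scale=1.0)

print(x.mean())  # close to 0
print(x.std())   # close to sqrt(2) ~= 1.4142
```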
 
Could you elaborate a bit more? I can confirm that the std deviation is indeed sqrt(2), however, I don't understand where the following formula comes from:

E(X^2 | E(x) = m) = 1 + E(m^2)

From the definition,

\sigma_x^2 = E(x^2) - E(x)^2 = E(x^2) - m^2

Presumably, I stick your formula into the formula I just wrote above...but I'm still confused where your formula comes from. Also, m (in my original post \mu) is not the same...it is determined by a normal distribution.
 
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions. (admittedly, it's good to be able to do it different ways, though)
 
Hurkyl said:
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions.

Interesting, I had wondered if that was okay to do...as the variance of X would then be the sum of the variances of the two normal distributions, namely 1 + 1 = 2, giving a standard deviation of sqrt(2). Could you explain how these are equivalent pictures?

In general, I would like to consider a set of distributions

\mu_1 \sigma_1

\mu_2 \sigma_2

\mu_3 \sigma_3

...

where the \mu_i are distributed normally with mean \mu and std deviation \Delta \mu

where the \sigma_i are distributed normally with mean \sigma and std deviation \Delta \sigma

For each distribution, we pick x once. What is the expected value of x and what is the standard deviation?
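This more general setup can also be sketched in simulation. The snippet below (numpy, with illustrative values for \mu, \Delta\mu, \sigma, \Delta\sigma; all distributions taken normal) estimates both moments. Note one wrinkle the question glosses over: a normal distribution for the \sigma_i can produce negative scales, so the sketch takes the absolute value, which is a negligible perturbation when \sigma is many \Delta\sigma above zero:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

mu, d_mu = 0.0, 0.5        # mean and spread of the mu_i  (illustrative values)
sigma, d_sigma = 1.0, 0.1  # mean and spread of the sigma_i

mu_i = rng.normal(mu, d_mu, size=n)
# A normal sigma_i can go negative; abs() guards against that
# (negligible here, since sigma is 10 spreads above zero).
sigma_i = np.abs(rng.normal(sigma, d_sigma, size=n))

x = rng.normal(mu_i, sigma_i)

print(x.mean())  # close to mu = 0
# Law of total variance: Var(x) = E[sigma_i^2] + Var(mu_i)
#                               ~= sigma^2 + d_sigma^2 + d_mu^2
print(x.std())   # close to sqrt(1 + 0.01 + 0.25) ~= 1.1225
```

The printed variance decomposition is the law of total variance applied to this hierarchy; the simulation is just a consistency check on it.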
 
X is a normal distribution centered about u.

X - u is a normal distribution centered about 0.

X - u is, presumably, independent of u. (You never actually specified in your problem that the only dependence of X on u is that u is the mean of the distribution of X, but I assume that was meant.)

X = (X - u) + u
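The decomposition above is exactly why the variances add: X − u and u are independent, so Var(X) = Var(X − u) + Var(u) = 1 + 1 = 2. A minimal numerical check (numpy, normality assumed for concreteness):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

u = rng.normal(0.0, 1.0, size=n)      # the random mean
noise = rng.normal(0.0, 1.0, size=n)  # X - u, drawn independently of u
x = u + noise                         # X = (X - u) + u

# Independence shows up as (near-)zero covariance,
# so the variances simply add.
print(np.cov(noise, u)[0, 1])  # close to 0
print(x.var())                 # close to 2
```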
 
ghotra said:
Could you elaborate a bit more? I can confirm that the std deviation is indeed sqrt(2), however, I don't understand where the following formula comes from:

E(X^2 | E(x) = m) = 1 + E(m^2)

From the definition,

\sigma^2(x) = E(x^2) - m^2

Presumably, I stick your formula into the formula I just wrote above...but I'm still confused where your formula comes from. Also, m (in my original post \mu) is not the same...it is determined by a normal distribution.

note: (You wrote: E(X^2 | E(x) = m) = 1 + E(m^2). If you look carefully at what I wrote, I had m^2 on the right in that line, and then took the average over m on both sides to get E(X^2).)

In your expression for \sigma^2(x), you implicitly defined it for a specific value of m. In the expression I wrote, I just made that explicit and rearranged terms, while using the fact that the conditional variance of X given m is 1.

All I assumed about m is that it was random with first and second moments 0 and 1; normal distribution is unnecessary.
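This last point, that only the first two moments of m matter, is worth a quick check. The sketch below (numpy, illustrative) replaces the normal m with a fair ±1 coin flip, which also has mean 0 and variance 1, and the standard deviation of X comes out the same:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# m is NOT normal here: a fair +/-1 coin flip
# still has first moment 0 and second moment 1.
m = rng.choice([-1.0, 1.0], size=n)
x = rng.normal(m, 1.0)

print(x.std())  # still close to sqrt(2)
```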
 