Joint Distribution of Changing Mean

1. Sep 29, 2006

ghotra

Let X be a random variable with mean $$\mu$$ and standard deviation 1.

Suppose $$\mu$$ is randomly distributed about 0 with standard deviation 1.

At each iteration, we select a new $$\mu$$ according to its distribution. This mean is then used in the distribution for X, and we then pick an X according to that distribution.

My question: What is the resulting joint distribution? Given this joint distribution, I should be able to calculate the mean and standard deviation. Clearly, the mean of X will be 0, but what will be the standard deviation of X? It seems that it should, at a minimum, be greater than 1.
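For what it's worth, here is a minimal Monte Carlo sketch of this two-stage sampling (Python; sample size and seed are illustrative, and it assumes both distributions are normal, as described):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Stage 1: draw a fresh mean for each trial, mu ~ N(0, 1)
mu = rng.normal(0.0, 1.0, size=n)
# Stage 2: draw X ~ N(mu, 1) using that trial's mean
x = rng.normal(mu, 1.0)

print(x.mean())  # close to 0
print(x.std())   # close to sqrt(2) ~ 1.414
```

With enough draws this prints a mean near 0 and a standard deviation near 1.414.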

Thanks!

Last edited: Sep 29, 2006
2. Sep 29, 2006

mathman

Let m = the mean.

$$E(X^2 \mid E(X) = m) = 1 + m^2$$

$$E(m^2) = 1$$, since std dev(m) = 1 and mean(m) = 0.

Therefore $$E(X^2) = 2$$, or

$$\text{std dev}(X) = \sqrt{2}$$
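Equivalently (the same computation, packaged differently), this is the law of total variance:

$$\mathrm{Var}(X) = E[\mathrm{Var}(X \mid m)] + \mathrm{Var}(E[X \mid m]) = 1 + 1 = 2.$$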

3. Sep 29, 2006

ghotra

Could you elaborate a bit more? I can confirm that the standard deviation is indeed $$\sqrt{2}$$; however, I don't understand where the following formula comes from:

$$E(X^2 \mid E(X) = m) = 1 + E(m^2)$$

From the definition,

$$\sigma_X^2 = E(X^2) - E(X)^2 = E(X^2) - m^2$$

Presumably, I substitute your formula into the formula I just wrote above... but I'm still confused about where your formula comes from. Also, m (in my original post $$\mu$$) is not a fixed constant; it is itself drawn from a normal distribution.

4. Sep 29, 2006

Hurkyl

Staff Emeritus
I think the easiest way to do this is to simplify the description: X is the sum of two independent normal random variables. (Admittedly, it's good to be able to do it different ways, though.)
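Spelled out, one way to see the equivalence: $$X = \mu + Z$$, where $$Z \sim N(0,1)$$ is independent of $$\mu \sim N(0,1)$$; the sum of two independent normals is normal, with means and variances adding, so $$X \sim N(0, 2)$$.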

5. Sep 29, 2006

ghotra

Interesting, I had wondered whether that was okay to do... the variance of X would then be the sum of the variances of the two normal distributions, $$1 + 1 = 2$$, giving a standard deviation of exactly $$\sqrt{2}$$. Could you explain how these are equivalent pictures?

In general, I would like to consider a set of distributions with parameters

$$(\mu_1, \sigma_1)$$

$$(\mu_2, \sigma_2)$$

$$(\mu_3, \sigma_3)$$

...

where the $$\mu_i$$ are distributed normally with mean $$\mu$$ and std deviation $$\Delta \mu$$

where the $$\sigma_i$$ are distributed normally with mean $$\sigma$$ and std deviation $$\Delta \sigma$$

For each distribution, we pick X once. What is the expected value of X, and what is its standard deviation?
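A hedged sketch of this general case (parameter values are illustrative; note that a normally distributed $$\sigma_i$$ can in principle come out negative, so the sketch keeps $$\Delta \sigma \ll \sigma$$). By the same total-variance argument as above, one would expect $$E(X) = \mu$$ and $$\mathrm{Var}(X) = \sigma^2 + \Delta\sigma^2 + \Delta\mu^2$$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative (hypothetical) parameter values; keep d_sigma << sigma0
# so the drawn sigma_i are effectively always positive.
mu0, d_mu = 2.0, 0.5        # mean and spread of the mu_i
sigma0, d_sigma = 1.0, 0.1  # mean and spread of the sigma_i

mu_i = rng.normal(mu0, d_mu, size=n)
sigma_i = rng.normal(sigma0, d_sigma, size=n)
x = rng.normal(mu_i, sigma_i)

print(x.mean())  # ~ mu0 = 2.0
print(x.std())   # ~ sqrt(sigma0**2 + d_sigma**2 + d_mu**2) ~ 1.122
```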

Last edited: Sep 29, 2006
6. Sep 30, 2006

Hurkyl

Staff Emeritus
X is a normal distribution centered about $$\mu$$.

$$X - \mu$$ is a normal distribution centered about 0.

$$X - \mu$$ is, presumably, independent of $$\mu$$. (You never actually specified in your problem that the only dependence of X on $$\mu$$ is that $$\mu$$ is the mean of the distribution of X, but I assume that was meant.)

$$X = (X - \mu) + \mu$$
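Since $$X - \mu$$ and $$\mu$$ are independent, their variances add:

$$\mathrm{Var}(X) = \mathrm{Var}(X - \mu) + \mathrm{Var}(\mu) = 1 + 1 = 2,$$

recovering $$\sigma_X = \sqrt{2}$$.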

7. Sep 30, 2006

mathman

Note: you wrote $$E(X^2 \mid E(X) = m) = 1 + E(m^2)$$. If you look carefully at what I wrote, I had $$m^2$$ (not $$E(m^2)$$) on the right in that line, and then took the average over m on both sides to get $$E(X^2)$$.

In your expression for $$\sigma_X^2$$, you implicitly defined it for a specific value of m. In the expression I wrote, I just made that explicit and rearranged terms, while using the fact that the variance of X, given m, is 1.
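Written out: $$1 = \mathrm{Var}(X \mid E(X) = m) = E(X^2 \mid E(X) = m) - m^2$$, so $$E(X^2 \mid E(X) = m) = 1 + m^2$$; averaging over m then gives $$E(X^2) = 1 + E(m^2) = 1 + 1 = 2$$.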

All I assumed about m is that it is random with first and second moments 0 and 1; a normal distribution is unnecessary.