Joint Distribution of Changing Mean


Discussion Overview

The discussion revolves around the joint distribution of a random variable X, which has a mean \(\mu\) that is itself randomly distributed about 0 with a standard deviation of 1. Participants explore the implications of this setup on the mean and standard deviation of X, as well as the relationships between the distributions involved.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant proposes that the resulting joint distribution can be analyzed to find the mean and standard deviation of X, suggesting that the mean will be 0 and speculating that the standard deviation should be greater than 1.
  • Another participant computes \(E(X^2 \mid E(X) = m) = 1 + m^2\), which leads to a standard deviation of \(\sqrt{2}\), but does not initially explain the derivation of this formula.
  • A participant expresses confusion regarding the formula \(E(X^2 | E(X) = m) = 1 + E(m^2)\) and seeks clarification on its origin, noting that \(\mu\) is determined by a normal distribution.
  • Some participants suggest that X can be viewed as the sum of two normal distributions, which leads to the conclusion that the variance of X is the sum of the variances of these distributions.
  • Another participant introduces a broader consideration of a set of distributions with varying means and standard deviations, questioning the expected value and standard deviation in this context.
  • One participant asserts that \(X - \mu\) is independent of \(\mu\), although this independence was not explicitly stated in the original problem.
  • Further clarification is requested regarding the relationship between the variance of X and the moments of \(\mu\), with one participant emphasizing that normality of \(\mu\) is unnecessary: only its first and second moments matter.

Areas of Agreement / Disagreement

Participants generally agree on the calculation of the standard deviation being \(\sqrt{2}\), but there is disagreement and confusion regarding the derivation of certain formulas and the assumptions about independence and distribution types. The discussion remains unresolved on some aspects, particularly regarding the implications of the joint distribution.

Contextual Notes

There are limitations regarding the assumptions made about the distributions involved, particularly concerning the independence of X and \(\mu\) and the necessity of normality for \(\mu\). Some mathematical steps and definitions remain unresolved, contributing to the ongoing debate.

ghotra
Let X be a random variable with mean \(\mu\) and standard deviation 1.

Let's add a twist.

Suppose \(\mu\) is randomly distributed about 0 with standard deviation 1.

At each iteration, we select a new \(\mu\) according to its distribution. This mean is then used in the distribution for X. Then we pick an X according to its distribution.

My question: What is the resulting joint distribution? Given this joint distribution, I should be able to calculate the mean and standard deviation. Clearly, the mean of X will be 0, but what will be the standard deviation of X? It seems that it should, at a minimum, be greater than 1.

Thanks!
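The two-step sampling scheme described above can be checked with a quick Monte Carlo sketch (assuming, as the rest of the thread does, that X given \(\mu\) is normal; the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Step 1: draw a new mean mu ~ N(0, 1) at each iteration.
mu = rng.normal(loc=0.0, scale=1.0, size=n)

# Step 2: draw X ~ N(mu, 1) using that mean.
x = rng.normal(loc=mu, scale=1.0)

print(x.mean())  # close to 0
print(x.std())   # close to sqrt(2) ~ 1.414
```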
 
Let \(m\) = mean.

\(E(X^2 \mid E(X) = m) = 1 + m^2\)

\(E(m^2) = 1\), since std dev\((m) = 1\) and the mean is 0.

Therefore \(E(X^2) = 2\), or

std. dev.\((X) = \sqrt{2}\)
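Spelled out, this is the conditional variance formula followed by the law of iterated expectations:

```latex
% Given m, X has mean m and variance 1, so
E(X^2 \mid m) = \operatorname{Var}(X \mid m) + \bigl[E(X \mid m)\bigr]^2 = 1 + m^2.
% Averaging over m, using E(m^2) = 1:
E(X^2) = E\bigl[E(X^2 \mid m)\bigr] = 1 + E(m^2) = 2,
\qquad
\sigma_X = \sqrt{E(X^2) - [E(X)]^2} = \sqrt{2}.
```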
 
Could you elaborate a bit more? I can confirm that the std deviation is indeed sqrt(2), however, I don't understand where the following formula comes from:

\(E(X^2 \mid E(X) = m) = 1 + E(m^2)\)

From the definition,

\(\sigma_x^2 = E(x^2) - [E(x)]^2 = E(x^2) - m^2\)

Presumably, I stick your formula into the formula I just wrote above... but I'm still confused about where your formula comes from. Also, m (\(\mu\) in my original post) is not a fixed constant... it is determined by a normal distribution.
 
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions. (admittedly, it's good to be able to do it different ways, though)
 
Hurkyl said:
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions.

Interesting, I had wondered if that was okay to do... as the variance of X would then be the sum of the variances of the two normal distributions, i.e. 2, giving a standard deviation of \(\sqrt{2}\). Could you explain how these are equivalent pictures?
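A simulation sketch of the equivalence (assuming normality throughout; names are illustrative): drawing X hierarchically and building X as the sum of two independent standard normals should give matching moments.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Construction 1: hierarchical -- draw mu ~ N(0,1), then X ~ N(mu, 1).
mu = rng.normal(size=n)
x_hier = rng.normal(loc=mu, scale=1.0)

# Construction 2: X as the sum of two independent N(0,1) draws,
# since X = mu + (X - mu) with (X - mu) ~ N(0,1) independent of mu.
x_sum = rng.normal(size=n) + rng.normal(size=n)

# Both should have mean ~0 and standard deviation ~sqrt(2).
print(x_hier.std(), x_sum.std())
```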

In general, I would like to consider a set of distributions

\((\mu_1, \sigma_1)\)

\((\mu_2, \sigma_2)\)

\((\mu_3, \sigma_3)\)

...

where the \(\mu_i\) are distributed normally with mean \(\mu\) and std deviation \(\Delta\mu\),

and the \(\sigma_i\) are distributed normally with mean \(\sigma\) and std deviation \(\Delta\sigma\).

For each distribution, we pick x once. What is the expected value of x and what is the standard deviation?
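A Monte Carlo sketch of this general setup, with illustrative parameter values that are not from the thread. One caveat: drawing \(\sigma_i\) from a normal can produce negative values, so \(\Delta\sigma\) is kept small relative to \(\sigma\) here. By the law of total variance one expects \(E(x) = \mu\) and \(\operatorname{Var}(x) = \sigma^2 + \Delta\sigma^2 + \Delta\mu^2\).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# Illustrative parameter values (assumptions, not from the thread):
mu0, d_mu = 1.0, 0.5      # mean and spread of the mu_i
sig0, d_sig = 2.0, 0.1    # mean and spread of the sigma_i
# sigma_i ~ N(sig0, d_sig) can in principle go negative;
# with d_sig << sig0 that is vanishingly rare, and abs() guards it.

mu_i = rng.normal(mu0, d_mu, size=n)
sig_i = np.abs(rng.normal(sig0, d_sig, size=n))
x = rng.normal(loc=mu_i, scale=sig_i)

# Law of total variance: Var(x) = E(sigma_i^2) + Var(mu_i)
#                               = sig0^2 + d_sig^2 + d_mu^2
print(x.mean())  # close to mu0 = 1.0
print(x.std())   # close to sqrt(4 + 0.01 + 0.25)
```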
 
X is a normal distribution centered about \(\mu\).

\(X - \mu\) is a normal distribution centered about 0.

\(X - \mu\) is presumably independent of \(\mu\). (You never actually specified in your problem that the only dependence of X on \(\mu\) is that \(\mu\) is the mean of the distribution of X, but I assume that was meant.)

\(X = (X - \mu) + \mu\)
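Under that independence assumption, the decomposition gives the variance immediately, since variances of independent terms add:

```latex
\operatorname{Var}(X) = \operatorname{Var}\bigl((X - \mu) + \mu\bigr)
  = \operatorname{Var}(X - \mu) + \operatorname{Var}(\mu) = 1 + 1 = 2,
\qquad
\sigma_X = \sqrt{2}.
```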
 
ghotra said:
Could you elaborate a bit more? I can confirm that the std deviation is indeed \(\sqrt{2}\), however, I don't understand where the following formula comes from:

\(E(X^2 \mid E(X) = m) = 1 + E(m^2)\)

From the definition,

\(\sigma_x^2 = E(x^2) - m^2\)

Presumably, I stick your formula into the formula I just wrote above... but I'm still confused where your formula comes from. Also, m (\(\mu\) in my original post) is not fixed... it is determined by a normal distribution.

note: You wrote \(E(X^2 \mid E(X) = m) = 1 + E(m^2)\). If you look carefully at what I wrote, I had \(m^2\) on the right in that line, and then took the average over m on both sides to get \(E(X^2)\).

In your expression for \(\sigma_x^2\), you implicitly defined it for a specific value of m. In the expression I wrote, I just made that explicit and rearranged terms, while using the fact that the variance of x for a given m is 1.

All I assumed about m is that it was random with first and second moments 0 and 1; normal distribution is unnecessary.
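To illustrate that point, here is a sketch where m is drawn from a two-point \(\pm 1\) distribution, which has mean 0 and variance 1 but is clearly not normal; the conditional distribution of x is kept normal for convenience, though only its conditional variance matters for the result.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# m takes the values +1 and -1 with equal probability:
# mean 0, variance 1, but not a normal distribution.
m = rng.choice([-1.0, 1.0], size=n)

# x given m has mean m and standard deviation 1.
x = rng.normal(loc=m, scale=1.0)

print(x.std())  # close to sqrt(2), just as in the normal case
```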
 
