Joint Distribution of Changing Mean

In summary, X can be written as the sum of two independent normal random variables, each with mean 0 and standard deviation 1. This gives X a mean of 0 and a standard deviation of sqrt(2). Equivalently, conditional on u, X is normally distributed and centered about u, where u is itself the (random) mean of the distribution of X.
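
Written out, the setup below is a two-stage model; since a sum of independent normal random variables is again normal, the marginal distribution of X follows directly:

[tex]\mu \sim N(0, 1), \qquad X \mid \mu \sim N(\mu, 1), \qquad X = \mu + (X - \mu) \sim N(0, 2)[/tex]

(the second parameter here is the variance, so the standard deviation of X is [tex]\sqrt{2}[/tex]).
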
  • #1
ghotra
Let X be a random variable with mean [tex]\mu[/tex] and standard deviation 1.

Let's add a twist.

Suppose [tex]\mu[/tex] is randomly distributed about 0 with standard deviation 1.

At each iteration, we select a new [tex]\mu[/tex] according to its distribution. This mean is then used in the distribution for X, and we then pick an X according to that distribution.

My question: What is the resulting joint distribution? Given this joint distribution, I should be able to calculate the mean and standard deviation. Clearly, the mean of X will be 0, but what will be the standard deviation of X? It seems that it should, at a minimum, be greater than 1.

Thanks!
 
  • #2
Let m = mean.

[tex]E(X^2 \mid E(X) = m) = 1 + m^2[/tex]

[tex]E(m^2) = 1[/tex], since std dev(m) = 1 and mean(m) = 0.

Therefore [tex]E(X^2) = 2[/tex], or

std. dev.(X) = [tex]\sqrt{2}[/tex]
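
As a quick numerical check (a minimal simulation sketch, not part of the original thread), the two-stage sampling described in post #1 reproduces the same standard deviation:

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# At each iteration, draw a new mean mu ~ N(0, 1), then draw X ~ N(mu, 1).
mu = rng.normal(0.0, 1.0, size=n)
x = rng.normal(mu, 1.0)

print(x.mean())  # close to 0
print(x.std())   # close to sqrt(2) ~ 1.414
[/code]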
 
  • #3
Could you elaborate a bit more? I can confirm that the std deviation is indeed sqrt(2); however, I don't understand where the following formula comes from:

E(X^2 | E(x) = m) = 1 + E(m^2)

From the definition,

[tex]\sigma_x^2 = E(x^2) - E(x)^2 = E(x^2) - m^2[/tex]

Presumably, I stick your formula into the formula I just wrote above...but I'm still confused where your formula comes from. Also, m ([tex]\mu[/tex] in my original post) is not a fixed value...it is itself determined by a normal distribution.
 
  • #4
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions. (admittedly, it's good to be able to do it different ways, though)
 
  • #5
Hurkyl said:
I think the easiest way to do this is to simplify the description -- X is the sum of two normal distributions.

Interesting, I had wondered if that was okay to do...as the variance of X would then be the sum of the variances of the two normal distributions, which gives a variance of 2 and hence a standard deviation of sqrt(2). Could you explain how these are equivalent pictures?

In general, I would like to consider a set of distributions

[tex](\mu_1, \sigma_1)[/tex]

[tex](\mu_2, \sigma_2)[/tex]

[tex](\mu_3, \sigma_3)[/tex]

...

where the [tex]\mu_i[/tex] are distributed normally with mean [tex]\mu[/tex] and std deviation [tex]\Delta \mu[/tex]

where the [tex]\sigma_i[/tex] are distributed normally with mean [tex]\sigma[/tex] and std deviation [tex]\Delta \sigma[/tex]

For each distribution, we pick x once. What is the expected value of x and what is the standard deviation?
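
A sketch of how the same argument extends (not part of the thread, and assuming the [tex]\mu_i[/tex] and [tex]\sigma_i[/tex] are drawn independently, with each x drawn from a distribution with mean [tex]\mu_i[/tex] and standard deviation [tex]\sigma_i[/tex]): by the laws of total expectation and total variance,

[tex]E(x) = E(\mu_i) = \mu[/tex]

[tex]\mathrm{Var}(x) = E[\mathrm{Var}(x \mid \mu_i, \sigma_i)] + \mathrm{Var}[E(x \mid \mu_i, \sigma_i)] = E(\sigma_i^2) + \mathrm{Var}(\mu_i) = \sigma^2 + (\Delta \sigma)^2 + (\Delta \mu)^2[/tex]

so the standard deviation of x would be [tex]\sqrt{\sigma^2 + (\Delta \sigma)^2 + (\Delta \mu)^2}[/tex].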
 
  • #6
X is a normal distribution centered about u.

X - u is a normal distribution centered about 0.

X - u is, presumably, independent of u. (You never actually specified in your problem that the only dependence of X on u is that u is the mean of the distribution of X, but I assume that was meant.)

X = (X - u) + u
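
Making the decomposition quantitative (assuming, as above, that X - u and u are independent): the variances of independent terms add, so

[tex]\mathrm{Var}(X) = \mathrm{Var}(X - u) + \mathrm{Var}(u) = 1 + 1 = 2, \qquad \text{std. dev.}(X) = \sqrt{2}[/tex]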
 
  • #7
ghotra said:
Could you elaborate a bit more? I can confirm that the std deviation is indeed sqrt(2); however, I don't understand where the following formula comes from:

E(X^2 | E(x) = m) = 1 + E(m^2)

From the definition,

[tex]\sigma_x^2 = E(x^2) - m^2[/tex]

Presumably, I stick your formula into the formula I just wrote above...but I'm still confused where your formula comes from. Also, m ([tex]\mu[/tex] in my original post) is not a fixed value...it is itself determined by a normal distribution.

note: (You wrote: E(X^2 | E(x) = m) = 1 + E(m^2). If you look carefully at what I wrote, I had m^2 on the right in that line, and then took the average over m on both sides to get E(X^2).)

In your expression for sig2(x), you implicitly defined it for a specific value of m. In the expression I wrote, I just made it explicit and rearranged terms, while using the fact that the variance of x is 1.

All I assumed about m is that it is random with first and second moments 0 and 1; a normal distribution is unnecessary.
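
Spelled out, this is the law of total expectation applied to [tex]X^2[/tex]:

[tex]E(X^2) = E\big[E(X^2 \mid m)\big] = E(1 + m^2) = 1 + E(m^2) = 1 + 1 = 2[/tex]

and, since [tex]E(X) = E(m) = 0[/tex], the variance of X is [tex]E(X^2) - E(X)^2 = 2[/tex], giving std. dev.(X) = [tex]\sqrt{2}[/tex].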
 

1. What is the meaning of "joint distribution of changing mean" in statistics?

The joint distribution of changing mean refers to the probability distribution of two or more dependent random variables in which the mean of at least one variable is not fixed, but instead varies randomly or across different conditions.

2. How is the joint distribution of changing mean different from a regular joint distribution?

In a regular joint distribution, the mean of each variable is a fixed constant. In the joint distribution of changing mean, one or more of the variables has a mean that itself varies depending on certain factors or conditions.

3. What are some examples of situations where the joint distribution of changing mean is applicable?

The joint distribution of changing mean can be used to model changes in stock prices, weather patterns, or consumer behavior. It can also be used in medical research to study the effect of a treatment on a patient's health over time.

4. How is the joint distribution of changing mean calculated?

The joint distribution of changing mean is obtained by specifying the distribution of the changing mean and the conditional distribution of each variable given that mean, and then multiplying these densities together. Quantities such as the overall mean and variance then follow by averaging (integrating) over the distribution of the mean.
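
As an illustration of this recipe (a minimal sketch, not from the discussion above, using the example from this thread where the mean is N(0, 1) and the conditional distribution given that mean is N(mean, 1)):

[code]
import numpy as np

def normal_pdf(x, mean=0.0, sd=1.0):
    """Density of a normal distribution with the given mean and standard deviation."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def joint_pdf(x, mu):
    """Joint density f(x, mu) = f(mu) * f(x | mu)."""
    return normal_pdf(mu, 0.0, 1.0) * normal_pdf(x, mu, 1.0)

# Integrating out mu recovers the marginal density of X, which should match
# a normal density with mean 0 and standard deviation sqrt(2).
mu_grid = np.linspace(-10.0, 10.0, 4001)
step = mu_grid[1] - mu_grid[0]
x0 = 0.5
marginal = np.sum(joint_pdf(x0, mu_grid)) * step
print(marginal, normal_pdf(x0, 0.0, np.sqrt(2.0)))  # the two values agree
[/code]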

5. What are the advantages of using the joint distribution of changing mean in statistical analysis?

Using the joint distribution of changing mean allows for a more accurate and comprehensive analysis of data, as it takes into account the relationship between variables and how their means change over time. This can provide insights and predictions that would not be possible with a regular joint distribution.
