Summing X_i with Binomial Distribution: What is the Problem?

  • Context: Graduate
  • Thread starter: autobot.d
  • Tags: Class

Discussion Overview

The discussion revolves around the problem of summing a series of independent and identically distributed (iid) random variables, \(X_i\), which have a known mean and variance, in the context of a binomially distributed variable \(m\). Participants explore how to calculate the mean and variance of the sum \(S = \sum^{m}_{i=1} X_i\), where \(m\) follows a binomial distribution with parameters \(n\) and \(p\).
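The object under discussion is a random (compound) sum. A minimal Monte Carlo sketch of the setup, assuming Gaussian \(X_i\) and example values of \(n\), \(p\), \(\mu\), \(\sigma\) chosen purely for illustration (none of these choices come from the thread):

```python
import random

# Monte Carlo sketch of the random sum S = sum_{i=1}^m X_i,
# where m ~ Binomial(n, p) and the X_i are iid with mean mu.
# Gaussian X_i and the parameter values are illustrative assumptions.
random.seed(0)
n, p = 10, 0.3
mu, sigma = 2.0, 1.5

def sample_S():
    m = sum(random.random() < p for _ in range(n))   # one Binomial(n, p) draw
    return sum(random.gauss(mu, sigma) for _ in range(m))

trials = 100_000
mean_S = sum(sample_S() for _ in range(trials)) / trials
print(mean_S)  # empirically close to n * p * mu = 6.0
```

The empirical mean clustering around \(np\mu\) is exactly the result the participants converge toward below.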

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant questions how to work with the sum \(S\) and proposes that \(S\) can be expressed as \(mX_1\) due to the iid nature of \(X_i\), suggesting that the mean of \(S\) could be calculated as \(\bar{S} = np\mu\).
  • Another participant points out that, because the \(X_i\) are independent, their sum does not equal \(m\) times \(X_1\) (that would require perfect correlation), and emphasizes the need to correct the calculation steps to avoid issues with the variance.
  • A participant raises a concern about the implications of \(m\) being binomially distributed and seeks clarification on its effect on the summation.
  • Another suggests expressing the problem in terms of probabilities for different counts of \(X_i\) being added, and combining variances accordingly.
  • One participant attempts to calculate the probability \(f_0\) for zero \(X_i\) being added and inquires about how to handle the probabilities \(f_0, f_1, \ldots\) in the context of averaging.
  • A later reply confirms that the approach resembles a weighted average and asserts that the same rules apply for expectation values and variances.

Areas of Agreement / Disagreement

Participants initially express differing views on the correct approach to calculating the mean and variance of \(S\), particularly on how to handle the binomial distribution of \(m\) in relation to the summation of \(X_i\). By the end of the thread, there is agreement that conditioning on \(m\) yields a weighted-average calculation, though no closed-form result is stated.

Contextual Notes

Participants note that the calculations for mean and variance may depend on the specific properties of the binomial distribution and the iid nature of \(X_i\), but the discussion does not resolve these dependencies or assumptions.

autobot.d
What kind of problem is this?
\(X_i\) are iid with known mean and variance, \(\mu\) and \(\sigma^2\) respectively.
\(m \sim \textrm{Binomial}(n,p)\), with \(n\) known.

\(S = \sum^{m}_{i=1} X_i\)

How do I work with this? This is what I have thought of.

\(S = \sum^{m}_{i=1} X_i = mX_1\) (since iid)
so for the mean of \(S\):
\(\bar{S} = \overline{mX_1} = np\mu\)?
or, to find the mean of \(S\), use the expected value:
\(E(S) = E(mX_1)\) (since iid)
but then what?

Any help would be appreciated. I am guessing this kind of problem has a name?

Thanks.
 
I am missing the problem statement - what are you supposed to do? Calculate the mean and variance of \(S\)?

As the \(X_i\) are independent, their sum is not the same as \(m\) times \(X_1\) (that would need 100% correlation).
You can use the mean of \(X_i\) in your expression for the mean of \(S\). The result for the mean is right, but you have to fix the calculation steps - otherwise you run into problems with the variance.
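One way to repair the calculation steps without writing \(S = mX_1\) is to condition on \(m\) and apply the tower rule, the standard route for random sums (stated here for completeness; the thread itself only sketches this):

```latex
E(S) \;=\; E\bigl[\,E(S \mid m)\,\bigr] \;=\; E[\,m\mu\,] \;=\; \mu\,E(m) \;=\; np\mu .
```

The mean comes out the same as in the original post, but the intermediate steps now survive the passage to the variance.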
 
But what about the fact that the \(m\) in the summation limit is binomially distributed? I do not understand what that does. Thanks.
 
You can express this as "\(f_0\): probability that 0 \(X_i\) are added, \(f_1\): probability that 1 \(X_i\) is added, ...", calculate the variance in each of those cases, and combine them afterwards.
For the expectation value, this does not matter; you can take the expectation value of each factor separately, as all \(X_i\) have the same distribution.
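The suggestion above can be carried out explicitly: with \(f_k = P(m = k) = {n \choose k}p^k(1-p)^{n-k}\), conditioning on \(m = k\) gives mean \(k\mu\) and variance \(k\sigma^2\), and the pieces combine via the law of total variance. A sketch with illustrative values of \(n\), \(p\), \(\mu\), \(\sigma^2\) (not taken from the thread):

```python
from math import comb

n, p = 10, 0.3          # parameters of the binomial m (example values)
mu, sigma2 = 2.0, 2.25  # mean and variance of each X_i (example values)

# f_k = P(m = k): probability that exactly k of the X_i are added
f = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Law of total expectation: E[S] = sum_k f_k * (k * mu)
mean_S = sum(fk * k * mu for k, fk in enumerate(f))

# Law of total variance:
# Var(S) = E[Var(S|m)] + Var(E[S|m])
#        = sum_k f_k * k * sigma2  +  sum_k f_k * (k*mu - E[S])**2
var_S = (sum(fk * k * sigma2 for k, fk in enumerate(f))
         + sum(fk * (k * mu - mean_S) ** 2 for k, fk in enumerate(f)))

print(mean_S)  # equals n*p*mu = 6.0 (up to float rounding)
print(var_S)   # equals n*p*sigma2 + n*p*(1-p)*mu**2 = 15.15 (up to float rounding)
```

This is exactly the "weighted average" confirmed later in the thread, written out term by term.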
 
so for "\(f_0\): probability that 0 \(X_i\) are added", assuming \(f_0 = {n \choose 0}p^0(1-p)^{n} = (1-p)^n\)

and so on - then what do I do with all the \(f_0, f_1, \ldots\)?
Is the average like a weighted average?

Pretty lost, so any help is greatly appreciated. Thanks.
 
It is like a weighted average, indeed, and it follows the same rules for expectation values and variances.
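Carrying the weighted average through for \(m \sim \textrm{Binomial}(n,p)\) gives closed forms; the variance step uses the law of total variance, a standard identity stated here for completeness rather than quoted from the thread:

```latex
E(S) = \mu\,E(m) = np\mu, \qquad
\operatorname{Var}(S)
  = E\bigl[\operatorname{Var}(S \mid m)\bigr]
  + \operatorname{Var}\bigl(E(S \mid m)\bigr)
  = \sigma^2 E(m) + \mu^2 \operatorname{Var}(m)
  = np\,\sigma^2 + np(1-p)\,\mu^2 .
```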
 
