Summing X_i with Binomial Distribution: What is the Problem?

  • Context: Graduate
  • Thread starter: autobot.d
  • Tags: Class
SUMMARY

The discussion centers on the sum \(S = \sum^{m}_{i=1} X_i\) of independent and identically distributed (iid) random variables \(X_i\), where the number of terms \(m \sim \text{Binomial}(n,p)\) is itself random. The goal is to find the mean and variance of \(S\); the mean is correctly identified as \(\bar{S} = np\mu\). Participants emphasize that the Binomial distribution of \(m\) must be accounted for, particularly in the variance calculation, by conditioning on the number of terms. The conversation concludes that, for the expectation value, \(m\) and the \(X_i\) can be treated separately because the \(X_i\) are identically distributed.

PREREQUISITES
  • Understanding of Binomial distribution and its properties
  • Knowledge of independent and identically distributed (iid) random variables
  • Familiarity with expectation values and variance calculations
  • Basic probability theory concepts
NEXT STEPS
  • Study the properties of Binomial distribution in detail
  • Learn about the Law of Total Expectation and its application in probability
  • Explore variance calculations for sums of random variables
  • Investigate weighted averages and their relevance in probability theory
USEFUL FOR

Statisticians, data scientists, and anyone involved in probability theory or statistical modeling, particularly those working with Binomial distributions and iid random variables.

autobot.d
What kind of problem is this?
X_i \textrm{ are iid with known mean and variance } \mu \textrm{ and } \sigma^2 \textrm{, respectively.}
m \sim \textrm{Binomial}(n,p), \quad n \textrm{ is known.}

S = \sum^{m}_{i=1} X_i

How do I work with this? This is what I have thought of.

S = \sum^{m}_{i=1} X_i = mX_1 \textrm{ (since iid)}
so for the mean of S,
\bar{S} = \overline{mX_1} = np\mu?
or, to find the mean of S, use the expected value:
E(S) = E(mX_i) = E(mX_1) \textrm{ (since iid)}
but then what?

Any help would be appreciated. I am guessing this kind of problem has a name?

Thanks.
 
I am missing the problem statement - what are you supposed to do? Calculate mean and variance of S?

As the X_i are independent, their sum is not the same as m times X_1 (that would require 100% correlation).
You can use the mean of X_i in your expression for the mean of S. The result for the mean is right, but you have to fix the calculation steps, otherwise you run into problems with the variance.
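For what it's worth, a clean route to the mean is to condition on m and use the law of total expectation; a sum with a random number of iid terms like this is usually called a random (or compound) sum, and the mean identity is known as Wald's equation:

E(S) = E\big(E(S \mid m)\big) = E(m\mu) = \mu\, E(m) = np\mu

which recovers the result without ever writing S = mX_1.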
 
But what about the fact that the m in the summation limit is Binomially distributed? I do not understand what that changes. Thanks.
 
You can express this as "f0 probability that 0 X_i are added, f1 probability that 1 X_i is added, ...", calculate the variance in each of those cases, and combine them afterwards.
For the expectation value this does not matter: you can take the expectation values of m and of each X_i separately, since all X_i have the same distribution.
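Spelled out, "combine them afterwards" is the law of total variance. With \textrm{Var}(S \mid m) = m\sigma^2 and E(S \mid m) = m\mu, it gives

\textrm{Var}(S) = E\big(\textrm{Var}(S \mid m)\big) + \textrm{Var}\big(E(S \mid m)\big) = E(m)\,\sigma^2 + \textrm{Var}(m)\,\mu^2 = np\,\sigma^2 + np(1-p)\,\mu^2.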
 
so for "f0 probability that 0 X_i are added", and assuming f_0 = {n \choose 0}p^0(1-p)^{n}=(1-p)^n

and so on. Then what do I do with all the f_0, f_1, \ldots?
Is the average like a weighted average?

Pretty lost, so any help is greatly appreciated. Thanks.
 
It is like a weighted average, indeed, and it follows the same rules for expectation values and variances.
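As a sanity check, the weighted-average picture can be verified numerically. This is a sketch with illustrative parameter values; the X_i are drawn Normal purely for concreteness (the formulas only use their mean and variance):

```python
# Monte Carlo check of E(S) = n*p*mu and
# Var(S) = n*p*sigma^2 + n*p*(1-p)*mu^2 for S = sum_{i=1}^m X_i,
# m ~ Binomial(n, p), X_i iid with mean mu and sd sigma.
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 0.3          # parameters of m ~ Binomial(n, p) (illustrative)
mu, sigma = 2.0, 1.5    # mean and sd of each X_i (illustrative)

trials = 200_000
m = rng.binomial(n, p, size=trials)          # random number of terms per trial
X = rng.normal(mu, sigma, size=(trials, n))  # at most n terms are ever used
mask = np.arange(n) < m[:, None]             # keep only the first m[i] draws
S = (X * mask).sum(axis=1)

print(S.mean())  # should be close to n*p*mu = 12.0
print(S.var())   # should be close to n*p*sigma**2 + n*p*(1-p)*mu**2 = 30.3
```

Each row draws the maximum possible n terms and masks off all but the first m of them, which avoids a Python loop over trials.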
 
